RealClimate

Comments

  1. Gilbert’s figure of 2.5 is, I think, a very good starting point.

    The question being asked is “how do greenhouse gases affect the Earth’s temperature?”. And this one answers “If you consider only CO2 (which we are responsible for), 2.5 degrees per doubling.”

    Now if someone wants to gainsay that, they are no longer asking the question, are they. They are asking for an answer that they like.

    Now you can just tell them “well, you do the maths then and we’ll look over the result, but by default, this is what would happen and you must take in any changes that occur whether they increase sensitivity or reduce it”. Or just ignore them and let them know that if they wanted a different answer, they should have asked a different question.

    Comment by Mark — 8 Sep 2008 @ 7:13 AM

  2. I’m trying to do some laymen-level research on various change models predicting climate winners and losers in North America, particularly related to areas to retire to in, say, 20 years. Where do you think I could find some understandable middle of the road graphics, etc?

    Comment by Rick Kennerly — 8 Sep 2008 @ 7:36 AM

  3. Thanks! This is a great (and eloquent) description of the incrementing complexity required of models to go from the interactions between atoms and photons to those between Earth system components.

    As someone who’s used to working with complex (= simple; compared to reality!) models of biogeochemistry, I’ve gotten so used to requiring simulations to explore behaviour that I no longer give it a second thought. This has given me an institutionalised mindset that, at times, blinkers my understanding of why others misunderstand. This article is a helpful, if obvious, reminder that not everyone else is in the same boat! ;-)

    Best regards,

    Andrew.

    Comment by Andrew — 8 Sep 2008 @ 7:46 AM

  4. #3 Andrew: I’m trying to figure out whether your question is off-topic or not! To get reliable predictions for climate on regions as small as a US state will take more computing power and understanding of climate processes than we have at present. So far, experts are pretty sure that the Southwest from Calif. to Texas will have worse heat waves and drought (=forest fires), which may have started already. This is pretty robust; people have been predicting it for decades. Most other US regions aside from Alaska don’t seem too sensitive to near-term climate change… ironically, the country most responsible for the greenhouse gases in the atmosphere is going to feel the effects later than many innocent countries. We do expect everywhere the generalities of climate change: hotter summer nights and later frost, more variability such as more intense droughts and rainstorms. And don’t get beachfront property near high tide if you are looking more than a couple of decades ahead. All that was predictable from simple hand-waving considerations, and indeed has been predicted for half a century.

    If you have access to Science magazine, see the report on p. 909 of the Aug. 15 issue, about an article on this in press in Geophysical Research Letters, with a nice graphic.

    Comment by Spencer — 8 Sep 2008 @ 8:19 AM

  5. The degree of complexity takes the issue into a region where people not intimate with the complexity are simply outsiders. Since there are consequences to the issue that are the opposite of “academic”, those who want to stir the world to action are on the horns of a dilemma. What they want in the way of response and what’s possible for them to communicate to the layman are at such a remove that it will be a miracle if anything actually happens to stem climate change. In an ideal world, “Trust us” coupled with a well-organized primer (like the IPCC reports) would be sufficient, but as we all know and as Fate would have it, there are Other Interests involved. And as Upton Sinclair heartrendingly pointed out, “It’s hard to get a man to understand something when his paycheck depends upon him not understanding it.” And there are lots and lots of paychecks involved.

    Comment by Jeffrey Davis — 8 Sep 2008 @ 8:23 AM

  6. As an engineer, and one that has worked his entire career on issues involving energy and the environment, and most specifically on air/atmospheric issues, I take some exception to the characterization of “engineers.”

    While I realized long ago that simplistic equations and approaches can get you nonsense as an answer, I now tell people who attempt this that only when they’ve gone through the process line by line, layer by layer will they have some sense of what is involved. As noted, many will simply give up realizing that they have not the scientific prowess or time to go through the exercise.

    That said, I do often point out the most straightforward of the simple laws that govern the processes involved and point out that the theory does not rely on correlations of measurements to create GHG theory. Rather, I point out that the correlations support the underlying physics and are not just some random correlations without some underpinning science to them. One need only go to such sites as http://www.venganza.org/about/open-letter/ and look at the relationship between global temperature and the number of pirates.

    As an engineer attempting to deal with the consequences, I like to know what future I am planning for.

    Comment by Gsaun039 — 8 Sep 2008 @ 8:28 AM

  7. Thanks, Spencer, for a very lucid explanation. Most senior engineers, however, should understand this level of complexity and the confidence one must have in solutions obtained by numerical integration using digital computers. Otherwise, we never could have launched a single missile or space vehicle.

    Comment by Ron Taylor — 8 Sep 2008 @ 8:33 AM

  8. “Physics is rich in phenomena that are simple in appearance but cannot be calculated in simple terms. Global warming is like that. People may yearn for a short, clear way to predict how much warming we are likely to face. Alas, no such simple calculation exists. The actual temperature rise is an emergent property resulting from interactions among hundreds of factors. People who refuse to acknowledge that complexity should not be surprised when their demands for an easy calculation go unanswered.”

    Ah! But then people who offer “simple” solutions to this complex problem shouldn’t be so surprised when they are asked to provide the simple proof, should they? It is a double edged sword, this complexity. If you are asking for change, shouldn’t you be required to accurately predict what your change will produce? But apparently that isn’t possible.

    Comment by Michael — 8 Sep 2008 @ 8:33 AM

  9. > If you are asking for change, shouldn’t you be required to accurately predict
    > what your change will produce?

    You’ve stated the precautionary principle. But as Dr. Weart points out, accurate prediction was available.

    Comment by Hank Roberts — 8 Sep 2008 @ 9:47 AM

  10. Would it be possible to have an actual senior engineer present their (presumably, mainstream) views of anthropogenic climate change and of the use of models?

    As a (senior? and scientifically trained) engineer myself, I can guess what Mr Weart is aiming at, but he’s still using language that brings down no barriers. For example, a statement such as

    “Gilbert N. Plass used the data and computers to demonstrate that adding carbon dioxide to a column of air would raise the surface temperature”

    will and does definitely make people suspicious.

    You see, I have seen dozens, and I am sure there are out there hundreds of thousands of designs that have been “demonstrated” in a computer only to fail miserably when put into practice.

    In fact, one point that I don’t think Mr Weart realizes (and likely, it’s all part of the miscommunication) is that it’s the engineers that have to deal with the actual world out there, and all its complexity, starting from but having to go beyond what calculations (formulae and/or models) suggest.

    It really is the job of engineers to understand the complexity of the real world, and to make things work within that complexity.

    There is little point in arguing to your manager that, say, in the computer your revolutionary design of a car needs only 2 gallons per 100 miles, when the actual thing is measured as drinking much more than that.

    The one rule common to all engineered systems is, the more stuff you put in, the higher the chances that something will go wrong. In Mr Weart’s case: the more factors need to be made to interact using models and supercomputers to calculate “global warming”, the higher the chance that the computed answer won’t be the right one.

    Therefore, rather than accusing engineers of looking for simple answers (likely, misunderstanding them), Mr Weart should try to bridge the gap.

    An example of another scientific endeavor, apart from climate change, where extremely complex, newly developed modelling has been successfully applied as-is in an engineering project would definitely be a good starting point.

    Comment by Maurizio Morabito — 8 Sep 2008 @ 9:50 AM

  11. http://www.timesonline.co.uk/tol/news/environment/article4690900.ece

    Comment by Hank Roberts — 8 Sep 2008 @ 9:54 AM

  12. For those who want a “simple” place to start, the work of the 1920s on understanding the structures of stellar interiors is worth review. A number of approximations, many developed by Eddington, made calculation tractable with the methods of those days, often a room full of women “calculators.” Much more detailed opacities are used now and the atmospheres of stars above their photospheres also get detailed treatment. It turns out that the physics is not controversial, just complicated. It is a fun study, but give it a few months to start to get a grip on it.

    Comment by Chris Dudley — 8 Sep 2008 @ 10:04 AM

  13. Michael (#8): Change of some kind or other is going to happen– it always does; it’s not a choice between ‘keeping things the same’ and ‘changing them’. Why do those who advocate a reduction in our GHG emissions owe you a ‘simple’ proof of what the consequences of their proposal would be, while you don’t owe them a simple proof that continuing our GHG emissions won’t have disastrous consequences? Are you foolish enough to just say, ‘well, there’s no disaster yet’, so maybe it’s OK? If so, consider this old joke: a man falling from the top of the Empire State building falls past the 20th floor and someone yells at him, ‘how are you’– he answers, ‘alright so far’… The evidence that serious problems are already developing, and that much worse are yet to come, is very strong.

    Comment by Bryson Brown — 8 Sep 2008 @ 10:15 AM

  14. Michael (#8), just because you can’t predict something in a page doesn’t mean you can’t “accurately predict” it.

    Consider an example from pure mathematics: Most results that are at all deep haven’t (and likely can’t) be proven in one page or ten or in many cases even a hundred. But that doesn’t undermine the validity of the proof, say, of the solvability of groups of odd order (a famous example from 1963 of a very simple, very important result that took about 250 pages of very dense reasoning and hasn’t been materially condensed in the four decades since then).

    Comment by crust — 8 Sep 2008 @ 10:21 AM

  15. Michael #8: It would appear that we have what is actually a pretty simple argument here, though based on complex diagnostics: The patient says, “Doctor, Doctor, it hurts when I raise CO2 levels.”
    To which the Doctor replies: “Don’t do that.”

    Once we have established a causal link between CO2 and climate–and by any reasonable standard, we have–if we remove the cause, we avoid the effect. It would be much more difficult to justify any sort of geoengineering approach without a huge level of effort by your standards, but reducing CO2 emissions is a no-miss solution.

    Comment by Ray Ladbury — 8 Sep 2008 @ 10:27 AM

  16. Reading the paper I came to think of an experience I had, a (large) number of years ago, with what then was called micro simulation. A number of social scientists attached probabilities to each individual in a region as to, for instance, their demographic and economic behaviour and made assumptions as to how they all would react to the behaviour of all the others. (Obviously very complicated calculations, far beyond the grasp of a simple central bureaucrat = me.) What made me suspicious (and here we have my link to the present paper) was that the scientists said that they after each computation checked the aggregate results against macro data on the region in question and when they got “monstrous” and “unrealistic” descriptions of the future region they changed their assumptions until the result was “reasonable”. This “method” made me very doubtful. Do you see the parallel?

    Comment by Gösta Oscarsson — 8 Sep 2008 @ 10:28 AM

    Maurizio, #10.

    OK, so please furnish us with an Earth Mk II so we can avoid having to use a mathematical simulation.

    Or do you have any explanation as to WHY the simulations are incorrect?

    Or are you just looking for an answer you’re happy with?

    Comment by Mark — 8 Sep 2008 @ 10:28 AM

  18. Great post! Does anyone know where can one obtain a fairly detailed laypersons description of modern bio-geo-chemo-climate models anywhere that lists all the positive and negative feedbacks?

    Comment by Doug H — 8 Sep 2008 @ 10:30 AM

  19. If you are asking for change, shouldn’t you be required to accurately predict what your change will produce? But apparently that isn’t possible.

    In this country, we’re terribly familiar with acting upon insufficient information about a potential catastrophe. Our current bill for that “1% possibility” will reach more than $1 trillion before the last veteran maimed by the war is laid to rest.

    Compare the transparency of the science with the political shenanigans which prefaced the war. Yes, there’s complexity with the science of AGW, but complexity is a “mote” compared with the “beam” of the three-card monte game involved in manipulating intelligence.

    Much of the change encouraged by scientists ought to dovetail nicely with the necessity brought about by Peak Oil and the good involved in extricating our political choices from the swamp of Mid Eastern grievances. The solution to all these things looks remarkably similar: burn a lot less carbon.

    Comment by Jeffrey Davis — 8 Sep 2008 @ 10:51 AM

  20. Maurizio Morabito, I would not attach great importance to Spencer’s use of “engineer” as questioner. He is merely taking an engineer as an example of an educated professional not acquainted with details of climate science. As indicated by debacles in the APS Forum on Physics and Society and elsewhere, one could equally take “physicist,” “statistician,” or even “meteorologist.”
    The point stands that without concerted effort to understand details of the science, even an educated person will come to incorrect conclusions. Nobody is saying that the models reproduce reality, but rather that they give us insights into how the real world is working as long as we have prepared ourselves by decades of study.

    Comment by Ray Ladbury — 8 Sep 2008 @ 11:02 AM

  21. “You don’t need a weatherman to know which way the wind blows”

    Is there a similar phrase for climatology?

    Maybe, ‘You don’t need a climatologist to tell which way the heating goes’

    Doesn’t quite have the rhythm and poetry of Bob Dylan. We all want elegant and simple statements that reveal truth about change.

    Comment by Richard Pauli — 8 Sep 2008 @ 11:06 AM

  22. #4: Oops. I don’t think I asked a question. I was just shamelessly offering praise. ;-)

    I thought that your piece was a very clear articulation of why one can’t simply jump from a handful of first-order equations to a prediction of the consequences of a doubling of CO2. Aside from the plethora of processes that even complicated climate models exclude, there’s a big chunk of simulation necessary to tease out all of the emergent properties necessary for answering even seemingly simple climate questions.

    Anyway, great article.

    Andrew.

    Comment by Andrew — 8 Sep 2008 @ 11:20 AM

  23. I’d like to see the models take into account hurricanes. These are vast dissipative systems. But they may be the least of them.

    Comment by austin — 8 Sep 2008 @ 11:45 AM

  24. Following up on #11. If modeling problems are so complex that they defy closed-form solution, a sound engineering response, in my experience, is to establish sufficient correlation of the model to reality before using it to project new scenarios. If it is not capable of predicting reality, the usual request is to go look for the missing variables or relationships until it does correlate sufficiently. 3 questions then:

    (1) How well have climate models of 10-25 years ago predicted climate-scale change over the last 10-25 years?

    (2) Do climate models contain the physical models/factors that recently resulted in the toggle of the PDO and resulting projections by some that we might see a delay in global warming for 10 years or so? And if so, why did they not predict it?

    (3) If these factors are not present, what efforts are ongoing to address this?

    Can you point to any papers or discussions on this topic if the answer is too involved to give here?

    Comment by Douglas — 8 Sep 2008 @ 12:02 PM

  25. A question about the basic physics explanation:

    We understand the basic physics just fine, and can explain it in a minute to a curious non-scientist. (Like this: greenhouse gases let sunlight through to the Earth’s surface, which gets warm; the surface sends infrared radiation back up, which is absorbed by the gases at various levels and warms up the air; the air radiates some of this energy back to the surface, keeping it warmer than it would be without the gases.)

    Earth is in radiative equilibrium, at least over the long term. In a basic-physics explanation, is it better to describe the present warming trend as a transient response (as the system seeks out a new equilibrium at a higher temperature) or as a steady-state response (as a new value of opacity enters into the radiative balance)?
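
    One way to see the transient picture in miniature is a zero-dimensional energy-balance toy: a heat capacity delays the approach to the new equilibrium set by the forcing and the feedback. The heat capacity, forcing and feedback values below are illustrative assumptions, not output from any climate model.

      # Zero-dimensional energy balance: C dT/dt = F - lambda*T.
      # After a step forcing F, the warming relaxes exponentially toward a new
      # equilibrium at F/lambda -- the "transient response" picture.
      C = 8.0e8          # effective heat capacity, J m^-2 K^-1 (roughly 200 m of ocean)
      F = 3.7            # radiative forcing from doubled CO2, W m^-2
      lam = 1.25         # feedback parameter, W m^-2 K^-1 (implies ~3 C at equilibrium)

      dt = 86400.0       # one-day time step, s
      T = 0.0            # warming relative to the old equilibrium, K
      yearly = []
      for day in range(200 * 365):
          T += dt * (F - lam * T) / C
          if day % 365 == 0:
              yearly.append(T)

      print(round(yearly[10], 2), round(yearly[50], 2), round(F / lam, 2))
      # roughly 1.2 K after a decade, 2.7 K after 50 years, ~3 K at equilibrium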

    George

    (forgive me if this is a duplicate posting – I’m having trouble with the captcha)

    Comment by George Musser — 8 Sep 2008 @ 12:10 PM

  26. Spencer Weart, it’s a good question: how can you calculate global warming mathematically?

    Here’s my contribution: a link to an essay that I published recently for nonscientists, “The Case for Modern Anthropogenic Global Warming”. It’s a reply to the contrarian arguments advanced by Alexander Cockburn in 2007 on the CounterPunch website and in the pages of The Nation.

    http://monthlyreview.org/080728farley.php

    Because the essay was written for a nontechnical audience, all equations have been banished to endnotes.

    In this essay, endnotes 12 and 15 give some simple calculations.

    It IS possible to calculate, using the Stefan-Boltzmann equation, the temperature of a hypothetical Earth with no greenhouse gases in the atmosphere. It’s below the freezing point of water. This shows that the greenhouse effect is a BIG effect.
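
    For anyone who wants to see that number fall out, here is a minimal sketch of the Stefan-Boltzmann estimate; the solar constant and albedo are standard textbook values assumed for the illustration, not taken from the essay.

      # Effective temperature of an Earth with no greenhouse gases:
      # absorbed sunlight = emitted blackbody radiation.
      SIGMA = 5.67e-8            # Stefan-Boltzmann constant, W m^-2 K^-4
      S0 = 1361.0                # solar constant, W m^-2
      ALBEDO = 0.3               # planetary (Bond) albedo

      T_bare = (S0 * (1.0 - ALBEDO) / (4.0 * SIGMA)) ** 0.25
      print(round(T_bare), "K")  # about 255 K, i.e. roughly -18 C, well below freezing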

    People are often surprised that global warming, caused by the enhanced greenhouse effect, is large enough to cause a problem, even though carbon dioxide is only present in the atmosphere at a level of hundreds of parts per million. Instead, the real surprise is that global warming is as small as it is!

    -John Farley
    Professor of Physics, UNLV

    Comment by John Farley — 8 Sep 2008 @ 12:12 PM

  27. Re Michael @8: “If you are asking for change, shouldn’t you be required to accurately predict what your change will produce? But apparently that isn’t possible.”

    And that impossibility is what makes it so risky: we know in general what will happen, but we don’t know exactly what will happen, which is why we can’t rule out the worst case scenario. Sounds like precisely a time to invoke the precautionary principle, no?

    Comment by Jim Eager — 8 Sep 2008 @ 1:40 PM

  28. Since the GHG sensitivity estimates are completely based on the results of climate models, a method of testing how accurate the computer models are must be undertaken.

    There is a clear (moral, social, economic and scientific) responsibility to do so given what is at stake.

    [Response: That isn't the case (sensitivity estimates are also made via paleo-climate and modern changes), and secondly, that testing is already being undertaken (read the IPCC AR4 report for an assessment of that work). - gavin]

    Comment by John Lang — 8 Sep 2008 @ 1:44 PM

  29. as far as simple a proof goes, not being a scientist myself i’m pretty happy with ockham’s razor (though final year of high school physics is also pretty useful). i find it entirely plausible that if you pump 37% more of a potent and important GHG into the atmosphere, then you can expect it to have an impact.

    as for engineers, i work with them regularly (civil and mechanical mainly) and they reliably and usefully design structures with a safety margin built in. so, if a beam will be reasonably expected to carry X load, it will actually be designed to carry say a 10X load. this is basically the “just to be on the safe side” factor, because complex systems are necessarily difficult to predict.

    the important and entirely reasonable thing here though is that they design things on the safe side because the consequences of not doing so could be devastating.

    and this is only for one beam on one floor of one building, we’re talking about the very planet that sustains us.

    i’m not particularly worried about super accurate predictions, the stakes are just so high and human behaviour and the climate system so complex, i’d rather play it safe with my future (and my children’s, yours, your children’s, etc, etc).

    the quest for super accurate predictions actually reminds me of a Borges story where a people are so obsessed with making the perfect map they end up with a map as big as the land it maps. we can wait for perfectly accurate predictions but they will be useless because they will only ever be perfectly accurate *after* the event has been observed.

    Comment by anna — 8 Sep 2008 @ 1:53 PM

  30. Jeffrey Davis says: “Compare the transparency of the science with the political shenanigans which prefaced the war.”

    Oh, let’s not, shall we? I’d really like to hold science to a higher standard than politics–particularly the politics of a kleptocratic regime.

    [Response: Agreed. The war is OT. - gavin]

    Comment by Ray Ladbury — 8 Sep 2008 @ 2:01 PM

  31. Gösta Oscarsson says: “What made me suspicious (and here we have my link to the present paper) was that the scientists said that they after each computation checked the aggregate results against macro data on the region in question and when they got “monstrous” and “unrealistic” descriptions of the future region they changed their assumptions until the result was “reasonable”. This “method” made me very doubtful. Do you see the parallel?”

    Actually, not at all. Big difference between this and the dynamical approach used for climate modeling.

    Comment by Ray Ladbury — 8 Sep 2008 @ 2:11 PM

  32. Re John Farley @26, your Monthly Review critique of Cockburn’s arguments is a thorough and valuable resource, but I fear your rebuttal of Contrarian Claim 3, the lag of CO2 increase behind temperature rise at the end of an ice age, has a serious deficiency in that it does not describe the subsequent role of CO2 as a feedback in adding more warmth, and thus the ability of CO2 to act as both a feedback in the case of deglaciation, and as a forcing when added directly to the atmosphere as at present.

    Comment by Jim Eager — 8 Sep 2008 @ 2:13 PM

  33. John Lang Says:
    8 September 2008 at 1:44 PM

    ” Since the GHG sensitivity estimates are completely based on …” and then states his misunderstanding.

    John, where did you get this mistaken idea? Did you read it somewhere? Could be you’re misreading a good source, or you’re being fooled by fake “facts” — ??

    Comment by Hank Roberts — 8 Sep 2008 @ 2:24 PM

  34. “These people, typically senior engineers, get suspicious”.

    Please, can we get deeper than “senior engineers” – that really isn’t improving insight. If we want to do that, we need to probe a lot deeper than just “senior engineers”.

    Let me offer a speculation, although not yet a serious hypothesis:

    1. SPECULATION
    Amongst technically-trained people, and ignoring any economic/ideological leanings:

    1) Some are used to having
    a) Proofs
    OR
    b) Simple formulae
    OR
    c) Simulations that provide exact, correct answers, and must do so to be useful
    d) And sometimes, exposure to simulations/models that they think should give good answers, but don’t.

    2) Whereas others:
    a) Are used to missing data, error bars,
    b) Complex combinations of formulae
    c) Models with varying resolutions, approximations, and that give probabilistic projections, often only via ensemble simulations.
    d) Models that are “wrong”, but very useful.

    My conjecture is that people in category 1) are much more likely to be disbelieving, whether in science, math, or engineering.

    2. ANECDOTAL EXAMPLES:
    1) In this thread, a well-educated scientist (Keith) was convinced that climate models couldn’t be useful, because he was used to models (protein-folding) where even a slight mismodel of the real world at one step caused final results to diverge wildly … just as a one-byte wrong change in source code can produce broken results.

    See #197 where I explained this to him, and #233 where light dawned, and if you’re a glutton for detail: #66, #75, #89, #1230, #132, #145, #151, #166 for a sample.

    2) See Discussion here, especially between John O’Connor and me. See #64 and #78. John is an EE who does software configuration management. When someone runs a rebuild of a large software system, everything must be *perfect*. There’s no such thing as “close”.

    Also in that thread, Keith returned with some more comments (#137) and me with (#146), i.e., that protein-folding was about as far away from climate modeling as you could get.

    3) Walter Manny is a Yale EE who teaches calculus in high school. He’s posted here occasionally (Ray may recall him :-), and participated in a long discussion at Deltoid, and has strong (contrarian) views. In many areas of high school/college math, there are proofs, methods known for centuries, and answers that are clearly right or wrong.

    4) “moonsoonevans” at Deltoid, in #21 & #32 describes some reasons for his skepticism, #35 is where light dawns on me. He’s in financial services, had experienced many cases where computer simulations done by smart people didn’t yield the claimed benefits. In #35 I tried to explain the difference.

    All this says that if one is talking with an open-minded technical person, one must understand where *they* are coming from, and be able to give appropriate examples and comparisons, because many people’s day-to-day experience with models and simulations might lead them to think climate scientists are nuts.

    3. A FEW SPECIFIC DISCIPLINES & CONJECTURES

    1) Electrical engineers (a *huge* group, of which only tiny fraction are here)

    Many EEs these days do logic design, which requires (essentially) perfection, not just in the design, but (especially) in simulation.

    Design + input =>(logic simulator) => results

    At any step, the design may or may not be bug-free, but the simulator *must* predict the results that the real design would do given the input, exactly, bit for bit. Many test-cases have builtin self-checks, but the bottom line is that every test-case yields PASS or FAIL, and the simulator must be right.

    Many people buy simulators (from folks like Cadence or Synopsys), and run thousands of computers day and night simulating millions of test-case inputs. But, with a million test-cases, they’re not looking for an ensemble that provides a distribution, they’re looking for the set of test-cases to cover all the important cases, and for EVERY one to pass, having been simulated correctly. This has some resemblance to the protein-folding problem mentioned above.

    Now, at lower levels of timing and circuit design, it isn’t just ones and zeroes (there’s lots of analog waveforms, probabilistic timing issues, where one must guarantee enough margin, etc). When I’d tease my circuit designer friends “Give me honest ones and zeroes”, they’d bring in really ugly, glitchy HSPICE waveforms and say “so much for your ones and zeroes”. (This is more like the molecular “docking” problems that Keith’s colleagues mentioned.)

    At these levels, people try to set up rules (“design rules”) so that logic designers can just act at the ones-and-zeroes level.

    If one looks at EEs who worry about semiconductor manufacturing, they think hard about yields, failure attribution, and live with time-series. (Standard answer to “We got better yield this month, how do you think it looks?” was “Two points don’t make a trend.”)

    2) Software engineers

    Programs often have bugs, but even a bug-free program can fall apart if you change the wrong one byte of code, i.e., fragile. (I don’t recall the source, but the old saw goes something like: if skyscrapers were like software, the first woodpecker would knock them over.)

    Configuration management / software rebuilds are fairly automated these days, and they must be correct. One cannot include the wrong version of code, or compile with incompatible options.

    Performance engineering and benchmarking tend to be more probabilistic-oriented, and although a lot of people want to believe in one number (once the mythical “MIPS” rating), we’ve (mostly) fixed that over the last 20 years. Good performance engineers have always given relative performance ranges and said YMMV (Your Mileage May Vary).

    3) Mechanical engineers

    This, I expect, varies. In some cases, closed-form equations work pretty well. In other cases, one is using big structural dynamics and computational fluid dynamics codes to obtain “good-enough” approximations to reality before actually building something. For example, automobiles are extensively modeled via “crash codes”.

    4) Petroleum engineers

    It’s been a while, but certainly, people who do seismic analysis and reservoir modeling *start* with data from the real world, analyze it to make decisions, so ought to be a little more accustomed to probabilistic analyses.

    5) Financial engineers (Google: financial engineering)

    Not having physics to constrain simulations yields some wild results, although at least some people are very comfortable with risk, uncertainty, and ensemble projections. I especially like Sam Savage’s “Flaw of Averages”.
    On the other hand, when Nobel economists lose billions (LTCM), I’m not surprised there is skepticism about climate models.

    4. CONCLUSIONS

    That’s a speculative start. I do *not* think lumping a large group together as “senior engineers” helps progress, because I have at least anecdotal evidence that the sources of skepticism tend to be attached to the kinds of models and (maybe) simulations that someone does day-by-day. The problem is that many people tend to generalize from their discipline to others, and especially if they have trouble getting useful models, they tend to be suspicious of others’.

    At one extreme, people have long-established mathematical proofs, and answers that are clearly right or wrong.

    At the other extreme, people have to make decisions based on the best approximations they can get, and if their discipline has good-enough approximations, they tend to think one way, and if the approximations aren’t so good, they may think another about equations and climate models.

    Comment by John Mashey — 8 Sep 2008 @ 2:31 PM

  35. re: 28

    “given what is at stake.”

    If the likelihood of a ruinous car crash were greater than 2.5% per year, the insurance companies wouldn’t let me in the door. By the time we get to a rise of 2C, I think the issues of precaution and mitigation are going to be moot. One can’t divvy up probabilities exactly, but the IPCC pegs a rise in temps of between 1.5-4.5C at the 95% level of certainty. Do you think you could get car insurance at that level of risk?

    Comment by Jeffrey Davis — 8 Sep 2008 @ 2:32 PM

  36. Re 27: Jim Eager:
    “Sounds like the time to invoke the precautionary principle.”

    It depends on the cost of the precaution, the magnitude of adverse consequences, and the degree of certainty.

    If, to bring down CO2 levels to 275 ppm or so, we need a global WWII type mobilization, rationing, and restriction of freedoms, then we better have one heck of a solid case!

    Comment by Fred Jorgensen — 8 Sep 2008 @ 2:41 PM

  37. John Farley, #26. I followed your link, and read your footnote 15 which is of particular interest to those (like myself) who are looking for methods to estimate climate sensitivity without using GCMs:

    15. Someone who distrusts all models can examine the change in temperature and solar forcing between glaciation (ice age) and interglacial (no ice age) periods. The change in temperature, revealed in ice core samples, is 5˚C, while the change in solar forcing is 7.1 W/m2. The computed climate sensitivity is therefore 5/7.1 = 0.7 K (W/m2)^-1. We can use this empirically derived climate sensitivity to predict the temperature rise from a forcing of 4 W/m2, arising from a doubling of the atmospheric CO2 from pre-industrial levels. The result is a predicted temperature increase of 3˚C, which is perfectly consistent with the IPCC predictions of 1.5 to 3.5˚C (Seinfeld and Pandis, Atmospheric Chemistry and Physics, 1102).
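
    Spelled out in code, the footnote’s arithmetic is just the following; only the numbers quoted above are used.

      # Empirical sensitivity arithmetic from the footnote quoted above.
      dT = 5.0                    # glacial-interglacial temperature change, K
      dF = 7.1                    # corresponding forcing change, W m^-2
      sensitivity = dT / dF       # ~0.70 K per W m^-2
      warming_2xCO2 = sensitivity * 4.0   # footnote's 4 W m^-2 for doubled CO2
      print(round(sensitivity, 2), round(warming_2xCO2, 1))
      # 0.7 and ~2.8, which the footnote rounds to ~3 C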

    The problem with this argument is the glacial – interglacial temperature increase is accompanied by a large decrease in the amount of ice and snow covering the earth, and hence in the amount of sunlight reflected back towards space. Today there is relatively little ice left, and what there is is near the poles which contributes little to reflection (the poles are dark for half the year and for the other half the sun is on the horizon so they present very little “cross-section” to the sun). Thus, we would expect a much stronger positive feedback from glacial to interglacial than we would today from a similar increase in forcing.

    [Response: That is already factored in since the ice sheets are imposed as a forcing in this kind of calculation. In general, you are probably correct - the existence of extensive snow/ice cover increases sensitivity, but for the ranges of climate change we expect in the next 100 or so years that does not seem to be a big effect. - gavin]

    Comment by mugwump — 8 Sep 2008 @ 2:44 PM

  38. Re Fred Jorgensen @36: “It depends on the cost of the precaution, the magnitude of adverse consequences, and the degree of certainty.”

    Fred, you left out the cost of not taking the precaution, and the magnitude of adverse consequences of not taking the precaution, yet you specifically set the bar at 275 ppm.

    Now why would you do that?

    Comment by Jim Eager — 8 Sep 2008 @ 3:17 PM

  39. Dear Maurizio Morabito #10,
    I think there is actually a more encouraging explanation to Spencer Weart’s observations, which also supports your statement.
    Firstly, at our excellent universities we train scientists and engineers in the basics of their respective fields during their undergraduate studies.
    Secondly, during their graduate studies we encourage them to go beyond the basics, discover new things, and go beyond the current knowledge and acquire new knowledge on their own. One way to do this is challenging hypotheses, theories, and assumptions of complex models and I guess even sometimes the consensus.
    That is how progress is made at the universities and in the six sigma breakthrough methodology in the industry.
    Thirdly, in industry we also train them to look for simple rules and models that can be implemented with ease in a production process. This is sometimes necessary to make money.
    Actually, I think the linear feedback model that was adapted by climate science is just a typical linearization you would apply in industry to model a more complex problem.
    I also agree with you, many of them therefore understand the complexities of real world problems.
    Therefore, I am very optimistic that with the help of all those excellently trained scientists and engineers mankind will survive the warming.

    Comment by Guenter Hess — 8 Sep 2008 @ 4:11 PM

  40. Fred Jorgensen (36) — Don’t need any of that, just 1–2% of the world’s gross product for 70–100 years. Probably it can be done for quite a bit less eventually. That shouldn’t stop us from starting now.

    Comment by David B. Benson — 8 Sep 2008 @ 4:19 PM

  41. Spencer Weart, thanks for the great writeup.

    Gavin,

    concerning your last response to mugwump, I guess I assumed incorrectly that Milankovitch forcing was amplified by reflective and GHG feedbacks… but instead ice sheet responses are considered a forcing? I may just be getting tripped on words rather than anything radiatively significant (I’m assuming that the “forcing” part is a reduction in surface albedo, though the semantics make no sense to me). How exactly are ice sheets to be considered a forcing, unless we consider freshening of the THC or something along those lines? I do agree with the comment though, that under situations of the same radiative forcing, two Earths (one with ice, one with little ice) would expect differences in sensitivity. But you’d have better insight to the actual quantitative aspects of that than me.

    [Response: It's the difference between the Charney sensitivity (fast feedbacks included, ice sheets, vegetation change not) and the Earth System Sensitivity (all feedbacks included). Most discussion has focused on the former, and that is the context for the LGM calculation. The ESS is different - and indeed if you try and calculate the ESS to orbital forcing you get extreme values (because the regionality/seasonality of the orbital forcing is so important for the ice sheets). - gavin]

    Comment by Chris Colose — 8 Sep 2008 @ 4:57 PM

  42. Doug H. wrote:
    > Does anyone know where can one obtain a fairly
    > detailed laypersons description of modern
    > bio-geo-chemo-climate models anywhere

    Yes, the IPCC, or the Start Here links at top of page.

    > that lists all the positive and negative feedbacks?

    Nope, it’s not such a simple question that all the answers are completely known even for a single moment in time, and they will change over time.

    Example (you can follow articles like this forward in time to see how the questions get worked on):

    from twenty-one years ago (before ocean pH change was noticed as an issue):
    http://www.see.ed.ac.uk/~shs/Global%20warming/Data%20sources/Charlson.pdf

    Comment by Hank Roberts — 8 Sep 2008 @ 5:16 PM

  43. Re 38: Jim Eager:
    The cost of not taking the precaution would be the ‘Adverse Consequences’ in the broadest term. NASA’s Jim Hansen recently set the bar at 350 ppm, but there have been calls for pre-industrial, historically stable levels of 280 or so.

    Re. 36: David B. Benson: But the world’s GDP isn’t the benchmark! China and India want US standard of living, with the premium for low CO2 technology paid by the developed world. Much more than 1-2% of our GDP I would think! And we ‘may’ only have 10 years or so before the speculative ‘tipping point’! Looks like an enormous project, so we’d better NOT start down that path without a very high degree of certainty, since the cost of ‘precaution’ (insurance) could be used to solve more critical problems elsewhere.

    Comment by Fred Jorgensen — 8 Sep 2008 @ 5:36 PM

  44. Hank Roberts Says:
    8 September 2008 at 2:24 PM
    “John Lang Says:
    8 September 2008 at 1:44 PM

    ” Since the GHG sensitivity estimates are completely based on …” and then states his misunderstanding.

    John, where did you get this mistaken idea? Did you read it somewhere? Could be you’re misreading a good source, or you’re being fooled by fake “facts” — ??”

    That is what the commentary by Spencer Weart says.

    Comment by John Lang — 8 Sep 2008 @ 5:57 PM

  45. Fred Jorgensen (43) — Well, when I worked it out, 1–2% was enough to

    (1) Deeply sequester, as biochar or torrefied wood, enough carbon to offset the annual additions of excess carbon to the active carbon cycle;

    (2) Just enough funding left to do the same with about 1% of the existing excess carbon; hence the need for 70–100 years.

    There are many variations on the above theme; it could be combined with some CO2 CCS; some of the torrefied wood could directly replace coal; etc.

    The idea is simply to show that it can be done; we ought to start doing some of it right away. Moving away from fossil fuels will not alone suffice; just concrete production will surely continue to contribute about 0.4 GtC per year.

    Comment by David B. Benson — 8 Sep 2008 @ 6:04 PM

  46. John Mashey: Let me offer another speculation. “Senior Engineers” and other Senior Technical Leaders who are worth their salt, think more broadly and have a knack for telling scat from shinola. The people to whom you refer are stuck in a very narrow line of thinking and are unable to see the broader picture.

    If anyone out there thinks we engineers are a tough crowd, watch out for the bakers and steelworkers ;)

    Comment by The Wonderer — 8 Sep 2008 @ 6:12 PM

  47. The linked presentation should appeal to engineers: it explains surface temperature in an understandable way using only gravity, radiation and convection; the GHG effect is proved unnecessary. It would be appreciated if someone could point out the error or provide a link to the rebuttal.

    The Thermodynamic Atmosphere Effect
    - explained stepwise
    by Heinz Thieme
    German Version: http://people.freenet.de/klima/indexAtmos.htm
    Using a set of technical models of planets with and without an atmosphere the reasons are explained for differences in surface temperature of the planet without an atmosphere compared with the temperature in the ground layer of atmosphere of the other planet. The differences are caused by thermodynamic characteristics of these gases, which cause the mass and the composition of the atmosphere, and the atmospheric pressure, which results from gravity.
    Trying to avoid the spam filter, link to english version is at…
    http://www.geo(largetowns).com/atmosco2/atmos.htm

    [Response: Nonsense, I'm afraid. He assumes that the planet has an adiabatic lapse rate (heating with increasing pressure), a very specially chosen amount of mass and radiation only at the edge of the planet. This might work, but does not resemble the real world in any respect. - gavin]

    Comment by Richard Hill — 8 Sep 2008 @ 6:34 PM

  48. Interesting, there is no simple answer. I am curious why you didn’t reference the brilliant math used (2 plus 4 divided by 2 equals 3 degrees) to determine climate sensitivity. Tsonis et al. have a paper on dynamical models that isn’t all that simple to understand. I do recommend it as a good read though.

    Comment by captdallas2 — 8 Sep 2008 @ 6:42 PM

  49. Thanks for your clear explanation of some of the basic concepts Spencer. Your book is very clear and well written too.

    One aspect of the “basic” physics that puzzles me is the lag time of CO2 from burning fossil fuels in the atmosphere.

    Archer (2007: 123) suggests, “The bottom line is that about 15-30% of the CO2 released by burning fossil fuel will still be in the atmosphere in 1000 years, and 7% will remain after 100,000 years.”

    Why is there “only” a lag of 800 years in the icecore record between the earth coming out of a glacial period and the response of CO2?

    Reference: Archer D (2007) “Global Warming: Understanding the Forecast” (Blackwell Publishing).

    Comment by Chris McGrath — 8 Sep 2008 @ 6:45 PM

  50. re #46: “That is what the commentary by Spencer Weart says.”

    Not anywhere that I could see; there is a lot about what is needed to model the greenhouse effect, to be sure, but nowhere a statement about what other evidence may or may not exist.

    For a bit more about the other evidence, see Gavin’s inline response to the original query at #28, with more at #37.

    Comment by Kevin McKinney — 8 Sep 2008 @ 7:06 PM

  51. > sensitivity estimates are completely based on

    John Lang, you say you read that in the Commentary.
    It’s not there. Search the words. Is it an interpretation of something you read? What?

    Comment by Hank Roberts — 8 Sep 2008 @ 7:44 PM

  52. Chris McGrath (49) wrote “Why is there ‘only’ a lag of 800 years in the icecore record between the earth coming out of a glacial period and the response of CO2?”

    That’s what is seen in the Antarctic ice cores, although there is a similar delay from Pacific Warm Pool data. It is a balance between the warming oceans expressing CO2 and the changes in the terrestrial carbon pool.

    My amateur reading of the matter.

    Comment by David B. Benson — 8 Sep 2008 @ 7:54 PM

  53. Engineers work in the real world. When we build bridges, we make sure ‘they don’t fall down!’ But the real story is: ‘Probably won’t fall down for 100 years.’, and then we schedule inspections and maintenance, because in the real world ‘sh*t happens!’ in spite of our elaborate models and building standards. So when climate scientists and theoreticians show us computer projections and fancy statistics, well, ‘We’re from Missouri!’. The simple graphs of steady CO2 increase and up-and-down temperature over the last 100 years or so have this engineer slightly sceptical. Sorry. We’re a tough crowd!

    [Response: No. You are an impossible crowd (well, a crowd of one perhaps). How many times have you been told patiently that the impact of CO2 is not derived from correlations over the last 100 years, or 100,000 years? How many times have you been pointed to the physics of radiative transfer or estimates of climate sensitivity? It's clearly dozens. You haven't moved from your incorrect idea one iota. Why should anyone continue to discuss with you? - gavin]

    Comment by Fred Jorgensen — 8 Sep 2008 @ 10:16 PM

  54. RE John Mashey #34:

    Part of my skepticism does indeed derive from my day job, which involves pretty heavy duty computer modeling, but not of the climate. I know how easy it is to overfit when you snoop the test data. In fact, we don’t consider a model validated until we’ve tested it against completely unseen data. Climate modelers have spent years tweaking heavily parameterized models against a very limited set of data. They are almost guaranteed to have overfit.

    Another part of my skepticism is a kind of feedback. If you ask skeptical (but informed) questions of many prominent “mainstream” climate scientist bloggers you are often treated with disdain, or worse. That kind of sneering attitude only serves to make me more skeptical.

    A third part of my skepticism arises from the extent to which the global warming issue has been hijacked by environmentalists who seek to use it to further their political goals.

    The final part of my skepticism arises from the extent to which any published work that deviates from the “consensus” climate change view is immediately eviscerated by blogs such as this and by a flurry of negative “comment” responses in the open literature. The more rigorous the deviant research, the greater the attacks. In the words of WS himself, too often the lady doth protest too much.

    At present, the only rational position (in my opinion, of course), is a skeptical one.

    Comment by mugwump — 8 Sep 2008 @ 11:06 PM

  55. If you want to see how the other side thinks, see my latest interview in which I interviewed a man who does not believe in global warming. NSFW. But his views represent what a lot of Americans think and feel. It’s good to know the other side. I mean, what we are up against. Forget polar cities. This comes first.

    http://northwardho.blogspot.com

    And while I am here: a professor named Jurgen Scheffran, a research scientist in the Program in Arms Control, Disarmament and International Security and the Center for Advanced BioEnergy Research at the University of Illinois, is among those raising concerns. In a survey of recent research published earlier this summer in the Bulletin of the Atomic Scientists, Scheffran concluded that “the impact of climate change on human and global security could extend far beyond the limited scope the world has seen thus far.”

    Scheffran’s review included a critical analysis of four trends identified in a report by the German Advisory Council on Global Change as among those most possibly destabilizing populations and governments: degradation of freshwater resources, food insecurity, natural disasters and environmental migration.

    “Most critical for human survival are water and food, which are sensitive to changing climatic conditions,” Scheffran said.

    The degradation of these critical resources, combined with threats to populations caused by natural disasters, disease and crumbling economic and ecosystems, he said, could ultimately have “cascading effects.”

    “Environmental changes caused by global warming will not only affect human living conditions but may also generate larger societal effects, by threatening the infrastructures of society or by inducing social responses that aggravate the problem,” he wrote. “The associated socio-economic and political stress can undermine the functioning of communities, the effectiveness of institutions, and the stability of societal structures.

    These degraded conditions could contribute to civil strife, and, worse, armed conflict.”

    In addition to global cooperation, Scheffran believes that those occupying Earth now can learn a lot about the future by studying the past.

    “History has shown how dependent our culture is on a narrow window of climatic conditions for average temperature and precipitation,” he said. “The great human civilizations began to flourish after the last ice age, and some disappeared due to droughts and other adverse shifts in the climate. The so-called ‘Little Ice Age’ in the northern hemisphere a few hundred years ago was caused by an average drop in temperature of less than a degree Celsius.

    “The consequences were quite severe in parts of Europe, associated with loss of harvest and population decline,” Scheffran said. “Riots and military conflicts became more likely, as a recent empirical study has suggested.”

    Comment by danny bloom — 8 Sep 2008 @ 11:39 PM

  56. #47, Gavin’s inline response
    Gavin,
    The Thermodynamic Atmosphere Effect just uses Ray Pierrehumbert’s formula 3.8 from his online Climate Book as a model, don’t you think?
    Ts = (ps/prad)^(R/cp)*Trad
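
    For a concrete number, a minimal sketch of that relation, assuming dry-air R/cp of 0.286, a 1000 hPa surface and an illustrative radiating-level pressure of 670 hPa (the last two values are assumptions for the sketch, not taken from Thieme or Pierrehumbert):

      # Dry adiabat relation: Ts = Trad * (ps/prad)^(R/cp)
      T_rad = 255.0              # effective radiating temperature, K
      p_s, p_rad = 1000.0, 670.0 # surface and (assumed) radiating-level pressure, hPa
      kappa = 0.286              # R/cp for dry air

      T_s = T_rad * (p_s / p_rad) ** kappa
      print(round(T_s, 1))       # ~286 K; the relation gives the lapse-rate structure
                                 # but does not by itself explain where p_rad comes from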

    [Response: That's the adiabat. But it isn't the greenhouse effect. - gavin]

    Comment by Guenter Hess — 8 Sep 2008 @ 11:44 PM

  57. Gavin et al.

    Is it possible to make the blog reaction always point to the shortest url: e.g. http://www.realclimate.org/index.php/archives/2008/09/simple-question-simple-answer-no and not e.g. http://www.realclimate.org/index.php/archives/2008/09/simple-question-simple-answer-no/langswitch_lang/tk

    I think that would be helpful…

    [Response: Click on the English flag on the sidebar. This behaviour is due to an intersection between the cache and the language-switch code. You can always remove the langswitch tag and reload. - gavin]

    Comment by Magnus Westerstrand — 9 Sep 2008 @ 1:11 AM

  58. I am now retired, but in my former career, I used to be in charge of the work done by the US government involved in evaluating computer models of nuclear power plants. These programs modeled core physics, fuel thermal-mechanical behavior, and thermal-hydraulics in the core and in the entire nuclear coolant system. These programs were developed over the past 60 years of the nuclear industry by the various nuclear vendors and by the government, as well, at a cost of billions of dollars.

    These programs are quite complex, having to take into account a large number of variables and factors, from basic uncertainties in the underlying physics, to uncertainties in material properties, to uncertainties associated with human performance and manufacturing. A LOT of this work was done at the behest of environmentalists in the early 1970s, who complained that the analytical models used to “prove” the safety of the plants were not complete or understandable. Dr. Weart should be very familiar with this effort, inasmuch as he wrote a very interesting book on nuclear power and nuclear issues (“Nuclear Fear”) in the 1980s.

    All of the models that I used to evaluate, and the ones that I used in the government to evaluate those produced by the vendors, were extensively documented. The government developed a strict methodology to allow the quantitative evaluation of the uncertainty in the models, which is now used by all three reactor vendors for their loss-of-coolant accident models. The uncertainty methodology is sufficiently general that it can be applied to a wide range of computer models, including the atmospheric models described above. It starts with an evaluation of the state of understanding of the physics, and the mathematical models used to describe the physics, and then proceeds to use a strict process to introduce variations in the important parameters and conflate the results of different “runs” to estimate the overall uncertainty of the model.
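
    As a generic illustration of that parameter-variation step, here is a toy Monte Carlo propagation of input uncertainties; the stand-in model and its parameter ranges are invented for the sketch and are not from any reactor or climate code.

      import random

      def toy_model(k1, k2):
          # stand-in for an expensive simulation with two uncertain inputs
          return 100.0 * k1 / (1.0 + k2)

      random.seed(0)
      outputs = []
      for _ in range(10000):
          k1 = random.gauss(1.0, 0.05)     # parameter 1: best estimate 1.0, 5% standard deviation
          k2 = random.uniform(0.8, 1.2)    # parameter 2: flat range from (hypothetical) documentation
          outputs.append(toy_model(k1, k2))

      outputs.sort()
      print("median:", round(outputs[5000], 1),
            "95th percentile:", round(outputs[9500], 1))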

    Use of this method produces documentation that can be scrutinized by other engineers to determine the suitability of the model. I am highly suspect of modelers who say that their methods are too complex to document properly, especially if the consequences are so dire as they say. It is important for these models to be publicly available, on the internet, with full documentation of their underlying physics and all of the runs that have been made to produce the results.

    These are extraordinary claims (the end of the world) being made by the modelers, and they demand extraordinary proof.

    I am not a climate scientist, but if the climate scientists cannot produce documentation like this, then their claims cannot be believed.

    Comment by rxc — 9 Sep 2008 @ 2:03 AM

  59. Re 4. Spencer & 3. Andrew,

    the study is

    Diffenbaugh, Noah S., Filippo Giorgi, and Jeremy S. Pal, 2008. Climate change hotspots in the United States. Geophys. Res. Lett., 35, L16709, doi:10.1029/2008GL035075, August 30, 2008, online http://www.purdue.edu/eas/earthsystem/Diffenbaugh_GRL_08.pdf (8,52 MB)

    Comment by Timo Hämeranta — 9 Sep 2008 @ 2:36 AM

  60. About models please see

    Reichler, Thomas, and Junsu Kim, 2008. How Well do Coupled Models Simulate Today’s Climate? BAMS Vol. 89, No 3, pp 303-311, March 2008, online http://ams.allenpress.com/archive/1520-0477/89/3/pdf/i1520-0477-89-3-303.pdf

    “…, we note the caveat that we were only concerned with the time-mean state of climate. Higher moments of climate, such as temporal variability, are probably equally as important for model performance, but we were unable to investigate these. Another critical point is the calculation of the performance index. For example, it is unclear how important climate variability is compared to the mean climate, exactly which is the optimum selection of climate variables, and how accurate the used validation data are. Another complicating issue is that error information contained in the selected climate variables is partly redundant. Clearly, more work is required to answer the above questions…”

    About climate sensitivity please see

    Rind, David, 2008. The Consequences of Not Knowing Low- and High-Latitude Climate Sensitivity. BAMS Vol. 89, No 6, pp. 855-864, June 2008, online http://ams.allenpress.com/archive/1520-0477/89/6/pdf/i1520-0477-89-6-855.pdf

    “Along with the continuing uncertainty associated with global climate sensitivity [2°–4.5°+, for doubled CO2 in the latest Inter-governmental Panel on Climate Change (IPCC) report], we have not made much progress in improving our understanding of the past/future sensitivity of low- and high-latitude climates. Disagreements in paleoclimate interpretations, and diverse results from the IPCC Fourth Assessment Report future climate model simulations suggest that this uncertainty is still a factor of 2 in both latitude regimes. Cloud cover is the primary reason for model discrepancies at low latitudes, while snow/sea ice differences along with cloud cover affect the high-latitude response. While these uncertainties obviously affect our ability to predict future climate-change impacts in the tropics and polar regions directly, the uncertainty in latitudinal temperature gradient changes affects projections of future atmospheric dynamics, including changes in the tropical Hadley cell, midlatitude storms, and annual oscillation modes, with ramifications for regional climates. In addition, the uncertainty extends to the patterns of sea surface temperature changes, with, for example, no consensus concerning longitudinal gradient changes within each of the tropical oceans. We now know a good deal more about how latitudinal and longitudinal gradients affect regional climates; we just do not know how these gradients will change…”

    Current state in Climatology in plain English…

    [Response: As always, what is your point? And why do you think papers that demonstrate a) that climate models have been getting better over time, and b) another that says that regional latitudinal gradients are hard to predict, have to do with the importance of complexity in explaining what is going on? Throwing in random citations with cherry-picked quotes is a well worn tactic, but please, try and be a little relevant. - gavin]

    Comment by Timo Hämeranta — 9 Sep 2008 @ 2:59 AM

  61. Chris, #49.

    The simplest answer is that in the ancient past there were no dino drilling platforms. So 10,000 years to sequester 7% of fossil fuels has nothing to do with the past CO2 expression.

    Comment by Mark — 9 Sep 2008 @ 3:28 AM

  62. I have a quick-and-dirty way to calculate planet surface temperatures here.
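
    For readers who want to see what such a quick-and-dirty estimate can look like, here is the generic textbook effective-temperature calculation (not necessarily the one linked above; the solar constant and albedo are assumed Earth-like values):

        SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

        def effective_temperature(solar_constant, albedo):
            # Equilibrium blackbody temperature of a planet absorbing (1 - albedo) of the
            # incoming sunlight, averaged over its whole surface (hence the factor 4)
            absorbed = solar_constant * (1.0 - albedo) / 4.0
            return (absorbed / SIGMA) ** 0.25

        t_eff = effective_temperature(1361.0, 0.30)
        print(f"Effective temperature ~ {t_eff:.0f} K")   # about 255 K
        # The observed mean surface temperature is ~288 K; the ~33 K difference
        # is what the greenhouse effect has to explain.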

    Comment by Barton Paul Levenson — 9 Sep 2008 @ 3:33 AM

  63. Fred, #43. WHAT more pressing needs are there for insurance? And are you an economist to know what % of GDP would be needed? Have you worked on a model for it?

    And as a % of GDP of the US, what’s spent on securing oilfields annually? Not even fazing the current administration.

    Comment by Mark — 9 Sep 2008 @ 3:33 AM

  64. Gosta, #16

    No.

    The weather doesn’t have free will. It can’t change based on what it thinks will happen in the future.

    People can. People do.

    Comment by Mark — 9 Sep 2008 @ 3:45 AM

  65. Dr. Farley,

    I take on Cockburn here. It’s very reminiscent of shooting fish in a barrel. :)

    Comment by Barton Paul Levenson — 9 Sep 2008 @ 3:45 AM

  66. Fred Jorgenson posts:

    If, to bring down CO2 levels to 275 ppm or so, we need a global WWII type mobilization, rationing, and restriction of freedoms, then we better have one heck of a solid case!

    Fortunately, we don’t need all that — even though we do have one heck of a solid case! A carbon emissions trading scheme and a push for renewable sources of energy will work just fine.

    Comment by Barton Paul Levenson — 9 Sep 2008 @ 3:49 AM

  67. Guenther Hess says:

    Therefore, I am very optimistic that with the help of all those excellently trained scientists and engineers mankind will survive the warming.

    Mankind will probably survive, yes. The current civilization may not. Athens had the best philosophers in the world, but it didn’t stop Alexander the Great from annexing it. The Nazis had the most brilliant physicist in the world working for them — Werner Heisenberg — but their atom bomb program failed miserably. You can probably think of additional examples.

    Comment by Barton Paul Levenson — 9 Sep 2008 @ 4:06 AM

  68. Dear Mr. Weart,

    You say greenhouse gas warming is not possible to predict, but one piece of information I picked up watching an Australian scientific meeting mentioned a number of studies, including a University of Chicago study, predicting that CO2 has little or no warming ability past 22 ppm (parts per million).

    This is based on CO2’s well-known infra-red radiative profile, known and solid science for nearly a century.

    Namely, once CO2 reaches 22 ppm in Earth’s atmosphere it effectively reaches the limit of its warming ability, as CO2 is able to reflect/force only certain infra-red wavelengths; that’s achieved/blocked at 22 ppm, and past that any additional CO2 (to 380 ppm or 600 ppm) does no additional ‘work’.

    It’s like sunglasses with 2 of, say, 4 filters: it doesn’t matter how many extra pairs you wear, 2 sections of light always get through and 2 are always blocked. Add 20 new pairs with the same light filters and it makes no difference!

    Water vapour, the most powerful GHG, has a very broad infra-red wavelength forcing ability in contrast.

    As I say, this has been known science for nearly a century and for me is the killer punch, amongst many, that KO’s the global warming hoax. Can you confirm please?

    [Response: None of this is true. Where is your source and why do you think it's trustworthy? For more details about why this is wrong, read these previous posts (here and here). - gavin]

    Comment by JohnnyB — 9 Sep 2008 @ 5:45 AM

  69. Johnny B, #56.

    Put one blanket on the bed. You are now insulated 100% against losses from radiation and convection. So putting a second blanket on is a waste of time, isn’t it? After all, everything that a blanket stops is already stopped by the one blanket you have on now.

    Comment by Mark — 9 Sep 2008 @ 6:00 AM

  70. So, Spencer, it’s Science but not as we know it. No assumptions which can be tested from first principles, no equations which can be checked, and nothing which can be tested against experiments or measurements. Prior experiments (Angstrom’s saturation test in 1905) are dismissed as “botched” (by Gavin) but not repeated.

    [Response: I said no such thing, and these experiments have been repeated hundreds of times at much higher accuracy (look up HITRAN). - gavin]

    Quantum theory is invoked, with photons like little silver bullets random-walking their way into space.

    At the long end of the spectrum the dominant behaviour in the wave-particle duality is the wave. Absorption is resonant. “We see, to our amazement, that the photon frequency associated with the transition is just an integer multiple of the classical resonant frequency of the harmonic oscillator”, from A First Course in Atmospheric Radiation, page 252, by Grant W. Petty, University of Wisconsin-Madison.

    Not to my amazement, I have to say. I was regurgitating these equations 50 years ago.

    Your comment “the air radiates some of this energy back to the surface, keeping it warmer than it would be without the gases” ignores the second law of thermodynamics. Heat cannot pass from a cooler to a warmer body. Check it out. Switch off the fridge and open the door.

    [Response: Oh please. The 2nd law is for *net* transfers. We receive radiation from the big bang at a brightness temperature of about 3K. Your claim would preclude us detecting it because energy can't come from a colder body (deep space) to a warm one. Nonsense. - gavin]

    Robert Essenhigh is one senior engineer who rejects AGW out of hand because the dominant radiation absorber in the atmosphere must be water vapour. He calculates the height at which all the radiation which can be absorbed, will be absorbed, finds this to be relatively close to the surface, and concludes that additional CO2 can make no difference.
    “The absorption coefficient is 1–2 orders of magnitude higher than the coefficient values for the CO2 bands at a concentration of 400 ppm. This would seem to eliminate CO2 and thus provide closure to that argument.”

    [Response: Only because he doesn't consider what happens higher in the atmosphere where the situation is very different. This idea that somehow radiative transfer is completely wrong is belied every time you look at a picture derived from a satellite - if we were so wrong, none of those remotely sensed pictures of water vapour, or air pollution, or IR or aerosols or trace gases or ozone or anything would work. More nonsense. - gavin]

    Chuck Wiese on Roger Pielke’s website reaches the same conclusion:
    “The spectral overlap (C02/H2O) is quite severe, and is precisely why the results (of doubling CO2) don’t produce much change.”

    [Response: Whether 4 W/m2 is "much change" is a matter of opinion. I consider half the forcing associated with the transition from a glacial to an interglacial quite a lot. - gavin]

    Barton Levenson (62) published an elegant paper (The Irrelevance of Saturation) on this site which accepts the premise but refutes the conclusion. He applies conservation of energy across space, an upper and a lower atmospheric layer, and the surface. The upper layer contains negligible water vapour, and is warmed by any additional CO2. Because energy is radiated downwards, the lower layer and the surface have to compensate by increasing their radiation which, via Stefan-Boltzmann, requires warming.
    If the relative absorption in the upper atmosphere increases from 0.5 to 0.6, Barton generates an increase of 3 degrees C at the surface.

    But if the relative absorption is a far more realistic 0.01 (zero water vapour, very low pressure), doubling the CO2 produces a temperature increase of only 0.2 degrees at the surface, which is not really measurable.
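
    For what it is worth, the kind of layered energy balance being argued about here can be sketched directly. This is a generic grey two-layer model, not Barton’s actual calculation, and the emissivities and Earth-like numbers are assumed purely for illustration:

        import numpy as np

        SIGMA = 5.670e-8                   # W m^-2 K^-4
        S_ABS = 1361.0 * (1 - 0.30) / 4.0  # absorbed solar flux, W m^-2 (assumed)

        def surface_temp(e_lower, e_upper):
            # Solve the three energy-balance equations of a grey two-layer atmosphere
            # for x = sigma*Ts^4, y = sigma*T_lower^4, z = sigma*T_upper^4
            A = np.array([
                [1.0 - e_lower, e_lower, -2.0],              # upper-layer balance
                [1.0, -2.0, e_upper],                        # lower-layer balance
                [-1.0, e_lower, (1.0 - e_lower) * e_upper],  # surface balance
            ])
            b = np.array([0.0, 0.0, -S_ABS])
            x, _, _ = np.linalg.solve(A, b)
            return (x / SIGMA) ** 0.25

        # Raising the upper layer's longwave absorption warms the surface
        for e_up in (0.01, 0.5, 0.6):
            print(f"upper-layer emissivity {e_up:4.2f}: Ts = {surface_temp(0.8, e_up):5.1f} K")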

    Elsewhere on the site another of your contributors summarised neatly the (unquantified) “higher is colder” argument – “And as adding greenhouse gases to the atmosphere increases the optical thickness or depth of the medium, it raises the height from which photons escape, and assuming a roughly constant lapse rate this will imply a warmer surface”.
    Possibly, but what evidence is there for a “roughly constant lapse rate”?
    The UAH data from 1978 quotes:

    Lower Troposphere – 1.3 degrees per century
    Mid Troposphere – 0.5 degrees per century

    These figures are not compatible with AGW, particularly since most of the lower troposphere warming took place between December 1999 and January 2002. Current temperatures are back to 1978 levels, themselves the trough of changes since the previous peak, in the forties.

    I am, incidentally, pleased to see that the rise in temperature from the Little Ice Age to the forties is now accepted on the site, without any attribution to AGW.

    [Response: "now"? please find some cites that indicates that our explanations have changed in any major respect. - gavin]

    Comment by Fred Staples — 9 Sep 2008 @ 6:59 AM

  71. “These people, typically senior engineers, get suspicious when experts seem to evade their question.”

    Before I became a statistician I was a chemical engineer and then a biomedical engineer.

    It is perhaps the evasiveness that engineers are responding to. Engineers tend to be distrustful of pat answers. But I would have thought physicists would have been more of a problem, in that they prefer to see an explanation reduced to first principles. (Admittedly much of the physics engineers encounter at university is of this variety.)

    Engineers are not strangers to complexity – although this may vary by discipline. Much of what they do cannot be reduced to first principles and is ultimately empirical. Chemical engineers (and others) often work with dimensionless equations: relationships between quantities that have been experimentally derived and, since they are dimensionless, can be scaled to the situation at hand. Furthermore, engineers often resort to computer models in the design of chemical reactors, structures, circuits etc. They could no more summarise one of their own complex systems in 6 pages than you can summarise AGW.

    I suspect it is a culture clash that is causing the problem, not an incapacity to accept that a simple derivation is all that is required. For most engineers, Gavin’s “The CO2 problem in 6 easy steps” *should be* an excellent start.

    Comment by Bruce Tabor — 9 Sep 2008 @ 7:36 AM

  72. I see that my point (#10) is more or less repeated by other commentators (eg #53 and #58).

    Gavin: you reply to #53: “Why should anyone continue to discuss with you?”

    [Response: It wasn't rhetorical, it was serious. I spend way too much time replying to comments because I do genuinely want people to understand what it is we are talking about and why. But even my time is limited, and so one needs to prioritise. People that just want to declaim, rather than learn, just create noise and responding to their pseudo-questions is of limited use (other than to point out that they rarely make any sense). I judge the worth of engaging with people by their willingness to listen and modify their position. If they don't change at all, why bother? I'm not doing this for my health. - gavin]

    If you really want to communicate, then you better find a way of communicating. If on the other hand you don’t want to communicate, there is little point in replying to comments, really.

    In fact you remind me of those English-speaking tourists arriving back home in frustration, convinced that the locals they visited are brainless idiots, after having shouted, yelled, huffed and puffed to make themselves understood…by people that simply do not speak English.

    If you or Mr Weart want to speak to engineers, or anybody else, then you both better speak in a way that engineers can understand. And if they don’t appear to have understood, you cannot simply jump to saying “why are you people so slow to understand?”…the only sensible option is to see where the miscommunication is (yes, it can be with you too), and to work to fix that.

    I have provided a few suggestions already.

    People do have various degrees of skepticism in the nature and dangers of anthropogenic global warming. How difficult is it to recognize that? If you instead pooh-pooh their thoughts whenever expressed, you will win nobody’s mind. Fine by me, but then what’s a blog for?

    Comment by Maurizio Morabito — 9 Sep 2008 @ 7:44 AM

  73. I forgot to add the most significant scientific reason behind my skepticism at #54: the uncertainty in forcing due to aerosols and ocean heat uptake is almost as large as the forcing from human GHGs itself.

    Every climate model from a simple “single-box” through to global circulation models with multiple layers and interactions has built into the denominator of climate sensitivity the actual forcing – ie the difference between CO2 forcing and the offsets due to aerosols and ocean. Since the offsets are highly uncertain, the denominator of climate sensitivity has error bars that come close to encompassing the origin (zero), which means the climate sensitivity estimates themselves have error bars that approach infinity (in the extreme).

    It is this enormous range in estimated climate sensitivity that is invoked by the likes of Stern and Al Gore (and the alarmist industry in general) to claim that there is a non-negligible probability of total climate catastrophe, and that therefore we must act now to drastically curb CO2 emissions.

    But the reality is much more sanguine. We simply cannot attach a probability to wildly high estimates of climate sensitivity from climate models, borne as they are out of an uncertainty in the denominator.

    To see this, consider an extreme boundary case: there is a small (but non-zero) probability that the net forcing has in fact been zero over the 20th century, once you account for aerosols, ocean heat uptake, clouds, etc. In that case the sensitivity required to reproduce the 20th century temperature increase is infinite. But we know there is zero probability of infinite climate sensitivity (otherwise we wouldn’t be here discussing this), so clearly it is not valid to convert uncertainty in forcing into a probability for climate sensitivity based on the output of climate models.
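
    The division-by-an-uncertain-denominator point can be illustrated with a toy Monte Carlo. All of the numbers below are made up for illustration and are not meant to be the actual 20th-century values:

        import random

        DT_OBS = 0.7          # assumed observed warming, K
        F_GHG = 2.6           # assumed greenhouse-gas forcing, W m^-2
        OFFSET_MEAN = 1.5     # assumed aerosol + ocean-uptake offset, W m^-2
        OFFSET_SD = 0.8       # and its (large) uncertainty

        samples = []
        for _ in range(100000):
            f_net = F_GHG - random.gauss(OFFSET_MEAN, OFFSET_SD)
            if f_net > 0.1:                      # crude cut: discard near-zero or negative net forcing
                samples.append(DT_OBS / f_net)   # implied sensitivity, K per (W m^-2)

        samples.sort()
        print(f"median ~ {samples[len(samples) // 2]:.2f} K/(W m^-2), "
              f"95th percentile ~ {samples[int(0.95 * len(samples))]:.2f}")
        # The long upper tail comes entirely from the near-zero denominators.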

    [Response: You touch on a real point, but you are not correct in your assessment. The issue of the forcing uncertainty is a problem for the twentieth century, and it does mean that high sensitivities cannot be ruled out from considering that period alone (Forest et al, etc). However, this isn't anything to do with climate models. They have very well defined sensitivities since you can impose whatever forcing you want. But even better is to take a situation where the forcing is unambiguously non-zero - the last glacial period is the best bet for this and actually rules out with high probability both the very low and very high numbers (as we have frequently discussed). - gavin]

    Comment by mugwump — 9 Sep 2008 @ 8:12 AM

  74. #34 John Mashey,

    Perhaps some of the difference between these two categories is in the number of components the systems that they deal with have, and how crucial individual components can be.

    Category 1 deals with what could be described as serial systems. A few crucial components have many opportunities to affect the system, and the effects of individual components and processes are not swamped by the effects of other components and processes. Components are often in one of a few discrete states. The system is a dictatorship with the crucial components as dictators.

    Category 2 deals with what could be described as parallel systems. Many components simultaneously affect the system. The effects of any single component can be swamped by the effects of many other components. Thus even if you have made an error about the effects of an individual component, your estimates will usually be fairly robust. Components usually are in one of many states, or the states have continuous values. The outcome is decided by a vote among many components.

    Comment by Lloyd Flack — 9 Sep 2008 @ 8:22 AM

  75. mugwump at #54,

    The last 3 of the 4 reasons for your skepticism do not relate to the technical accuracy of the scientific work on climate change, but rather to the political and emotional context of the debate. By “emotional” I include your own emotional response.

    Surely the issue is whether AGW is a real and present danger. Are the predictions accurate? Are the changes observed in the world’s climate illusory or ephemeral – e.g. the huge loss of Arctic sea ice – or are they part of a dangerous trend?

    Your first point of skepticism concerns over-fitting of models. A valid question. These are not statistical models that are “fit” as such, but rather ones incorporating the physics of the underlying processes. I have no doubt that, despite claims to the contrary, some parameters are “tweaked” at the margins, as the relevant physics is not fully understood. But these are additive processes and for the most part, leaving these out (eg aerosol effects) results in similar effects – Gavin could possibly confirm this. Older simpler models show little deviation from current ones.

    Skepticism becomes denial when you refuse to evaluate your position in light of the available evidence. Your last three points make me wonder if this is the case.

    Comment by Bruce Tabor — 9 Sep 2008 @ 8:28 AM

  76. Gavin at 57,

    Yes, I know; however, I guess it confuses a few who, e.g., don’t blog themselves…

    Comment by Magnus Westerstrand — 9 Sep 2008 @ 8:28 AM

  77. Re 68. Johnny B, about Water vapour, the most powerful GHG, two interesting papers have been published:

    1. Huang, Yi, and V. Ramaswamy, 2008. Observed and simulated seasonal co-variations of outgoing longwave radiation spectrum and surface temperature. Geophys. Res. Lett., 35, L17803, doi:10.1029/2008GL034859, September 4, 2008

    “We analyze the seasonal variations of Outgoing Longwave Radiation (OLR) accompanying the variations in sea surface temperature (SST) from satellite observations and model simulations, focusing on the tropical oceans where the two quantities are strikingly anti-correlated. A spectral perspective of this “super-greenhouse effect” is provided, which demonstrates the roles of water vapor line and continuum absorptions at different altitudes and the influences due to clouds…”

    2. Dessler, Andrew E., P. Yang, J. Lee, J. Solbrig, Z. Zhang, and K. Minschwaner, 2008. An analysis of the dependence of clear-sky top-of-atmosphere outgoing longwave radiation on atmospheric temperature and water vapour. J. Geophys. Res. – Atm., 113, D17102, doi:10.1029/2008JD010137, September 3, 2008

    “We have analyzed observations of clear-sky top-of-atmosphere outgoing longwave radiation (OLR) … We also analyze the sensitivity of OLR to changing surface temperature Ts, atmospheric temperature Ta, and atmospheric water vapor q. We find that OLR is most sensitive to unit changes in Ta when that change occurs in the lower troposphere. For q, the altitude distribution of sensitivity varies between the midlatitudes, subtropics, and the convective region… In the tropical convective region, a rapid increase in q in the midtroposphere leads to a dramatic reduction in OLR with increasing Ts, which has been termed the “super greenhouse effect”.”

    Comment by Timo Hämeranta — 9 Sep 2008 @ 8:32 AM

  78. Re 72. mugwump, about aerosols please see

    Rosenfeld, Daniel, Ulrike Lohmann, Graciela B. Raga, Colin D. O’Dowd, Markku Kulmala, Sandro Fuzzi, Anni Reissell, and Meinrat O. Andreae, 2008. Flood or Drought: How Do Aerosols Affect Precipitation? Science Vol. 321, No 5894, pp. 1309-1313, September 5, 2008

    “Aerosols serve as cloud condensation nuclei (CCN) and thus have a substantial effect on cloud properties and the initiation of precipitation. Large concentrations of human-made aerosols have been reported to both decrease and increase rainfall as a result of their radiative and CCN activities. At one extreme, pristine tropical clouds with low CCN concentrations rain out too quickly to mature into long-lived clouds. On the other hand, heavily polluted clouds evaporate much of their water before precipitation can occur, if they can form at all given the reduced surface heating resulting from the aerosol haze layer…”

    About climate sensitivity please see first

    Kiehl, Jeffrey T., 2007. Twentieth century climate model response and climate sensitivity. Geophys. Res. Lett., 34, L22710, doi:10.1029/2007GL031383, November 28, 2007

    “Climate forcing and climate sensitivity are two key factors in understanding Earth’s climate. There is considerable interest in decreasing our uncertainty in climate sensitivity. This study explores the role of these two factors in climate simulations of the 20th century. It is found that the total anthropogenic forcing for a wide range of climate models differs by a factor of two and that the total forcing is inversely correlated to climate sensitivity. Much of the uncertainty in total anthropogenic forcing derives from a threefold range of uncertainty in the aerosol forcing used (i.e. tuned) in the simulations…

    [20] Finally, the focus of this study has been on anthropogenic forcing. There is also a range of uncertainty in natural forcing factors such as solar irradiance and volcanic aerosol amount. It would be of value to reduce uncertainties in these forcing factors as well.”

    And finally

    Allen, Myles R., and David J. Frame. Call Off the Quest. Science Perspective Vol. 318, No 5850, pp. 582-583, October 26, 2007, online http://www.eci.ox.ac.uk/publications/downloads/frame07-sensitivity.pdf

    “…An upper bound on the climate sensitivity has become the holy grail of climate research. As Roe and Baker point out, it is inherently hard to find. It promises lasting fame and happiness to the finder, but it may not exist and turns out not to be very useful if you do find it. Time to call off the quest.”

    Well, time to call off the quest, and follow the suggestion of the authors:

    “In reality, of course, our descendants will revise their targets in light of the climate changes they actually observe.”

    Comment by Timo Hämeranta — 9 Sep 2008 @ 8:54 AM

  79. RE Gavin @ 73:

    However, this isn’t anything to do with climate models. They have very well defined sensitivities since you can impose whatever forcing you want.

    Unfortunately, those pushing the alarmist agenda are using the higher sensitivity estimates from the models to further their political goals.

    It also goes to the question of how likely we are to overfit when tuning the climate models. When you have knobs on your black box that generate such widely varying behaviour when rotated through relatively plausible ranges, reliable extrapolation becomes problematic.

    But even better is to take a situation where the forcing is unambiguously non-zero – the last glacial period is the best bet for this and actually rules out with high probability both the very low and very high numbers (as we have frequently discussed).

    Fair enough. Do you have a canonical RC reference? (There are a *lot* of posts on RC to wade through).

    Without expecting an inline response here (I can read whatever you point me to), I am particularly curious about the very low numbers – what constitutes “very low” and how they are ruled out (very low for the IPCC seems to be 2C but from my reading I am starting to think that 2C is at the upper end of the plausible range).

    [Response: Start here (more here). - gavin]

    Comment by mugwump — 9 Sep 2008 @ 9:26 AM

  80. #77–

    Sounds like more bad news for the Lindzen/Spencer postulate that water vapor will provide negative feedback, yes?

    Comment by Kevin McKinney — 9 Sep 2008 @ 9:34 AM

  81. You do engineers a disservice; we can understand the science, even if it is not our speciality. Physics is a key subject for engineers.
    What engineering teaches us is to reconcile the theory with the concrete. Many “Climatologists” do not seem to check their hypotheses against the observed data. Take the IPCC chart in AR4 that gives the projection of temperature against CO2. It starts at 280 ppm, and by the time CO2 gets to today’s 385 ppm it indicates the temperature should be in the range +1.0C to +2.2C, but the actual temperature rise is only +0.6C. An engineer would ask, “Why?”
    It looks like the models are wrong. Perhaps you need some engineers to ask the “Why” question.
    Can you explain why the projections do not match the actual data?

    [Response: But they do! (Fig SPM.4). I don't know what other figure you are talking about (reference), but the two factors that might be missing are the lag in the ocean response to forcing, and the net effect of all the forcings (including other GHGs, aerosols and land use etc.). - gavin]
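
    The arithmetic behind that response can be sketched as follows; the 3 K equilibrium sensitivity used here is just an assumed mid-range value:

        import math

        def equilibrium_warming(c_now, c_pre=280.0, sensitivity_per_doubling=3.0):
            # Equilibrium warming for a CO2 change, using the standard logarithmic dependence
            return sensitivity_per_doubling * math.log(c_now / c_pre) / math.log(2.0)

        print(f"Equilibrium warming for 385 ppm: {equilibrium_warming(385.0):.2f} K")
        # Roughly 1.4 K at equilibrium. The warming realized so far is smaller because
        # the ocean has not yet caught up, and because aerosol and land-use forcings
        # offset part of the greenhouse-gas forcing - the two factors noted above.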

    Comment by William — 9 Sep 2008 @ 9:34 AM

  82. Bruce Tabor @ #75:

    I elaborated on the scientific grounds for my skepticism at #73.

    Your first point of skepticism concerns over-fitting of models. A valid question. These are not statistical models that are “fit” as such, but rather ones incorporating the physics of the underlying processes. I have no doubt that, despite claims to the contrary, some parameters are “tweaked” at the margins, as the relevant physics is not fully understood. But these are additive processes and for the most part, leaving these out (eg aerosol effects) results in similar effects – Gavin could possibly confirm this. Older simpler models show little deviation from current ones.

    Leaving out aerosols most assuredly does change the output.

    The models generate widely varying estimates of climate sensitivity from 20th century observations. I always wondered why, given that as you say they are all supposedly modeling the same physical processes. It was only recently that I looked into it sufficiently to realize that you have to divide by net forcing to get sensitivity (either explicitly as in a box-model, or implicitly via the underlying physics in a GCM), and the error bars on net forcing come perilously close to the origin.

    Comment by mugwump — 9 Sep 2008 @ 9:52 AM

  83. Your first point of skepticism concerns over-fitting of models. A valid question. These are not statistical models that are “fit” as such, but rather ones incorporating the physics of the underlying processes.

    Mugwump has been told this before, as recently as a couple of weeks ago on Deltoid, but continues to make the “overfitting claim” regardless.

    The first time is forgivable ignorance. Continuing to repeat it is [edit]

    Comment by dhogaza — 9 Sep 2008 @ 10:13 AM

  84. rxc above wrote wishing for climate models to be as precisely documented as nuclear power plant models.
    I’m sure the climate modelers would like to do that.
    Each piece put into a nuclear power plant is specified in advance and you know what it’s made of.
    I’m sure the climate modelers would like comparable information.

    Suppose you were trying to model a nuclear power plant that you couldn’t take apart?
    And suppose that, as you vary one factor you’ve calculated will make a change in its behavior, its behavior is indeed changing?

    Willy Ley: “analysis is all very well but you can’t tell how a locomotive works by melting it down and analyzing the mess” — aren’t you glad fission plant modelers aren’t often faced with trying to do that? Do you understand why climate modelers have a difficult task? Do you understand why climate modelers raise concerns when they see CO2 increasing and can predict the climate system will be going outside its known safe performance parameters?

    What’s the worst thing that can happen, with either physical system?

    Right. Precautions are appropriate before fiddling with the inputs. Restrain those who would meddle, eh?

    Comment by Hank Roberts — 9 Sep 2008 @ 10:37 AM

  85. mugwump wrote: “Unfortunately, those pushing the alarmist agenda are using the higher sensitivity estimates from the models to further their political goals.”

    With all due respect, that statement — “pushing the alarmist agenda to further political goals” — is a dead giveaway that you are a pseudo-skeptic, someone who concludes that the overwhelming consensus of the scientific community that anthropogenic global warming is real, and dangerous, MUST be wrong because you dislike what you imagine to be the “political” consequences of that reality.

    Comment by SecularAnimist — 9 Sep 2008 @ 10:45 AM

  86. Re 80. Kevin, well, Lindzen and Spencer are not studying water vapor, instead please see first Lindzen’s latest

    Rondanelli, Roberto, and Richard S. Lindzen, 2008. Observed variations in convective precipitation fraction and stratiform area with sea surface temperature. J. Geophys. Res. – Atmos., 113, D16119, doi:10.1029/2008JD010064, August 29, 2008

    “This paper focuses on the relation between local sea surface temperature (SST) and convective precipitation fraction and stratiform rainfall area from radar observations of precipitation…
    Although a dependence on temperature such as the one documented is consistent with an increase in the efficiency of convective precipitation (and therefore consistent with one of the mechanisms invoked to explain the original Iris effect observations) this is but one step in studying the possibility of a climate feedback. Further work is required to clarify the particular mechanism involved.”

    [edit - limit your random quotes to at least peer reviewed papers]

    Comment by Timo Hämeranta — 9 Sep 2008 @ 10:52 AM

  87. One of the difficulties for sceptics (like me) is that, even if the complexities are understood and believed, when the analysis finally gets to the end, the conclusions turn into very simple and exact assertions with absolute certainty. The final simple answers are usually accompanied with words like certainty, absolute, consensus, incontrovertible, irrefutable, undeniable, etc. As an example, Spencer’s commentary, IMO, explained the reality of the science very well (though I agree he unfairly characterizes engineers — but that’s a nit point); then the very first post said, in essence, ‘and so X gives you exactly Y’ (followed by a ‘and don’t give me any back talk’ — again a minor point) as did a number of following posts. It seems like gaining an understanding of the complexity of the science is not desirable per se but just a necessary evil only if the simple conclusions are called into question. To a sceptic this is not very convincing.

    As a bit of mitigation, I have to leave the above as a “difficulty” and can’t quite raise it to a “criticism.” Because: 1) I’m not sure if there is any other realistic and practical way to do it. I mean a scientist can’t just end his/her analysis with a “nobody knows anything”. 2) While scientists in the field are guilty (IMO) of the above, it’s considerably more evident in the scientist wannabes, who are also much less likely to backpedal when called than are the climatologists. 3) Some of my fellow sceptics (by happenstance, not choice) are guilty of the same thing. I don’t have any helpful suggestions to offer so I just have to live with my difficulty. But it is a real difficulty nonetheless.

    Comment by Rod B — 9 Sep 2008 @ 11:04 AM

  88. P.S. There is another factor that I think is not pure science but is nonetheless rational and expected. If a climatologist makes his/her best analysis with no more than a, say, 50-50 probability, but that 50% probability entails disastrous effects bordering on Armageddon, it’s understandable (maybe even necessary) that the scientist will push that scenario. Though this really has no relevance to this discussion…

    [Response: No it isn't. It's appropriate to say what the odds are. (though frankly, I wouldn't get on a plane with 50:50 odds of crashing). - gavin]

    Comment by Rod B — 9 Sep 2008 @ 11:19 AM

  89. John Mashey – I’m just bowled over by your explanation of the ways that different types of engineers look at climatology. My father is a retired ME engineer and I have always felt a huge gulf between us as to our ways of thinking, and this is food for thought. I have copied your comment and emailed it to several people – a friend who is trained as a geologist but is now a database programmer and who doesn’t believe in evolution, a long time friend whose husband is an ME engineer and with whom I have had many conversations about ME’s, and a few others with whom I have discussed how people solve problems differently.

    Comment by TEBB — 9 Sep 2008 @ 11:19 AM

  90. Timo–thanks.

    However, it still seems to me that this anti-correlation noted in the papers you cited would be essentially antagonistic to what Lindzen is trying to show: a negative feedback mechanism involving water vapor and cloud.

    Further enlightenment is welcome… I can’t claim to understand this area well at all…

    Comment by Kevin McKinney — 9 Sep 2008 @ 11:54 AM

  91. Re 88-89. Rod,

    the IPCC correctly states:

    “…the future is inherently uncertain…”

    Ref: IPCC AR4 WG III SPM final sentence

    Re Gavin: the TAR 95 % confidence level was OK, the AR4 90 % isn’t.

    [Response: You think that that TAR was more certain that AR4? Read the reports again. - gavin]

    Comment by Timo Hämeranta — 9 Sep 2008 @ 12:05 PM

  92. “Your first point of skepticism concerns over-fitting of models. A valid question. These are not statistical models that are “fit” as such, but rather ones incorporating the physics of the underlying processes.”

    [edit - no ad homs]

    However, for the record, any model with free parameters that are estimated from data can be considered a statistical model that is “fit” to the data. GCMs have many free parameters, including aerosol content at any point in time, various parameterizations of cloud processes, and many other physical constants that are only known to fairly crude precision. climateprediction.net has a much more comprehensive list if you’re interested.

    Now, GCM modelers don’t necessarily estimate the free parameters by directly inferring them from model behaviour, but they certainly adjust the parameters to get the model output to “fit” the 20th century instrumental record (it could be argued that the fit between models and the 20th century instrumental record is the sine qua non of progress in the field).

    Once you have models with parameters that can be adjusted to fit data, you have the risk of overfitting, regardless of how that fitting takes place.

    [NB, I am not saying any individual climate model does or does not overfit, but the potential is certainly there. The wide range of climate sensitivity estimates produced by the models, and the underlying cause of that wide range (uncertain net forcing in the denominator), should make us very skeptical of the high sensitivity estimates]
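
    The generic risk being described here (nothing specific to GCMs) is easy to demonstrate with an ordinary curve fit: more adjustable parameters improve the in-sample fit but make extrapolation less trustworthy:

        import numpy as np

        rng = np.random.default_rng(0)
        x = np.linspace(0.0, 1.0, 12)
        y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.2, x.size)   # noisy "observations"

        for degree in (3, 9):
            coeffs = np.polyfit(x, y, degree)
            rms = np.sqrt(np.mean((y - np.polyval(coeffs, x)) ** 2))
            print(f"degree {degree}: in-sample RMS = {rms:.3f}, "
                  f"extrapolated value at x = 1.3: {np.polyval(coeffs, 1.3):+.1f}")
        # The higher-degree fit matches the data better in-sample but is far less
        # reliable outside the range of the fitted data.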

    Comment by mugwump — 9 Sep 2008 @ 12:16 PM

  93. Previously, if I’d encountered `people who are looking for a straightforward calculation of the global warming that greenhouse gas emissions will bring. What are the physics equations and data on gases that predict just how far the temperature will rise?’, then I would have been given to understand that they didn’t want two significant figure, or even one significant figure, precision – just the right order of magnitude. Therefore, I would have happily pointed them at a model without feedbacks or super-accurate absorption spectra – perhaps Arrhenius (1896, Philos. Mag. fifth series 41(251): 237-276) – making clear to them the approximate nature of the quantitative results, of course. Having read Spencer’s post, I’m worried I might have been taking the wrong approach. What do folks think?
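
    For that kind of questioner, a one-significant-figure, no-feedback estimate can be written down directly; the logarithmic forcing fit and the Planck-response denominator below are the standard simplified expressions, used here purely to get the order of magnitude:

        import math

        SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
        T_EFF = 255.0      # effective radiating temperature of the Earth, K (approximate)

        def no_feedback_warming(co2_ratio):
            forcing = 5.35 * math.log(co2_ratio)    # W m^-2, simplified fit for CO2
            planck = 4.0 * SIGMA * T_EFF ** 3       # W m^-2 per K of blackbody response
            return forcing / planck

        print(f"No-feedback warming for doubled CO2: ~{no_feedback_warming(2.0):.1f} K")
        # About 1 K; feedbacks (water vapour, ice-albedo, clouds) are what move the
        # full answer toward the 2-4.5 K range discussed elsewhere in this thread.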

    Comment by danh — 9 Sep 2008 @ 12:21 PM

  94. Hi all. I’m surprised that anyone still believes that the Earth’s behaviour could be reduced to an elegant (?) algorithm. It’s more of a chaotic system than a static one, even though on our time scale it may be perceived as rather stable and, ah, unchanging. Still, ‘weather’ is very much influenced from all over the Earth, acting in three dimensions (+ time). Consider a hurricane’s creation, development and path. Can anyone give me the ‘pinpointing’ algorithm for that?

    When we go down to quantum mechanics nothing seems to exist and in the end everything becomes probabilities. And when we try to ‘slice up’ weather, the best we will get is isolated approximate answers mostly relating to those specific parameters we defined it by. The ‘world’ seems to me more than just its parts, so we definitely need complex simulations wherein we put all those patterns and what we think binds them together.

    Maybe we will find some good algorithms, but I don’t think it will be before we have created simulations that are proved to ‘work out’. Math is a science as well as a ‘hermetic’ knowledge, but this thing about weather is non-linear math, not linear as far as I see it.

    “Non-linear equations can be extremely difficult, and are often insoluble. For example the long-term behaviour of three bodies moving under gravity cannot be solved mathematically, though we can work a long way into the future. Unfortunately, non-linear, partial differential equations are exactly the equations that describe most real-life situations, like weather systems, frictional or turbulent motion, and so on. Before computers came along these equations were handled by making sweeping simplifications – the trick was to know which simplifications were justified. Although we still can’t SOLVE the equations we can run computer models which give a very good idea of what will actually happen.”

    the quote from http://mathforum.org/library/drmath/view/53603.html

    Comment by Yoron — 9 Sep 2008 @ 12:32 PM

  95. RE #85,

    SecularAnimist, higher estimates of climate sensitivity from GCMs and the 20th century instrumental record are suspect for the reasons I gave. And there are a lot of prominent people who are using those higher estimates to push their policy agenda. It’s just a fact. I don’t see how that observation makes me a “pseudo skeptic”.

    Anyway, I won’t respond to any more personal questioning of my motives. This is already degenerating. I’d rather spend my time investigating how the lower bounds on climate sensitivity are derived from the last glacial – starting with the links gavin kindly provided in #79.

    Comment by mugwump — 9 Sep 2008 @ 12:32 PM

  96. Rod, I believe that I have recommended this essay to you in the past:
    http://ptonline.aip.org/journals/doc/PHTOAD-ft/vol_60/iss_1/8_1.shtml

    However, you bring up an interesting point – how do we wend our way through all the caveats to “scientific certainty”? You are no doubt familiar with Sherlock Holmes stating (in many different stories) some version resembling: “How often have I said to you that when you have eliminated the impossible, whatever remains, however improbable, must be the truth?” (Note: this one’s from “The Sign of the Four”.) That really is how the whole scientific consensus thing works – eventually an idea just becomes so central to progress in understanding that the science becomes unthinkable without it. Put another way, if you reject that idea, you won’t get many publications, because your ideas won’t advance understanding of the science.
    However, not every aspect of a complicated theory will achieve this level of acceptance at the same time. Some aspects may remain uncertain or even controversial even as others are incontrovertible. When you accept that some aspects of the theory are incontrovertible, then regardless of the uncertainties, you may be able to draw some conclusions based only on the “certain” part of the theory.
    CO2 greenhouse warming has some very special properties. The well-mixed nature of CO2 means that it acts as a ghg up to very high in the troposphere. The long life in the atmosphere means that it will keep on giving for a very long time. What is more, there is nothing in our understanding to suggest that the physics changes appreciably up to very high CO2 levels.

    So it is a virtual certainty that increasing CO2 will change the climate, and since the climate affects all aspects of human civilization, it is also a virtual certainty that we need to be concerned.

    Comment by Ray Ladbury — 9 Sep 2008 @ 12:33 PM

  97. Climate science is tough, not simple. But I cannot get away from the suspicion that RC et al. want to have it complicated. This is because effects (warming) are desperately being linked to the wrong cause (CO2) instead of a more realistic cause (GCR). At best this is due to normal scientific progress; at worst, to much vested interest coupled with political agendas and bias.

    It is my strong opinion that once cause and effect are put together, free from anything other than science, it will all fall out beautifully.

    [Response: There is no trend in GCR. It's really that simple. - gavin]

    Comment by Magnus — 9 Sep 2008 @ 1:21 PM

  98. Gavin,
    sorry for my short version in #47 in the previous comment, but I don’t think it is that easy.
    This is how Ray Pierrehumbert in his climate book describes the greenhouse effect:
    “…In a nutshell, then, here is how the greenhouse effect works: From the requirement of energy balance, the absorbed solar radiation determines the effective blackbody radiating temperature Trad. This is not the surface temperature; it is instead the temperature encountered at some pressure level in the atmosphere prad, which characterizes the infrared opacity of the atmosphere, specifically the typical altitude from which infrared photons escape to space. The pressure prad is determined by the greenhouse gas concentration of the atmosphere. The surface temperature is determined by starting at the fixed temperature Trad and extrapolating from prad to the surface pressure ps using the atmosphere’s lapse rate, which is approximately governed by the appropriate adiabat. Since temperature decreases with altitude over much of the depth of a typical atmosphere, the surface temperature so obtained is typically greater than Trad, as illustrated in Figure 3.6. Increasing the concentration of a greenhouse gas decreases prad, and therefore increases the surface temperature because temperature is extrapolated from Trad over a greater pressure range. It is very important to recognize that greenhouse warming relies on the decrease of atmospheric temperature with height, which is generally due to the adiabatic profile established by convection. The greenhouse effect works by allowing a planet to radiate at a temperature colder than the surface, but for this to be possible, there must be some cold air aloft for the greenhouse gas to work with.
    For an atmosphere whose temperature profile is given by the dry adiabat, the surface temperature is
    Ts = (ps/prad)^(R/cp) * Trad. (3.8)
    With this formula, the Earth’s present surface temperature can be explained by taking prad/ps = 0.67, whence prad ≈ 670 mb. Earth’s actual radiating pressure is somewhat lower than this estimate, because the atmospheric temperature decays less strongly with height than the dry adiabat.”
    Here is the link to the English version of Thieme’s discussion point:
    http://freenet-homepage.de/klima/indexe.htm
    I think Thieme questions mostly the back-radiation; everything else looks similar to Ray Pierrehumbert’s version.

    [Response: The adiabatic equation is fine. But the greenhouse effect determines what prad is. Thieme effectively just fixes it to the 'right' number without acknowledging that this would be different with different amounts of greenhouse gases. - gavin]
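
    The quoted formula is simple enough to evaluate directly; here is a small sketch (the R/cp value and radiating temperature are the usual approximate numbers for dry air and the present Earth):

        R_OVER_CP = 0.286   # R/cp for dry air
        T_RAD = 255.0       # effective radiating temperature, K (approximate)

        def surface_temperature(p_rad_over_p_s):
            # Equation (3.8): extrapolate from the radiating level down to the surface
            # along the dry adiabat, Ts = (ps/prad)^(R/cp) * Trad
            return (1.0 / p_rad_over_p_s) ** R_OVER_CP * T_RAD

        print(f"prad/ps = 0.67 -> Ts = {surface_temperature(0.67):.0f} K")
        # More greenhouse gas raises the radiating level (smaller prad/ps), so the same
        # Trad extrapolates to a warmer surface - which is the point of the response above:
        print(f"prad/ps = 0.60 -> Ts = {surface_temperature(0.60):.0f} K")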

    Comment by Guenter Hess — 9 Sep 2008 @ 1:28 PM

  99. OK, I followed gavin’s links in #79. Haven’t got to the glacial-derived lower bounds yet, but did get sidetracked by this: The certainty of uncertainty

    That’s an RC article from last October that discusses this Science paper by Roe and Baker (RB): “Why is Climate Sensitivity so Unpredictable”, also published last October.

    In short, RB model the “climate feedback factor” f as a gaussian, and show that the climate sensitivity, which is proportional to 1 / (1-f), has a huge upper tail. Well, of course it does. Under their distribution f=1 has a non-zero probability which means there is a non-zero probability of infinite sensitivity.

    But this probability of high sensitivity is meaningless. Obviously, very large sensitivities are ruled out by the data (at least one person here is still alive). So at best they should be using a truncated gaussian. But even then, the distribution of f only makes sense if it implies a reasonable distribution on climate sensitivity, given everything we know. Whatever the individual uncertainties in various 20th century forcings, the range of plausible sensitivities is constrained by historical evidence from volcanic eruptions, changes in solar forcing, etc.

    In other words, it is not valid to use a large uncertainty in f to infer a long tail for climate sensitivity. At best you can say “we don’t know what the likelihood of high climate sensitivity is”. You can’t say “we’re confident the probability of high climate sensitivity is non-negligible”.
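
    The shape argument itself is easy to reproduce; the feedback-factor mean and spread below are made-up illustrative numbers, not Roe and Baker’s:

        import random

        F_MEAN, F_SD = 0.65, 0.13   # assumed feedback factor distribution (illustrative)
        S0 = 1.2                    # approximate no-feedback sensitivity, K per doubling

        samples = []
        for _ in range(200000):
            f = random.gauss(F_MEAN, F_SD)
            if f < 1.0:                          # f >= 1 would mean runaway (infinite sensitivity)
                samples.append(S0 / (1.0 - f))

        samples.sort()
        print(f"median ~ {samples[len(samples) // 2]:.1f} K, "
              f"95th percentile ~ {samples[int(0.95 * len(samples))]:.1f} K")
        # A modest, symmetric uncertainty in f becomes a strongly skewed, long-tailed
        # distribution for S = S0 / (1 - f); that skew is what is being argued about here.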

    So, what does RC say about the matter?

    What will get the most discussion in the popular press, of course, are the policy implications of Roe and Baker’s paper.

    For example, 450 ppm is an oft-cited threshold since this keeps deltaT below 2°C using standard climate sensitivities. But the skewed nature of the distribution of possible sensitivities means that it is much more likely that 450 ppm will give us more than 4.5°C of global warming rather than less than 2°.

    Read that last sentence carefully. Rather than saying “we don’t know what the probability of high climate sensitivity is”, they’re claiming “we’re confident the probability of high climate sensitivity is non-negligible”. And they want policy to reflect that.

    Wrong.

    [Response: You need to go a little further in your reading. We have never claimed here that these really high sensitivities are plausible. For instance here. You are welcome to criticise people for what they say, but do not attribute statements to me that I have not made, nor opinions I do not hold. I am very clearly on record as saying that the IPCC range (2 to 4.5 deg C) is what people should mostly be focussed on. - gavin]

    Comment by mugwump — 9 Sep 2008 @ 1:53 PM

  100. Mr. Mashey: a list of your favorite science books for non-science (i.e. liberal arts majors who don’t work in science but find it interesting) would be wonderful to have. I have a lot of good evolution and biology books, but none on global warming because I’m afraid of accidentally getting something outside the climatologists’ consensus.

    I enjoy reading realclimate but have no understanding of most of the concepts like forcing. I’m at a very basic level of understanding global warming – just the feedback effect of melting ice means less reflection of sun energy which speeds up melting of ice.

    Comment by TEBB — 9 Sep 2008 @ 2:03 PM

  101. re: #46 Wonderer

    Recall that I was *objecting* to the “senior engineers” generalization. Of course many have no problem sorting things out. My issue was to consider the different sorts of reasons why someone with some reasonable technical background doesn’t. We know about ideological and economic reasons, but I think there is this issue where:

    - Someone has a good technical background in one area
    - Over-generalizes from their specific area
    - Without understanding the differences

    Over-generalization is a common error we all have to fight against, but in particular, it tends to inhibit deeper insight, in this case, about who believes what, and why. One thing is clear: in explaining one’s own technical domain to somebody else, it *is* very beneficial to understand what they know, and then explain at the right level and ideally with examples that they can find relevant, and this is true whether the listener is technical or not. It’s also not easy, especially in cases when people simply do *not* want to learn.

    re: #58 rxc

    Again, I identified the issue that people overgeneralize from their own domains; you’ve just repeated that. The extraordinary claim is that all software has to be done the same way to be useful…

    I assume everyone knows that modern airplane design depends heavily on computers. Therefore, no one should *ever* fly until they’ve gotten complete disclosures of all source code, test cases, model runs, etc from Boeing, MacNeal-Schwendler, etc. Is that your conclusion? If so, the same applies to modern automobiles, especially given that the bulk of crash tests are no longer done in the real world, but in computers. So, don’t drive, either.

    re: #74 Lloyd Flack

    Thanks, I think the comment about discrete states versus continuous is especially useful.

    re: #89 TEBB

    Thanks. It isn’t just engineers looking at climate science, it’s the issue of someone with competence in one area over-generalizing what they know into another area. Interdisciplinary work is exceptionally productive, but usually, people who do a good job of it take time to learn a new domain, and then maybe bring their old expertise with them. (In the Great Wall of Science analogy I sometimes use, good interdisciplinary work means building bridges from one part to another, not rushing over to another piece with dynamite and trying to blow it up.)

    Recall that we’ve just had an example, Once More Into the Bray, where the impetus seems to have come from a nuclear physicist & APS Fellow (Gerald Marsh) who’s on an anti-AGW crusade…

    So, it’s not just engineers :-)

    Comment by John Mashey — 9 Sep 2008 @ 2:23 PM

  102. Here in South Africa there’s a global warming skeptic who often makes his views known through his column for a local publication engineering news. I assumed that the scientific evidence for anthropogenic global warming was compelling and so I asked him for references that supported his view. I’ve been following the links, trying to look past the ad hominem attacks that appear to be an unfortunate characteristic of most blogs I have visited (from both “sides” of the debate) to see if there is indeed compelling support for anthropogenic global warming. I’m afraid I now find that I have done a 180 turn and now, if I have to label myself, put myself in the camp of the skeptics. The models may be (presumably are) built on good science but they are after all only models of a notoriously complex non-linear system with many assumptions not yet validated by actual experimental data. The models will only be validated by actual evidence from the future – until then they remain hypothetical. And as Douglass has shown the predictions of the models of an increase in the temperature of the upper layers of the atmosphere are not supported by the empirical evidence. i.e. the models fail to deliver. And how is the medieval warming from 1100 to 1250 rationalized? And the post-2003 ice core data that show (apparently – I have not seen the original references) no direct correlation between CO2 level changes and global temperature changes? The immediate reality of peak oil and the energy crisis is a much greater threat to our future well-being, IMHO, than the notion of extreme anthropogenic global warming purely based on unproven computer models.

    [Response: Douglass et al. are wrong on multiple levels and the data is starting to show what the models expect. Why do you think attribution of medieval warming is important for attribution of today's warming (equivalently, why is working out why a forest burned 1000 years ago needed to work out why it burned this year, particularly when you saw the perpetrator walking away carrying a can of gas and some matches?). Your reference to 'post 2003 ice cores' not showing a connection is nonsense. I suggest you broaden your reading (start with the IPCC report - linked on the right). - gavin]

    Comment by Hugh Laue — 9 Sep 2008 @ 2:56 PM

  103. [edit]

    RE gavin at #99:

    We have never claimed here that these really high sensitivities are plausible.

    If by “we” you mean realclimate, then yes, you have made such a claim. This is a direct quote:

    But the skewed nature of the distribution of possible sensitivities means that it is much more likely that 450 ppm will give us more than 4.5°C of global warming rather than less than 2°.

    I don’t see how to read that as anything other than a claim that the sensitivity distribution of Roe and Baker has objective validity.

    You are welcome to criticise people for what they say, but do not attribute statements to me that I have not made, nor opinions I do not hold.

    I did not attribute the above statement to you personally. The linked RC article was authored by “group” – if you disagree with it then fine.

    [Response: You misunderstand both me and that statement. The statement you quote does not mean that more than 4.5 deg C is likely; it just says that it is more likely than less than 2 - both probabilities are small. - gavin]

    Comment by mugwump — 9 Sep 2008 @ 4:13 PM

  104. Re #95
    “I don’t see how that observation makes me a “pseudo skeptic”.”

    You remain wrong, however.

    Your insistence that you are not incorrect leads you to act in a manner appropriate to such a skeptic. You can insist you aren’t one, but you are still acting in accordance with what such a mentality would produce.

    Comment by Mark — 9 Sep 2008 @ 4:14 PM

  105. “Now, GCM modelers don’t necessarily estimate the free parameters by directly inferring them from model behaviour”

    Says one thing: some of the free parameters are from microclimate modelling and other precise simulation techniques. Then you segue into

    “but they certainly adjust the parameters to get the model output to “fit” the 20th century instrumental record”

    which doesn’t allow any parameters to be what, earlier, they were allowed to be: extracted from simulation.

    Comment by Mark — 9 Sep 2008 @ 4:17 PM

  106. Re: 92
    I think the problem with your self-educative approach is that you have let your skeptic dictate what questions you investigate. It’s not just “all about the models,” as one skeptic of my acquaintance proclaims. 4 things we know NOT involving models:

    1) The warming trend is clear and well-documented (see for instance the 2007 Climate Report on the National Climate Data Center site);
    2) CO2 et al. have been demonstrated to absorb IR as advertised (as discussed on this site);
    3) Isotopic studies confirm that the added atmospheric CO2 is of fossil (i.e., human-mediated) origin;
    4) Satellite data clearly show the stratosphere to be cooling; inexplicable except as a result of the greenhouse effect. (An “Inconvenient Truth” I have never seen show up on a denialist blog yet.)

    Comment by Kevin McKinney — 9 Sep 2008 @ 4:20 PM

  107. There are, to be sure, no quick and easy answers concerning future climates. One needs only to look at figure 3-26 in Ruddiman’s “Earth’s Climate: Past and Future”, showing the interactions involved in a 3-D GCM, to gain a sense of the complexity, and then add a time component to track these grid boxes over time.
    Yet, we learn to crawl before we walk. Which is why a post here about a year and a half ago,
    http://www.realclimate.org/index.php/archives/2007/04/learning-from-a-simple-model/
    though the assumptions do not quantitatively apply to the real world, is still a very useful introduction and will point anyone who wants to learn more about the physics behind climate modeling (including an engineer like myself) in the needed directions.

    Comment by Lawrence Brown — 9 Sep 2008 @ 4:37 PM

  108. Dear Gavin, Kevin, thanks for your considered responses. I have not been looking at just models but also trying to get a sense of the reliability of the data upon which the models are based. Anyway it is clear that I have much more research to do (including reading the IPCC report). I’m not sure I understand the response to my comment on the warm medieval period. Can you direct me to any papers that explain this in support of the GHG warming hypothesis? In other words, do any of the current models explain/fit the proxy data of the last 2000 years or however far back it goes?

    Comment by Hugh Laue — 9 Sep 2008 @ 5:05 PM

  109. What you do not state in your article is the level of inaccuracy and unreliability found in complex modeling. Very small adjustments to individual assumptions may have cascading and material effects on the outcomes. Further, it is too easy for scientists (regardless of how objective they believe they are, they are still subject to human frailties) to input biases that ultimately substantiate preconceived hypotheses. A common fallacy is to accept without question input assumptions under which the model behaves as predicted, but to thoroughly question input assumptions that produce contradictory results. The result is model bias.

    Comment by bsneath — 9 Sep 2008 @ 5:33 PM

  110. Hugh Laue (108) — There is now an Antarctic ice core record extending back about 800,000 years, I believe. However, the longest GCM run that I know about was done in Japan, providing a decent fit from 125,000 years ago to around 8,000 years ago; that’s from interglacial 2, the Eemian to interglacial 1, the Holocene climatic optimum.

    Regarding reading, on the side bar are listed several books (but I would also add two by W.F. Ruddiman: “Earth’s Climate, Past and Future” and his popular “Plows, Plagues and Petroleum”). I’ve read several of these, and I still find plowing through the IPCC AR4 tough going.

    Comment by David B. Benson — 9 Sep 2008 @ 5:52 PM

  111. Gavin: so, what graduate-level textbooks are there that discuss the Manabe-Wetherald paper and 1-dimensional climate models in detail? I’ve already got a background in astronomy and physics, so I understand the modeling of stellar interiors and should be able to deal with the physics (although modeling convection may cause me to completely give up any hope of replicating Manabe-Wetherald, but I’d at least like to do the reading up until the point where I make that decision).

    [Response: David Archer's book, or Ray Pierrehumbert's draft are both good starts. Houghton's Physics of Atmospheres would be the next step up. - gavin]

    Comment by Lamont — 9 Sep 2008 @ 6:01 PM

  112. This thread seems to have been overtaken by several trolls.

    Gavin, as a regular Realclimate reader (though rarely posting), I admire your patience in responding to the same contrarian arguments and fundamental misunderstandings over and over again. Your work on this website is very useful for public education even if sometimes you feel you are stuck in a scene from the movie Groundhog Day.

    [Response: Sometimes even Kafka. - gavin]

    Comment by Chris McGrath — 9 Sep 2008 @ 6:17 PM

  113. @ #106

    “4) Satellite data clearly show the stratosphere to be cooling; inexplicable except as a result of the greenhouse effect. (An “Inconvenient Truth” I have never seen show up on a denialist blog yet.)”

    Try here:

    http://rankexploits.com/musings/2008/stratospheric-temperatures-bleg/

    [Response: Pretty sure lucia would bridle at your assumption. - gavin]

    Comment by woodentop — 9 Sep 2008 @ 6:42 PM

  114. re: #100 TEBB
    re: books
    I’d hardly claim to offer a definitive list of such, but you might look at that “How to Learn About Science (Great Wall of Science)” post I mentioned.

    I think you are looking for Category B & C books in the bibliography included. Search for this section:

    “Critical Thinking:

    For general defense against disinformation of various sorts: BES2001, CAP1987, HUF1954, JON1995, KUR2001, MON1991, PAU1998, TUF1983.

    Scientists can believe strange things and stick with them: ARP1998, EHR2001, EHR2003

    Many people can believe really strange things FRA1986, GAR1981, GAR2000, PLA2002, RAN1986, SCH1994, some of which the originators believe, and some of which are hoaxes. Some retain belief even after the hoaxers show them how they did it.

    Starting from Scratch on Climate Science (B & C)

    If I had to pick one book to read, it would be RUD2005.

    Useful popular books are GOR2006, MAN2008, REV2006. Normally, I wouldn’t recommend a politician’s description of climate science, but in this case, it’s a well-presented, mostly-accurate equivalent of talks by many climate scientists. MAN2008 is a nice recent addition.

    One might go on to GRA1997.

    At some point, one should learn more of the history of this topic, via WEA2003, or through the first half of Naomi Oreskes’s video “The American Denial of Global Warming.” mentioned above. Many key basics of climate science are actually quite old.”

    Hope that helps.

    Comment by John Mashey — 9 Sep 2008 @ 6:45 PM

  115. Are not many of the parameters in the models statistically estimated? If so, how can there be sufficient data of sufficient quality both to estimate the models and to validate them?

    An additional question: if all the models are derived from the fundamental physics and are modeling the same earth climate system, why are they all different? This would imply that they are all incompletely specified, wrongly specified, or introducing arbitrary elements not derived from the science.

    [Response: Read this - it might help. - gavin]

    Comment by Chris Maddigan — 9 Sep 2008 @ 6:47 PM

  116. Wonderful to see Dave Letterman deliver his Simple Statement on global warming. Who knows better than a professional entertainer how to make a snappy presentation?

    Short 3 min video:
    http://www.huffingtonpost.com/2008/09/09/david-lettermans-global-w_n_125090.html

    Here is the CBS description of words he delivered:
    ( snipped to subject of AGW )

    ACT 2:
    Dave admits to being a little late getting on the global warming/climate change band wagon. When he first heard about it, he figured it was just a bunch of tree huggers making noise. But now, Dave’s eyes are wide open. Have we ever had this many hurricanes? Something is hinky. Our weather is going all screwy. And just a half hour ago, Dave learned a big huge ice chunk the size of Rhode Island just fell off the South Pole. People are finally now trying to do their part to stop our killing of the planet, but their efforts are meager and won’t amount to a hill of beans. People proudly say things like they are reusing party toothpicks to conserve. But it’s too late. We need to get the over abundance of carbon dioxide out of the atmosphere. And why isn’t anything being done? Because since 1980 we have had no leadership, no Republican, no Democrat has stepped forward and taken the lead. JFK inspired us to get to the moon in ten years. Today, every politician is afraid to talk about climate change because they don’t want to hurt the feelings of their big oil buddies. Dave cries out, “It’s too late! We are dead meat!”
    Dave exhales, then says, “Alright, let’s try to have some fun now.” But Dave can’t let go. He says climate change is no longer part of the Republican platform. “We are so screwed!” Dave says that if everyone in the world stopped driving and started riding bicycles, EVERYONE IN THE WORLD TODAY, if everyone stopped driving, due to the carbon buildup in our atmosphere, the planet would still continue to heat for the next 60 years. We are so screwed.

    ….
    ACT 3:
    Dave continues about climate change and how the polar bear will soon disappear. Dave exclaims, “In 6 years there will be no ice left on the peaks of the Rocky Mountains!”
    Paul, bewildered, asks, “Dave, what is this ‘Polar Bear’?” A crushed Dave mentions, “Ladies and gentlemen, this is sad. Paul Shaffer doesn’t remember polar bears.” And now, due to the global warming and 68-degree Februarys, organisms are coming up from the earth and eating and destroying things. The fly beetle worms come up from the ground and are eating pine trees. They are not supposed to do that. They are supposed to freeze to death in the deep freeze of winter, but there are no longer deep freezes. We are all screwed.

    ACT 6:
    THOMAS L. FRIEDMAN: the 3-time Pulitzer Prize winner has a new book, “Hot, Flat, and Crowded,” all about climate change, the earth, and what we are doing about it.
    Dave wants to know, “So, how are we doing?” Thomas says we are falling behind the curve. The weather is going to get weird; the hot will get hotter, the cold colder, the wet wetter, and the droughts longer. Worst case scenarios that were predicted for the year 2050 have now been accelerated to be here in 2012! The question is why isn’t someone in political power doing something about it? Where is this person? Friedman says the chant at the Republican Convention of “Drill, baby, drill” certainly isn’t the answer. Friedman says we are on the eve of an energy revolution and our Republican leaders are crying “Drill, baby, drill.” It would be like being on the edge of the internet revolution 20 years ago and chanting, “IBM Selectric Typewriters! IBM Selectric Typewriters!” We need to move beyond our 20th Century thinking. What we should be chanting is “Invent, baby, invent.” Friedman is not against new drilling, but we need to invest in renewables. His book, “Hot, Flat, and Crowded”:
    “Hot” refers to global warming.
    “Flat” refers to the rise of the middle class all over the world.
    “Crowded” refers to the rapid population growth. The earth will be home to another billion people in 12 years.
    Energy Technology (ET) is the next great revolution. Whatever country owns that industry will be the richest and most powerful. Energy Technology will create new renewable energy and nation building. The United States has to get involved all the way to lead the world in this. I think I mentioned this last time Friedman was here, or maybe it was Robert F. Kennedy Jr . . . . . to make energy technology really appealing, we have to stress the money that can be made from this. It’s the financial interests that will get the ball rolling. People pretend they are interested in saving the Earth, but what they really want is money and a second home on the beach. Once it is realized how much money is to be made in new energy and how it will make all Americans richer, then we will get behind it.
    Thomas Friedman – always interesting, and speaks a language we can all understand.
    “Hot, Flat, and Crowded” – it’s in stores now, and if you turn on your TV right now, you’ll probably see him. He’s all over the place this week.
    “Invent, baby, invent!”

    And that was our show for Monday September 8, 2008.
    http://lateshow.cbs.com/latenight/lateshow/wahoo/index/php/20080908.phtml

    Comment by Richard Pauli — 9 Sep 2008 @ 8:03 PM

  117. RE #103:

    [... The statement you quote does not mean that more 4.5 deg C is likely, it just says that it is more likely than less than 2 - both numbers are small. - gavin]

    Actually, the statement by RC says 4.5C is “much more likely” than 2C (not just more likely), and it also says “…the skewed nature of the distribution of possible sensitivities…”, which would imply that RC endorses the functional form of Roe and Baker’s distribution (the skewed sensitivity distribution is, after all, the whole point of the paper).

    In fact, according to Roe and Baker, the expected sensitivity is infinite, so the probability that it is greater than 4.5C is 1.

    [Response: Well, since that is obviously nonsense perhaps you should go and read the paper again. - gavin]

    Comment by mugwump — 9 Sep 2008 @ 8:16 PM

  118. Actually, the probability that the sensitivity is greater than 4.5C is not unity under Roe and Baker (my mistake), but the expected sensitivity is infinite, and the greater than 4.5C probability is around 10 times the less than 2C probability.

    Comment by mugwump — 9 Sep 2008 @ 8:40 PM

  119. Gavin, I have a question:
    According to Figure 4.12 from Ray’s book,
    http://geosci.uchicago.edu/~rtp1/ClimateBook/ClimateVol1.pdf
    Eye-balling the general slope of the absorption function, doubling of CO2 leads to a shift in optical thickness boundary of the main CO2 absorption band by about 7 cm-1 (the other edge is covered by water vapor, so I neglect it), correct? The OLR flux at the edge is about 0.3 W/m2/cm-1, according to upper picture. Therefore, the CO2 doubling reduces the flux through atmospheric window by 0.3*7= 2 W/m2. Would it be a fair order-of-magnitude estimation of 2xCO2 “radiative forcing”?
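
    For what it is worth, here is a rough sketch of that back-of-the-envelope arithmetic next to the commonly quoted simplified expression for CO2 forcing, 5.35*ln(C/C0) W/m2 (Myhre et al. 1998); the band-edge numbers below are just the eyeballed values above, not measurements:

        import math

        band_shift = 7.0    # cm^-1 widening of the CO2 band edge per doubling (eyeballed)
        olr_at_edge = 0.3   # W m^-2 per cm^-1 of OLR near the band edge (eyeballed)

        rough_forcing = band_shift * olr_at_edge   # ~2.1 W m^-2
        simplified = 5.35 * math.log(2.0)          # ~3.7 W m^-2, simplified expression
        print(rough_forcing, simplified)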

    [Response: No. The forcing is about 4 W/m2 calculated much more accurately. -gavin]

    Comment by Al Tekhasski — 9 Sep 2008 @ 8:43 PM

  120. I’ve been following the AGW debate for a few years and have concluded there is a great deal of reasonable doubt about the suggestion that increasing/decreasing CO2 concentration by humans has a high impact on climate.

    As a result I would not support expensive actions to manage CO2. I believe my view is shared by an ever growing number of people world wide.

    [Response: Believe what you want. Doesn't make it true. - gavin]

    Comment by Hudibras — 9 Sep 2008 @ 9:26 PM

  121. mugwump,

    Where in Roe and Baker did you get the impression that they say the sensitivity is greater than 4.5 C? At best, you can argue that a sensitivity greater than 4.5 C is unlikely, but the possibility is not zero.

    Comment by Chris Colose — 9 Sep 2008 @ 9:40 PM

  122. Please delete last post (got cut off from symbol use)

    mugwump,

    your interpretation is still incorrect. The temperature response is the product of the sensitivity and the forcing (neglecting efficacy), so if one of those terms goes to infinity, then the temperature goes to infinity.

    What you may mean is that you can model the climate response as an infinite power series (e.g., 1 + f + f^2 + f^3 + …, which sums to 1/(1-f)) where f is the feedback factor and is between zero and one. Thus, a small change in ‘f’ produces a larger change in the overall response, which is why we may not see a significant reduction in the 2 to 4.5 C range.
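
    A minimal sketch of that amplification (illustrative numbers only; the no-feedback sensitivity S0 is an assumed value here):

        # Sensitivity S = S0 / (1 - f): the same small shift in f matters far more
        # when f is already close to 1 than when it is small.
        S0 = 1.2  # assumed no-feedback sensitivity, deg C per doubling
        for f in (0.20, 0.25, 0.65, 0.70, 0.75):
            partial_sum = sum(f**n for n in range(200))   # geometric series, effectively converged
            print(f, round(partial_sum, 3), round(S0 / (1.0 - f), 2))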

    Comment by Chris Colose — 9 Sep 2008 @ 9:48 PM

  123. @101 Your response to rxc is nonsense. He certainly was not claiming that he requires the code from Boeing. I find it hard to believe you don’t know this. I expect all the models used to make an airplane to be rigorously documented, and I trust that they are. I don’t need to see that documentation because the FAA has experts to examine them and Boeing has lawyers that understand liability laws. I don’t expect to be supplied those documents because, when Boeing is asked by authorities, they don’t do any of the following:

    1) Averaging models. If Boeing said that we used two models to make this airplane: one predicted a crash by a hard turn to the left and the other predicted a crash by a hard turn to the right so on average it flies straight, I wouldn’t get on the plane. I’ve never heard of the notion that averaging models gives an answer close to the truth is “confirmation.” It generally means one or both models are wrong.

    [Response: BS. Every weather or hurricane forecast is from an ensemble. Any conclusion from a complex turbulent flow is drawn from an ensemble. - gavin]

    2) The general tone of the AGW people is exceedingly shrill. The world will end and the science is settled. There’s consensus, etc… Shrill is something many senior engineers have seen coming out of the mouths of people who were obviously – and later proven to be – wrong. Quite often.

    [Response: The tone of people like you misrepresenting the science is extremely arrogant. That's definitely a sign that something's wrong and that their opinions are rooted in politics. Quite often. - gavin]

    3) I haven’t seen it recently, but I have seen plenty of gibberish about how it can only be understood by climate scientists. This is a red flag. If you expect me to put resources against your problem, you had better be able to say why. Saying that the risk is too big if we don’t is not enough. Saying that it is encoded in a model that should determine public policy but is not available to the public is not enough.

    [Response: Let me get this straight. You come to a website that has working climate scientists trying to explain climate science to the public and you think our only point is to tell people they can't understand? Please go back and think about it for more than a microsecond. - gavin]

    4) Psychoanalysis. I can’t imagine why you guys think this is convincing to anyone. This may shock you but senior engineers have seen people replace technical arguments with an analysis of the motives of their opponents many, many times. It isn’t the sign of a serious scientist. It’s the sure sign of a politician.

    5) “It’s basic physics – until it isn’t.” To listen to some parties, AGW is as fundamental as Newtonian physics, basic thermodynamics, the Copernican solar system and evolution. “But if you want to get reliable numbers – if you want to know whether raising the level of greenhouse gases will bring a trivial warming or a catastrophe – you have to figure in humidity, convection, aerosol pollution, and a pile of other features of the climate system, all fitted together in lengthy computer runs.” If the quote is true stop comparing people who don’t believe AGW to people who don’t believe in some basic science. Basic science can be taught to teenagers. AGW apparently involves computer runs not available to everyone.

    [Response: The greenhouse effect is basic science. The increases in GHGs due to human activities is basic science. The observed warming is basic science. The attribution of warming to increasing GHGs is slightly more sophisticated science. The long term impacts of further increases is sometimes clear, sometimes uncertain. People discussing the latter two are in a completely different category than people disputing the first three. - gavin]

    When Boeing and MD starts doing any of the above in response to FAA requests, I’ll start buying bus tickets. When AGW people stop all of the above, I’ll begin to consider the possibility that the science is settled or that there’s any science at all.

    [Response: Settled/not settled is false binary distinction much beloved of certain think-tanks. Nothing is ever completely settled, and yet much is understood. - gavin]

    Comment by Joe Triscari — 9 Sep 2008 @ 11:17 PM

  124. RE #117:
    I said:

    “In fact, according to Roe and Baker, the expected sensitivity is infinite”

    Response: Well, since that is obviously nonsense perhaps you should go and read the paper again. – gavin]

    Ok, derivation:

    Roe and Baker (RB) model the “total feedback factor” f as a gaussian. Climate sensitivity is proportional to 1 / (1-f). Obviously, we should bound f to be between 0 and 1, which a gaussian doesn’t do, but we can just truncate and rescale to get a valid probability density function for f, which will still have the functional form of a gaussian for f between 0 and 1.

    So, the expected climate sensitivity is proportional to the integral from 0 to 1 of 1 / (1-f) * N(f), where N(f) is proportional to the normal density function.

    Now, since f is bounded between 0 and 1, N(f) is bounded below by some number greater than zero, call it K. Hence the expected climate sensitivity is greater than some (positive) constant times the integral from 0 to 1 of 1 / (1-f).

    But the integral of 1 / (1-f) is -log(1-f). So the integral from 0 to 1 of 1 / (1-f) is -log (1-1) + log (1-0) = -log(0) = infinity.

    Thus, the expected climate sensitivity under the RB model is infinite.
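
    A quick numerical check of the divergence, assuming (as above) f ~ N(0.7, 0.14) restricted to [0, 1) and a no-feedback sensitivity of 1.2 C; the truncated expectation keeps growing as the upper cutoff approaches 1 (renormalizing the truncated density only rescales by a constant, so it does not change the conclusion):

        import math

        def normal_pdf(x, mu=0.7, sigma=0.14):
            return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

        S0 = 1.2  # assumed no-feedback sensitivity, deg C per doubling
        for cutoff in (0.9, 0.99, 0.999, 0.9999):
            n = 200000
            h = cutoff / n
            # crude midpoint integration of S0/(1-f) * N(f) over [0, cutoff]
            total = sum(S0 / (1.0 - (i + 0.5) * h) * normal_pdf((i + 0.5) * h) * h for i in range(n))
            print(cutoff, round(total, 2))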

    [Response: The mean of a Cauchy distribution is undefined, but the probability of lying between any two values is bounded (and less than 1) - it is therefore your corollary that is nonsense. -gavin ]

    Comment by mugwump — 9 Sep 2008 @ 11:23 PM

  125. DOE has announced that it will provide 10 million hours of time on three of its most powerful computers (ANL, ORNL, LBNL) to NOAA for further exploration of advanced climate change models. This should help improve the models as well as help the scientists understand both the models and the climate change effect.

    http://www.doe.gov/news/6517.htm

    Comment by Dan O’Donnell — 10 Sep 2008 @ 12:39 AM

  126. rxc writes:

    I am not a climate scientist, but if the climate scientists cannot produce documentation like this, then their claims cannot be believed.

    I agree. And if physicists can’t produce that kind of documentation for computer models of gravitons, I think we can forget about that silly “gravity” business.

    Comment by Barton Paul Levenson — 10 Sep 2008 @ 4:14 AM

  127. Re #120: Well, now that you have found RealClimate you should read some of the articles and then form a more objective opinion. Maybe you could start by mentioning the things that stop you from supporting action on CO2 emissions?

    Comment by pete best — 10 Sep 2008 @ 4:22 AM

  128. mugwump writes:

    Now, GCM modelers don’t necessarily estimate the free parameters by directly inferring them from model behaviour, but they certainly adjust the parameters to get the model output to “fit” the 20th century instrumental record (it could be argued that the fit between models and the 20th century instrumental record is the sine qua non of progress in the field).

    GCMs are not statistical models fit to climate time series. When a GCM is revised, it’s because a better way has been found to represent some part of the physics.

    You were told this, in some detail, on Deltoid a couple of months ago. Yet you keep repeating the same false idea. Why is that?

    Comment by Barton Paul Levenson — 10 Sep 2008 @ 4:25 AM

  129. Yoron writes, quoting a math site:

    Non-linear equations can be extremely difficult, and are often insoluble.

    You can’t conclude from that that you can’t use those equations. Equations which can’t be solved can often be made to give highly accurate answers by means of numerical successive approximations or integrations. An example would be the Planck law for radiation — it’s an integral that can’t be solved. Nonetheless, it’s easy to write a numerical integration that can give you answers accurate to as many decimal places as you want. Even I’ve done it.
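
    For instance, a minimal sketch of that kind of numerical integration (the temperature and wavelength limits are just illustrative choices), recovering the Stefan-Boltzmann result from the Planck law:

        import numpy as np

        h, c, k = 6.62607e-34, 2.99792e8, 1.38065e-23   # SI constants

        def planck(lam, T):
            # Planck spectral radiance B(lambda, T), W m^-2 sr^-1 m^-1
            return (2.0 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T))

        T = 288.0                                        # illustrative temperature, K
        lam = np.logspace(-7, -3, 20001)                 # 0.1 micron to 1 mm
        flux = np.pi * np.trapz(planck(lam, T), lam)     # hemispheric flux, W m^-2
        print(flux, 5.6704e-8 * T**4)                    # agrees with sigma*T^4 (~390 W m^-2)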

    Comment by Barton Paul Levenson — 10 Sep 2008 @ 4:29 AM

  130. TEBB posts:

    a list of your favorite science books for non-science (i.e. liberal arts majors who don’t work in science but find it interesting) would be wonderful to have. I have a lot of good evolution and biology books, but none on global warming because I’m afraid of accidentally getting something outside the climatologists’ consensus.

    Some good popular discussions with very little math are:

    Philander, S. George 1998. “Is the Temperature Rising?”

    Weart, Spencer 2003. “The Discovery of Global Warming.”

    Both are thoroughly within the scientific consensus.

    Comment by Barton Paul Levenson — 10 Sep 2008 @ 4:34 AM

  131. Well, 100 years of warming doesn’t mean it hasn’t been cooling over the last 3 million years either.

    For instance, why did the length of the glacials increase from 40 kyr to 100 kyr over the last million years?

    We actually need this warming; nobody likes glacials.

    Comment by rutger — 10 Sep 2008 @ 4:36 AM

  132. What if you had overlooked something really important.
    http://hps.elte.hu/zagoni/Proofs_of_the_Miskolczi_theory.htm
    I don’t really expect an answer but i found this paper more than appropriate to raise serious doubts about the validity of the explanations you usually give.
    Best regards.

    [Response: Curious. You'd rather believe that an obscure website has found a trivial algebraic flaw that thousands of other scientists had missed. A flaw so fundamental that all radiative transfer calculations - including those for all the satellite imagery people are using to track sea ice and hurricanes - must be wrong. Hmm.... or Miskolczi made a mistake in his derivation. Which is more likely? - gavin]

    Comment by Mystified — 10 Sep 2008 @ 4:47 AM

  133. Isn’t there any step between a simple equation and a global climate model? Like, is something simpler modelled, such as how an air column cools on a windless night in spring vs. autumn, when the CO2 concentrations are slightly different? Or comparing what the results are now with what they will be in 5 years with higher CO2 levels?

    [Response: There are intermediate steps - like radiative-convective models used by Manabe, but simple weather or seasonal analogs? no. - gavin]

    Comment by andy — 10 Sep 2008 @ 5:48 AM

  134. Hubris, #120. What makes you reach that conclusion? What poll did you make concerning the worldwide opinion? Why would the world think the same as you?

    Comment by Mark — 10 Sep 2008 @ 5:50 AM

  135. re: Gavin’s response to 119.

    Actually, 2 is well within an order of magnitude of 4.

    So is 10.

    The difference Al gets is because he’s not *accurately* accounting for the change, but it is about right. What may also be a problem for his calculation is ignoring the water-bound side of the equation. With both CO2 and H2O obstructing IR radiation, the height above sea level of one optical depth at that wavelength is a lot higher. Higher = cooler. Cooler = less re-radiation and so more entrapment of that energy within the system.

    This is why “IR bands are completely absorbed already so more CO2 can’t affect the warming” is wrong.
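
    A back-of-the-envelope version of the “higher = cooler” point (all numbers here are assumed for illustration: a 255 K effective emission temperature, a 6.5 K/km lapse rate, and a nominal rise of the mean emission level):

        sigma = 5.67e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
        T_emit = 255.0       # assumed effective emission temperature, K
        lapse = 6.5          # assumed lapse rate, K per km
        rise_km = 0.15       # nominal rise of the mean emission level, km

        dT = lapse * rise_km             # emission level becomes ~1 K cooler
        dF = 4 * sigma * T_emit**3 * dT  # reduction in outgoing flux, W m^-2
        print(dT, dF)                    # roughly 1 K and a few W m^-2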

    ‘course this breaks down in ways stellar physics doesn’t bother with when the optical depth passes beyond the tropopause.

    Comment by Mark — 10 Sep 2008 @ 6:12 AM

  136. re: # 84

    rxc was referring to computer programs developed for analyses of physical phenomena and processes occurring within the reactor; not to the equipment components themselves. The models must predict the response of the entire nuclear steam supply system under both steady operating conditions and off-normal upset conditions.

    By the same token, the design of the equipment components must take into account the responses of the system. The components must be designed to accommodate the most extreme conditions anticipated to occur during the life of the system. And this is true for almost all engineered systems in general. The result being that in fact this statement is not correct:

    Each piece put into a nuclear power plant is specified in advance and you know what it’s made of.

    And the statement is a mischaracterization of the design process for almost all engineered systems; passenger aircraft, space shuttles, bridges, elevators, etc. being additional important examples.

    Also re: #101

    Therefore, no one should *ever* fly until they’ve gotten complete disclosures of all source code, test cases, model runs, etc from Boeing, MacNeal-Schwendler, etc.

    Complete disclosures of all the information listed has occurred and been subjected to Independent Verification and Validation and other Software Quality Assurance procedures and processes. These activities are required by federal law, and equally important, are simple good sense for all calculations that have potential impact on the health and safety of the public.

    Do you seriously think that any aircraft or automobile company would not follow the most strict requirements necessary to ensure the reliability of all calculations?

    The problem, and it’s a critically significant problem, is that we continue to read that Climate Science seems to be the only community that cannot perform these fundamental tasks, while at the same time the calculations performed by the community have the potential to impact the health and safety of almost everyone on the planet.

    [Response: Dan, repeating the same thing a hundred times does not make it true. If you want to campaign to double the climate modeling budget so that we can employ an extra thousand code checkers, great. But funders have made it absolutely clear that they prefer that we focus on science (we are scientists after all) and that no doubling of funding is on the horizon. The priorities are reversed in the nuclear industry and so they do things differently. If you want to check through our code, go ahead - nightly snapshots of our repository are on our website. All bug fixes will be greatly welcomed! In the meantime, feel free to ignore all modelling and instead base your projections on the tea leaves in your cup. - gavin]

    Comment by Dan Hughes — 10 Sep 2008 @ 7:39 AM

  137. mugwump, #124.

    If ANY point in that distribution has a probability of 1, then EVERYWHERE ELSE MUST be zero.

    Hence your mathematics is incorrect.

    Either there’s no “1” and your log(0) isn’t there, or there IS a 1 and the average is the same as the value in the range that has a probability of 1. Which still isn’t infinity.

    Hell, I’ve only got an A level in maths and I know what you did was wrong!

    Comment by Mark — 10 Sep 2008 @ 8:21 AM

  138. You know, with the skeptics the debate reminds me a lot of the one surrounding smoking.

    You can’t prove smoking causes cancer. After all, people got lung cancer before cigarettes became widespread. And the correlation between smoking and lung cancer is just that, a correlation.

    I could even point to the fact that lung cancer rates in France weren’t much higher than those in areas of the U.S. where smoking was much more restricted.

    As late as the 1980s cigarette companies were saying that smoking wasn’t a problem. They could even trot out a few experts to say so.

    But I don’t think any serious person thinks smoking is not unhealthy and a major cause of many diseases.

    The real issue, I think, is that there’s a segment of the population that is very, very unhappy with certain changes in their lifestyles that are just going to have to happen. Sorry, you just won’t be able to drive a Hummer that gets 2 gallons to the mile. What is interesting to me is that it is just so important to have consumer “choice” that all else takes a back seat. Mugwump’s comment about environmentalists’ political goals was revealing. What are those goals exactly?

    The fact is there are plenty of economic controls we accept because we (as a society) decided that not doing so would be really, really bad. We control the labor market by not allowing slavery, for instance, because even though it might be terribly efficient for certain industries (it actually worked pretty well for agriculture) the human cost is just too high. Similarly, we don’t allow cigarettes to be sold via vending machines in grade schools. Nor do we allow factories to simply poison their workers with the idea that if the products are cheaper, then it’s ok.

    Maybe it’s because I live in New York, and haven’t owned a car for some time. I do not see the need to have an SUV. I’m quite confident in my manhood, thanks. (I used to drive a 1986 Toyota Corolla, which gave me fantastic gas mileage for more than a decade — a tank got me all the way from Boston to Rochester, NY on a regular basis).

    Or maybe I’m too cautious. I figure if there’s a nonzero chance, say 30%, that sea levels will rise enough to put a chunk of Florida real estate underwater, then it is far better to take steps that are sort-of-pricey-but-manageable now than to wait until things get even more expensive. I’m weird, perhaps, or maybe it’s a tree-hugger thing.

    Many “skeptics” don’t seem to think CO2 can have that big an effect. Let me remind some of them that it was the work done studying the atmospheres of other planets as well as the Earth that helped get the science going. Want to see how big an effect CO2 has? Go to Mars. The blackbody temperature of the planet should be something around 228 K (I haven’t done the math in a while, if someone has a better answer let me know, I used the minimum solar distance for Mars). It bottoms out at about 180 K when the planet is farther from the sun.
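
    The usual zero-greenhouse estimate, for anyone who wants to redo that math (the distances and albedos below are assumed round numbers, and the answer moves with both):

        sigma = 5.67e-8                   # Stefan-Boltzmann constant, W m^-2 K^-4
        S_earth = 1361.0                  # solar constant at 1 AU, W m^-2

        def t_equilibrium(dist_au, albedo):
            S = S_earth / dist_au**2                      # solar flux at that distance
            return (S * (1.0 - albedo) / (4.0 * sigma)) ** 0.25

        print(t_equilibrium(1.38, 0.15))  # Mars near perihelion, low albedo: ~227 K
        print(t_equilibrium(1.52, 0.25))  # Mars at mean distance: ~210 K
        print(t_equilibrium(1.00, 0.30))  # Earth, for comparison: ~255 K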

    But the actual average temperature on Mars gets as high as 273 K. It averages just above the 228 K figure, usually in the 230s.

    That’s because Mars has an atmosphere, made up almost entirely of CO2. It’s really, really thin — on the order of 1% of Earth’s.

    Yet the CO2 provides measurable forcing to Mars’ climate.

    Now, if such a small amount of CO2 provides that much heating over there, then it’s reasonable to assume that it does so here as well. Even in small concentrations CO2 packs a bit of a punch. Mars has much more CO2 in its atmosphere than Earth does (pound for pound as it were), of course, but it’s notable that with an atmosphere that’s near nonexistent you get 2 degrees K or thereabouts. That’s a lot.

    To compare, the Earth’s temperature “should” be 255 K, or -18 C on average. Water vapor, CO2, methane, and the other greenhouse gases make it much balmier than that. And calculating the effects of each of these is complicated, but not that hard (assuming a static atmosphere). Space scientists have done this for years to make sure the models for other planets were in the ballpark and to get an idea what to look for.

    Dump gigatons of CO2 into the atmosphere faster than it can cycle out and it seems elementary you’d get a huge effect in short order (on geologic timescales). The simplest thing to do is stop doing so much of it. After all, we don’t use rivers as open sewers in most places anymore, even though we could always find another river, and some rivers are so big that we could never pollute them to the point where they do things like catch fire, right?

    Comment by Jess — 10 Sep 2008 @ 8:29 AM

  139. Thanks for the very reasoned response, gavin.

    The use of “an extra thousand code checkers” totally validates my characterization of the Climate Change Community’s approach to these issues.

    To begin the task that you suggest that I do, I need the documentation for the model equations and the coding. Can you point me to the peer-reviewed publication that contains the final form of the continuous equation for the momentum balance in the vertical (radial) direction and the as-coded form of the discrete approximation to that equation? I’ll need additional documentation for the actual coding.

    [edit]

    Thanks again.

    [Response: Sure. Send me enough money so that I can employ a programmer and documentation expert to make it easy for you. Or you could actually try looking up references. If that isn't possible check out NCAR's code instead. They have about 4 times the staff we do and a very nice online manual. - gavin]

    Comment by Dan Hughes — 10 Sep 2008 @ 8:48 AM

  140. Dan Hughes, so where is the code used to discover the Top quark archived? How about the code used to analyze the seismic waves and come up with the Earth Reference Models? How many people pored over that code looking for errors outside of the groups that did the work?
    Science works by different rules. Analyses are conducted independently, and it is critical to the methodology that the analysis methods–including code–remain independent as much as possible.
    The problem in my opinion is that we don’t have a discipline of “climate engineering,” that is, people whose job it is to figure out how bad things COULD become and to figure out whether we can engineer against them or whether the consequences are so severe that the threat must be avoided. This is the case in part because the potential impacts of climate change are so wide-ranging that such an engineering effort is daunting indeed. More to the point, we don’t have such an effort in place because many continue to deny very cogent scientific evidence that climate change poses a credible threat. Maybe if you and your buddies accept the science, we can get your wish of reliable engineering code to mitigate climate change.

    Comment by Ray Ladbury — 10 Sep 2008 @ 8:58 AM

  141. [Response: The mean of a Cauchy distribution is undefined, but the probability of lying between any two values is bounded (and less than 1) - it is therefore your corollary that is nonsense. -gavin ]

    Oh come on, I corrected the corollary immediately at #118. Both #117 and #118 were in the moderation queue simultaneously, so you knew I had corrected it when you made your “nonsense” claim.

    And it is worse than “the mean of a Cauchy is undefined”. A Cauchy distribution is symmetric: it has no mean because the mean is essentially “infinity – infinity”, which is undefined. Roe and Baker’s model is not a Cauchy. It is not symmetric (that’s the point). In fact the probability of sensitivities less than 1.2C is zero under their model (assuming feedback is positive). The expected climate sensitivity under their model is simply infinity (not “infinity – infinity” = undefined).

    Back to your comment on the realclimate article on Roe and Baker:

    [... The [realclimate] statement you quote does not mean that more 4.5 deg C is likely, it just says that it is more likely than less than 2 – both numbers are small. – gavin]

    RB estimate the parameters of their distribution from a bunch of published GCM studies. They come up with a number of different values, but they are all clustered around a mean f of 0.7 and a standard deviation of 0.14.

    Plugging those numbers into their distribution, I get a 40.5% probability of climate sensitivity greater than 4.5C, and a 0.27% probability of climate sensitivity less than 2C. That makes 4.5C 150 times more likely than 2C under their model, and certainly not “small” as you claim. [plugging in other values for mean and standard deviation from their paper yields similar results]
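
    For anyone who wants to check numbers of this kind, the calculation is straightforward once the assumptions are fixed: a no-feedback sensitivity S0, a mean and spread for f, and a treatment of the f >= 1 tail. The values below are assumptions, and different choices (including renormalizing after truncating at f = 1) will shift the two tail probabilities:

        from math import erf, sqrt

        def normal_cdf(x, mu, sigma):
            return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

        S0 = 1.2            # assumed no-feedback sensitivity, deg C per doubling
        mu, sd = 0.7, 0.14  # assumed mean and standard deviation of the feedback factor f

        # S = S0/(1-f): S > 4.5 corresponds to f > 1 - S0/4.5, and S < 2 to f < 1 - S0/2
        p_above_45 = 1.0 - normal_cdf(1.0 - S0 / 4.5, mu, sd)
        p_below_2 = normal_cdf(1.0 - S0 / 2.0, mu, sd)
        print(p_above_45, p_below_2, p_above_45 / p_below_2)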

    Comment by mugwump — 10 Sep 2008 @ 9:36 AM

  142. Ray Ladbury, where are the applications of the results of the code used to discover the Top quark that have the potential to impact the health and safety of the public?

    Lacking any such applications, I don’t need to know anything about that code. Or any other codes the results from which do not have the potential to impact the health and safety of the public.

    [Response: Ah, but someone has claimed the LHC will destroy the world. The consensus of scientists says it's bollocks. But have you examined their code? Checked their calculations? Examined their assumptions? Why not? Surely the whole world is at stake! - gavin]

    Implementation of mitigation and adaptation policies is truly ‘above my pay grade’ for all my buddies and me. That power resides in Washington DC for the USA.

    Comment by Dan Hughes — 10 Sep 2008 @ 9:37 AM

    “Actually, the probability that the sensitivity is greater than 4.5C is not unity under Roe and Baker (my mistake), but the expected sensitivity is infinite, and the greater than 4.5C probability is around 10 times the less than 2C probability.” – mugwump@118. This is of course entirely compatible with what Gavin has been saying: the >4.5C probability can be as small as you like, as long as it is positive, and the statement you make can still be true. That’s the thing about highly skewed distributions: the mean doesn’t tell you much, and can be infinite.

    Comment by Nick Gotts — 10 Sep 2008 @ 9:44 AM

  144. RE #137 Mark,

    If ANY point in that distribution has a probability of 1, then EVERYWHERE ELSE MUST be zero.

    Nowhere in the distribution has a probability of 1. The expectation is the integral of the climate sensitivity multiplied by the probability density function (pdf). The pdf stays well defined, but the climate sensitivity diverges as 1 / (1-f) as f approaches 1. It’s what those in the theoretical physics world call a “logarithmic divergence”.

    Hell, I’ve only got an A level in maths and I know what you did was wrong!

    I’ll see your A level and raise you a PhD plus a bunch of publications :-)

    Comment by mugwump — 10 Sep 2008 @ 9:49 AM

  145. Well count me as another Senior Engineer and skeptic. And involved in modelling as well (LS-DYNA). What really makes me laugh is the assumption that, what with being unable to provide a direct physical exposition or any empirical proof, the burden falls on skeptics to show why we shouldn’t act.

    Here’s a little tip for those of you lounging in comfort in the ivory towers of academia – abundant fossil fuels underpin our entire civilisation and your entire way of life. You eat well, live well and have the life expectancy and health that you do entirely because of free and available energy. The burden is on you to provide the proof. And if I hear one more fool hand-wave the issue away with some comment about ‘solar and wind’, I will lose it. To answer Ray Ladbury’s “Doctor, Doctor, it hurts when I raise CO2 levels.”, I say it doesn’t hurt one whit. He can go back to the 1700s and then we’ll see how much it ‘hurts’. Unless he defines ‘hurt’ as ‘comfort, wealth and prosperity’.

    Am I worried about global warming? A little, if at all. One hundred and fifty years ago, no cars, no planes, and oil was a black ooze that came out of the ground, not an energy source. Sixty years ago, uranium was a funny rock that glowed in the dark. A hundred years from now, who knows? And who cares? Do you think that if some genius modelled the greenhouse effect in 1750, then advised everyone not to bother with the industrial revolution in order to ‘save their children’, we’d be better off now?

    Personally, I advise you all to grow a pair, and have some faith in the human race. Full steam ahead. Let us grow wealthier, stronger, and smarter, and *if* this turns out to be a problem we’ll solve it.

    Comment by Greg — 10 Sep 2008 @ 9:50 AM

  146. RE #128:

    GCMs are not statistical models fit to climate time series.
    When a GCM is revised, it’s because a better way has been found to represent some part of the physics.

    Why do they revise the models? Because they don’t explain all the data (including climate time series, but also all other kinds of data).

    Do you really think when the scientists are revising the models they do so only on the basis of some abstract physical first principles? Of course they don’t. And nor should they. They look at the data, construct physical models to explain the data, test the models, make modifications, etc etc. That’s the way it works and that’s the way it should work.

    Outside of pure thought exercises like Einstein’s general relativity, there is very little science that is not data driven.

    You were told this, in some detail, on Deltoid a couple of months ago. Yet you keep repeating the same false idea. Why is that?

    Gee, I don’t know BPL, maybe because it’s not false.

    [Response: No. It is. The data and ideas that come in to drive changes in code are generally based on single processes - the scattering of aerosols, the thermodynamics of ice, the rheology of ice sheets, the impact of eddies in the ocean, and observations over a very small number of years or campaigns. The idea is that by improving the physics at the level of the single process, the emergent properties improve also. So far, and for most processes this turns out to be a correct assumption. It is not the same as twiddling with processes in order to improve the fidelity to the 20th century trend. Read Schmidt et al (2006) (and I know it's dull), but point to me the code changes and development that are discussed that have anything to do with trends. You won't be able to, because that isn't how it works. - gavin]

    Comment by mugwump — 10 Sep 2008 @ 9:59 AM

  147. Greg says: “Well count me as another Senior Engineer and skeptic. And involved in modelling as well (LS-DYNA).”

    OK, I’ve incremented my ignorant-food-tube tally by one.

    As to his suggestion: “Personally, I advise you all to grow a pair,…”

    Personally, I’ve never seen much advantage in growing man-teats, Greg, but I hope you’re happy with yours.

    Comment by Ray Ladbury — 10 Sep 2008 @ 10:15 AM

    “Personally, I advise you all to grow a pair, and have some faith in the human race.” Greg@145

    Greg, when making important decisions about the future of humanity, I consider reason a better guide than faith, and brains more useful than testes.

    Comment by Nick Gotts — 10 Sep 2008 @ 10:39 AM

  149. Greg,

    There’s a rather large amount of empirical proof of AGW. You could start with http://www.skepticalscience.com/empirical-evidence-for-global-warming.htm.

    It’s odd to demand proof from the scientists on warming and offer only faith in the human race in return.

    Comment by walter pearce — 10 Sep 2008 @ 10:40 AM

  150. Greg wrote: “What really makes me laugh is the assumption that, what with being unable to provide a direct physical exposition or any empirical proof, the burden falls on skeptics to show why we shouldn’t act.”

    There is both a “direct physical exposition” and abundant empirical evidence for anthropogenic global warming. You are merely proclaiming your ignorance of these facts, which is not particularly interesting.

    Greg wrote: “abundant fossil fuels underpin our entire civilisation and your entire way of life. You eat well, live well and have the life expectancy and health that you do entirely because of free and available energy”

    In what reality are fossil fuels “free”? Have you noticed that you have to pay for gasoline at the pump, they don’t give it away?

    On the other hand, wind and solar energy are actually free, far more abundant than fossil fuels, and won’t run out on any time scale that is important to human civilization.

    Greg wrote: “And if I hear one more fool hand-wave the issue away with some comment about ’solar and wind’, I will lose it.”

    You are evidently as ignorant about energy as you are about climate science. The USA has vast commercially exploitable wind and solar energy resources, more than enough to produce several times as much electricity as the entire country uses, with today’s technology.

    Comment by SecularAnimist — 10 Sep 2008 @ 10:44 AM

  151. I’m back catching up on reading comments today and as a layman looking at this entire thread or whatever it is called, this comment by Mashey seems to hit the nail on the head as to what I see happening here from a human behavior standpoint:

    “…good interdisciplinary work means building bridges from one part to another, not rushing over to another piece with dynamite and trying to blow it up.”

    It seems Mugwump and others are strong in their own respective fields and are making the mistake of rushing over to climatology with dynamite, instead of taking the time to learn climatology.

    Comment by TEBB — 10 Sep 2008 @ 10:45 AM

  152. Apologies to all engineers! I only meant to report an observation: of those who emailed me complaining that they couldn’t find a simple scientific calculation of the greenhouse effect on the Web, a surprisingly high fraction identified themselves as engineers, and senior in the sense of “experienced,” “retired” or the like. This is a small sample and certainly not characteristic of the vast majority of engineers.

    Comment by Spencer — 10 Sep 2008 @ 11:10 AM

  153. RE #143:

    Nick Gotts, my original estimate was from eyeballing the graphs in Roe and Baker. I actually calculated the answer at #141. There’s no way to justify Gavin’s interpretation that > 4.5C and < 2C are both unlikely under their model. > 4.5C has probability of 0.4. < 2C has probability 0.0027.

    Comment by mugwump — 10 Sep 2008 @ 11:18 AM

  154. Gavin #146:

    The idea is that by improving the physics at the level of the single process, the emergent properties improve also.

    Sure, hence why I said “Why do they revise the models? Because they don’t explain all the data (including climate time series, but also all other kinds of data).”. The “all other kinds of data” I was referring to includes the single processes.

    It is not the same as twiddling with processes in order to improve the fidelity to the 20th century trend.

    As above, I am not claiming that is exclusively what is done.

    Read Schmidt et al (2006) (and I know it’s dull), but point to me the code changes and development that are discussed that have anything to do with trends. You won’t be able to, because that isn’t how it works. – gavin]

    Come on gavin, you’re not seriously claiming that you don’t have a fraction of an eye on temperature trends when you’re modifying the models? If a model change produces a cooling trend over the 20th century, you never take a second look?

    [note, there is absolutely nothing wrong with doing this. It would be wrong not to use all the available data to verify the models.]

    [Response: But the warming over the 20th Century is driven by the forcings, not the model. If suddenly CO2 in the model ceased to absorb IR, then obviously we'd examine what changed. But that would radically affect present-day climate independently of what implication it would have for the trend. Almost all of the changes described in the paper would presumably affect the sensitivity, but we generally don't even test that after every little change. The changes are driven by the fidelity to the process, and the emergent properties of the present day climatology (strength of the Hadley circulation, percent ice cover, cloud radiative forcing). Really, the models are not tuned to the trend. - gavin]

    Comment by mugwump — 10 Sep 2008 @ 11:27 AM

  155. Greg, that was the funniest parody rant I’ve read in a long time.

    What, you were serious?

    Comment by Jim Eager — 10 Sep 2008 @ 11:48 AM

  156. Mugwump, you seem to be confused as to the type of modeling that is going on here. Climate models are not statistical, but dynamical. Statistical models’ parameters are fit to the data to achieve best agreement. Parameters in dynamical models are determined independently, and then the performance of the model is judged by how well it reproduces the trends seen in the data. If you do not understand this distinction, you will reach incorrect conclusions about the reliability of the process.
    The reason the distribution over possible sensitivities is asymmetric is that the data are more forgiving of a high sensitivity (>4.5 degrees per doubling) than of a low one (<2 degrees per doubling).

    Comment by Ray Ladbury — 10 Sep 2008 @ 12:42 PM

  157. Re #144

    “Nowhere in the distribution has a probability of 1. The expectation is the integral of the climate sensitivity multiplied by the probability density function (pdf). The pdf stays well defined, but the climate sensitivity diverges as 1 / (1-f) as f approaches 1. It’s what those in the theoretical physics world call a “logarithmic divergence”.”

    As f approaches 1. Never gets there. Hence your log 0 is incorrect.

    [edit]

    Comment by Mark — 10 Sep 2008 @ 1:02 PM

  158. Further to Nick at #143.

    Consider a distribution that has 3.5 as its mean. The dice roll of a valid d6. Flat. And goes no further than 6.

    Now take a Poisson distribution. It may also have a 3.5 mean. But it has a fat tail.

    Now when it comes to CO2 sensitivity, it cannot be negative. There’s very much an absolute minimum. What’s the maximum? Well, unless you model it, infinite, really. So the mode, median and mean are all different, and each tells you something.

    If CO2 sensitivity is low, then our actions have a smaller but still detrimental effect. If it is high, the detrimental effects happen quicker and are more widespread.

    So the “harm” is not linear as sensitivity goes up but goes up exponentially.

    So, instead of looking for the minimum change, you’d be better off looking for the maximum harm.

    Much like they do in civil engineering. They don’t build bridges with a 10% margin for error; they work out the WORST that could happen and then double the protection against THAT.

    Comment by Mark — 10 Sep 2008 @ 1:09 PM

  159. Really, the models are not tuned to the trend. – gavin]

    Maybe we just have a terminology difference.

    Way upthread I said: “Now, GCM modelers don’t necessarily estimate the free parameters by directly inferring them from model behaviour, but they certainly adjust the parameters to get the model output to “fit” the 20th century instrumental record

    By “parameters” I mean any input to the model for which a value is not already precisely determined. That includes uncertain physical constants, but also boundary conditions such as aerosol forcing at each point in time. Note also that “adjusting the parameters to fit the 20th century record” does not have to involve heavy duty parameter tweaking and re-estimation. It can be as benign as picking one parameter value over another because it gives a roughly more plausible shape to the model outputs.

    The question of whether models are tuned (at least to some extent) to the 20th century record seems to have been substantiated by Kiehl, which Timo Hämeranta linked to in #78 above [I had read this previously]:

    Kiehl, Jeffrey T., 2007. Twentieth century climate model response and climate sensitivity. Geophys. Res. Lett., 34, L22710, doi:10.1029/2007GL031383, November 28, 2007

    “Climate forcing and climate sensitivity are two key factors in understanding Earth’s climate. There is considerable interest in decreasing our uncertainty in climate sensitivity. This study explores the role of these two factors in climate simulations of the 20th century. It is found that the total anthropogenic forcing for a wide range of climate models differs by a factor of two and that the total forcing is inversely correlated to climate sensitivity. Much of the uncertainty in total anthropogenic forcing derives from a threefold range of uncertainty in the aerosol forcing used (i.e. tuned) in the simulations…”

    Maybe GISS does not suffer from this, but it would be very difficult not to.

    Comment by mugwump — 10 Sep 2008 @ 1:17 PM

  160. RE Mark #157:

    As f approaches 1. Never gets there. Hence your log 0 is incorrect.

    You can write it as a limit if you want. You still get no upper bound on the expected climate sensitivity from Roe and Baker’s model.

    Now, you could argue that allowing f to range all the way to 1 is unphysical, and it should be limited to some number smaller than 1. But that’s modifying the gaussian distribution on f. If you go down that road you also have to address the extreme skewness in the distribution on climate sensitivity induced by a Gaussian distribution on f. It’s just not plausible, so the distribution on f should be very different from gaussian. But once you do that you pretty much invalidate the whole paper.

    Comment by mugwump — 10 Sep 2008 @ 2:04 PM

  161. RE: #145

    And if I hear one more fool hand-wave the issue away with some comment about ’solar and wind’, I will lose it.

    Many people post comments here to the effect that we can’t move away from carbon-intensive energy generation. Sometimes it’s as a taunt, such as “why do you believe climate models but not economic ones.” I’ve been asking for their references, so far without success. Your statement is along these same lines, e.g. “abundant fossil fuels underpin our entire civilisation and your entire way of life” implies that we can’t change. Could you provide a reference so that I and others don’t think you’re the one “hand waving the issue away?” Thanks!

    Comment by Paul Melanson — 10 Sep 2008 @ 2:21 PM

    A friend of mine pointed me to this. Nice explanation of the complexity of the issue from Spencer. Not only is there carbon dioxide, we also have methane, nitrous oxide and the halocarbons. And then we throw in sun, wind, wind patterns, lapse rates, water vapor feedbacks, conduction, convection, chemical reactions and all the kinetic interactions with the gases that do not absorb infrared. Toss in land-use changes and assorted waste heat, aerosols and particulates on the ground, cloud behavior and the like, and it is quite a complicated mix. I would certainly expect that if energy in is roughly balanced by energy out over some time scale, after taking into account the heat storage capacity of the oceans et al, then since both the satellite and land/sea anomalies show an increased trend in the lower troposphere, something else would have to show a decreased trend. Such as the stratosphere does.

    What perplexes me is that there’s any conversation relating to the role the above-listed greenhouse gases play. The IPCC in AR4 is quite clear that they estimate the four main well-mixed gases provide a total of up to 2.8 Wm^-2. There seems little to argue about over that figure.

    We also know that our satellite and land/sea anomaly trends are up, and the evidence suggests that the greenhouse gases contribute some net percentage to this trend rise. The IPCC specifically attributes this observed rise in the near-surface numbers since the industrial era to the burning of ‘fossil fuels and land use changes’, along with the associated increases in population, urbanization, and technology.

    So rather than ask “What does doubling CO2 do?” the real two questions are perhaps “What does it take in total to make that radiative forcing number double?” and “What is the net effect in the system of raising the total from 2.8 to 5.6?”

    It seems the verbal sparring over the lag of the oceans, the appropriateness of the models, and the other factors involved is rather a distraction. It seems a common starting point would be that it’s unclear what another 1 or 2 or 5 or 20 Wm^-2 will do in reality, or what actual effect limiting anthropogenic greenhouse gas emissions will have on the enhanced greenhouse effect. Then it can be worked from there to specifics. The first part of finding out where you’re going is to determine where it is you’re at.

    Comment by Thomas Hunter — 10 Sep 2008 @ 3:38 PM

  163. Mugwump, where in that quote do you get any indication that sensitivity is fit to optimize agreement with the data? I see only discussion of uncertainties. Yes, you can get different values for forcings and for sensitivity–hence the range 2 K-4.5 K per doubling–to be consistent with trends. And different models produce slightly different results.
    It is clear that you really don’t understand how the modeling is done. Why not read some of the papers Gavin has recommended and learn it. At least then you wouldn’t be arguing against straw men of your own devising.

    On another note: You’ve said that a major factor in your opposition to the consensus science was the way “environmentalists” use it to promote their agenda. Well, by rejecting the science, aren’t you yielding it to them to do with as they will? Put another way, if Al Gore had been standing alongside John McCain or Jim Baker or even Pat Robertson, do you think he would have an Oscar and a Nobel Peace Prize now?

    Comment by Ray Ladbury — 10 Sep 2008 @ 3:48 PM

  164. Mugwump, There’s nothing magical about the Normal distribution other than its role in the Central Limit theorem–and here we are not concerned with central tendencies. Bayesian methods offer one method of dealing with the high-end tail. See for example:
    http://www.jamstec.go.jp/frcgc/research/d5/jdannan/prob.pdf

    I’ve pointed out in the past that the choice of Prior is arbitrary, and that we are still in the realm where the results are heavily influenced by the choice of Prior. However, you can’t blame the models–the data are the data. All you can do is try to gather more.
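
    A toy version of the prior-dependence point (the “likelihood” below is invented purely to show the mechanics; nothing in it comes from a real analysis):

    import numpy as np

    # Pretend some observation constrains the feedback factor f = 1 - S0/S
    # to 0.6 +/- 0.15 (one sigma), and compare two priors on sensitivity S.
    S0 = 1.2                               # K, assumed no-feedback sensitivity
    S = np.linspace(1.3, 20.0, 20_000)     # grid of S; the 20 K cap is itself
                                           # an implicit prior assumption
    f = 1.0 - S0 / S
    like = np.exp(-0.5 * ((f - 0.6) / 0.15) ** 2)

    dS = S[1] - S[0]
    for name, prior in [("uniform in S", np.ones_like(S)),
                        ("uniform in f", S0 / S**2)]:     # Jacobian |df/dS|
        post = like * prior
        post /= post.sum() * dS                           # normalise on the grid
        cdf = np.cumsum(post) * dS
        print(f"prior {name}: 95% upper bound ~ {S[np.searchsorted(cdf, 0.95)]:.1f} K")

    The likelihood is identical in both cases, yet the nominal 95% bound comes out very different – and note that the grid cap itself acts as part of the “uniform in S” prior, since without it that posterior would not even normalise. That is the sense in which the upper tail is still prior-dominated.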

    Comment by Ray Ladbury — 10 Sep 2008 @ 4:13 PM

  165. Greg (145) — There are already harms from the piddley global warming so far. As a good engineer, you will want to discover what these have been.

    [Captcha agrees, stating “in concerning”.]

    Comment by David B. Benson — 10 Sep 2008 @ 4:16 PM

  166. Gavin,

    Apropos reading tea leaves, how does “Model inadequacy is a serious barrier to the interpretation of model results for decision support” count?

    One of many critiques of climate modelling, as I am sure you know, in Stainforth et al., Phil. Trans. R. Soc. A (2007) 365, 2145-2161.

    [Response: For what? Some decision-makers want model output at the grid box level to be meaningful - but clearly, models are inadequate for that purpose. It does not mean that they are inadequate for all purposes. - gavin]

    Comment by Dave Andrews — 10 Sep 2008 @ 5:11 PM

  167. Mark (#135) said:

    “the height above sea level of one optical depth at that wavelength is a lot higher. Higher = cooler. Cooler = less re-radiation and so more entrapment of that energy within the system.”

    Yes, this is the main idea of AGW in general, higher = cooler = less OLR. But is this concept really true, when looking into details? The reason for my question was to confirm that the official AGW forcing comes from sideband broadening that covers about 7-14 cm-1 of IR band. At the same time, there is a 150 cm-1 wide range of CO2 absorption band where “it is saturated”. But does it really “saturated” and therefore does not contribute anymore? In fact, the emission boundary in 700 cm-1 band extends above tropopause, into stratosphere, where higher = warmer = more OLR! As result it is supposed to cause stratospheric cooling, a well admitted effect. Shouldn’t the cooling over a 10x-20x wider spectral window at least counter-balance the effect of mid-troposphere sideband forcing? That would easily explain the lack of observed “warming”. Of course, there are thousands of climate researchers around the world, they should not make this kind of omission, right?

    Comment by Al Tekhasski — 10 Sep 2008 @ 5:12 PM

  168. RE Ray Ladbury #163:

    Mugwump, where in that quote do you get any indication that sensitivity is fit to optimize agreement with the data?

    I assume you are referring to the Kiehl reference. I am not claiming sensitivity is optimized. Sensitivity is hard to control manually. It’s aerosol forcing that is optimized.

    Kiehl shows that the different climate models exhibit a wide range of sensitivities, but nevertheless successfully track the 20th century temperature record. How can they do that? By manually adjusting the aerosol forcings (which are up for grabs within a wide range). Whence Kiehl’s remark:

    … the total anthropogenic forcing for a wide range of climate models differs by a factor of two and that the total forcing is inversely correlated to climate sensitivity. Much of the uncertainty in total anthropogenic forcing derives from a threefold range of uncertainty in the aerosol forcing used …

    Comment by mugwump — 10 Sep 2008 @ 6:17 PM

  169. Spencer -
    too bad such explanations didn’t stick with G. H. W. Bush’s advisor John Sununu. As an engineer he ‘played around with’ the results of a climate model and convinced himself (and allies in the administration) the world needn’t be too concerned. The issues he brought up almost 20 years ago are still being promulgated by confusionists.

    The Saga of My Greenhouse Effect on the White House
    http://www.ncar.ucar.edu/archive/documents/manuscript/481_1990_greenhouse_saga.pdf

    Comment by J. Althauser — 10 Sep 2008 @ 6:21 PM

  170. RE #163 Ray Ladbury:

    On another note: You’ve said that a major factor in your opposition to the consensus science was the way “environmentalists” use it to promote their agenda.

    I don’t believe in this idea of “consensus science”. Is Roe and Baker, pretty much demolished in this thread, part of the “consensus science”?

    I am simply opposed to bad science, of whatever flavour (and yes, there is plenty of shockingly bad science from some prominent skeptics too).

    There does appear to be a strong selection bias in much of the literature that, in my opinion, is driven by a fairly pervasive environmentalist element amongst portions of the academic community. For example, had Roe and Baker been supportive of the skeptical view, I suspect it would have been scrutinized far more carefully by the realclimate folks (they would have no doubt eviscerated it using similar arguments to those I have put forward).

    if Al Gore had been standing alongside John McCain or Jim Baker or even Pat Robertson, do you think he would have an Oscar and a Nobel Peace Prize now?

    Of course not. Hollywood is wall-to-wall Democrat. And the Nobel peace prize committee ain’t exactly a right-wing think tank. What’s your point?

    Comment by mugwump — 10 Sep 2008 @ 6:28 PM

  171. “But the integral of 1 / (1-f) is -log(1-f). So the integral from 0 to 1 of 1 / (1-f) is -log (1-1) + log (1-0) = -log(0) = infinity.”

    Time to lighten up-
    Also, the (indefinite) integral of d(cabin)/cabin = natural log cabin + C.

    Comment by Lawrence Brown — 10 Sep 2008 @ 6:46 PM

  172. Re #164 Ray Ladbury:

    I’ve pointed out in the past that the choice of Prior is arbitrary, and that we are still in the realm where the results are heavily influenced by the choice of Prior. However, you can’t blame the models–the data are the data. All you can do is try to gather more.

    Well, the whole point of the Roe and Baker paper is to show that a normally distributed feedback factor f yields a very long-tailed climate sensitivity distribution. That’s fine, but such a long-tailed distribution clearly does not represent our true uncertainty concerning the climate sensitivity, so it’s unclear what the point of the exercise is.

    Comment by mugwump — 10 Sep 2008 @ 7:00 PM

  173. > pervasive environmentalist element

    Look, there are nitwits lacking science education on all spokes of the political wheel, some of them very far out from the center, very firm in their faith in what they believe. So what?

    If you come from that kind of position, and wish to discuss the world with people in the sciences — for example issues about ecology — you aren’t even talking the same language.

    Regardless of what your political position is, if you haven’t learned the science, you’re uneducated about the science and your faith and beliefs are just political positions.

    Avoid the “ists” and the “ians” and the “ites” — and learn from the “ologists” — and you’ll’ve made a start.

    Education. Not required for politics. Required for civilization, on the longer term, though.

    Comment by Hank Roberts — 10 Sep 2008 @ 7:09 PM

  174. Is Roe and Baker, pretty much demolished in this thread, part of the “consensus science”?

    As always, Mugwump conquers in his own mind. His genius is special, because he only demolishes mainstream science in the blogosphere, not in the scientific arena, where it counts.

    He’s also proven that he knows more about GCMs than those who write them. I’m very, very, impressed.

    Sorry, mugwump, you live in a fantasy world.

    Comment by dhogaza — 10 Sep 2008 @ 8:25 PM

  175. It’s always worth remembering that the only reason climate denialism has persisted so long is the low level of scientific education among both U.S. media reporters and the general public, by the way.

    It’s also worth noting that the basic notion was pretty solid back in 1978, and the general predictions have only been slightly modified since then.

    Try a 1975 paper, for example: Manabe & Wetherald, “The Effects of Doubling the CO2 Concentration on the Climate of a General Circulation Model.”

    An attempt is made to estimate the temperature changes resulting from doubling the present CO2 concentration by the use of a simplified three-dimensional general circulation model. This model contains the following simplifications: a limited computational domain, an idealized topography, no heat transport by ocean currents, and fixed cloudiness. Despite these limitations, the results from this computation yield some indication of how the increase of CO2 concentration may affect the distribution of temperature in the atmosphere. It is shown that the CO2 increase raises the temperature of the model troposphere, whereas it lowers that of the model stratosphere. The tropospheric warming is somewhat larger than that expected from a radiative-convective equilibrium model. In particular, the increase of surface temperature in higher latitudes is magnified due to the recession of the snow boundary and the thermal stability of the lower troposphere which limits convective heating to the lowest layer. It is also shown that the doubling of carbon dioxide significantly increases the intensity of the hydrologic cycle of the model.

    Guess it was just honest confusion on the part of the media for the past 30 years… hardly. It was deliberate propaganda aimed at preventing a large-scale shift away from fossil fuels and towards renewable energy, and it is still going on today.

    Comment by Ike Solem — 10 Sep 2008 @ 8:33 PM

  176. Mugwump, Aerosols adjusted by hand? No. However, there is considerable uncertainty in aerosol forcing, so a wide range is tolerated. One can equally look at this as a matter of skill of the model in predicting the right aerosol forcing.

    My point about Gore is that without his climate schtick, he’d be just another washed up politician. Because folks on the right have either rejected good science or refused to stand up and call for action upon that science, Al Gore has been left alone on that bully pulpit. The result: An Oscar, a Nobel Peace Prize and about 8% odds according to Vegas bookies that he’ll one day be president.

    Likewise, climate change has gained increasing acceptance–hell both major party candidates say it needs to be addressed, as well as folks from Al Sharpton to Pat Robertson. By continuing to reject the science–which it is clear you have not yet understood–you are merely leaving one more seat on the right of the negotiating table unoccupied. The left side is pretty crowded.
    So, my suggestion, while you are here, take some time and learn something about the science. You may not be convinced, but at least your arguments will be more on point and hopefully you will see why the science has convinced intelligent, educated folks in the political center and even on the right, not just the left.

    Comment by Ray Ladbury — 10 Sep 2008 @ 8:47 PM

  177. Thanks for the post.

    My background is astrophysics, which, like atmospheric physics, often relies heavily on numerics. I’m always amazed how physicists expect complex phenomena to be reducible to simple back-of-the-envelope calculations. I see the same thing all the time in my current job, where I work mostly with engineers who are always suspicious of an answer that can’t be spit out in a few lines of math. Sometimes this is possible; sometimes it’s not.

    While I myself am quite adept at back-of-the-envelope gymnastics, I recognize that there are some problems for which you quite simply just cannot give even the roughest quantitative estimate using such methods. I guess the expected warming from doubling CO2 is one of those.

    Comment by Peter Williams — 10 Sep 2008 @ 9:04 PM

  178. Dan Hughes,

    I had exactly the same question as you did a few months ago. I found it very helpful to:

    - buy a couple of textbooks on the subject
    - read them carefully
    - when reading code read the text in concert

    If you include the attached references the documentation for the stuff I’ve read is more than adequate.

    Comment by Patrick Caldon — 10 Sep 2008 @ 9:33 PM

  179. RE Ray Ladbury #176:

    One can equally look at this as a matter of skill of the model in predicting the right aerosol forcing.

    No, one cannot. The models use a wide range of differing values for the aerosol forcing. They can’t all be correct.

    Because folks on the right have either rejected good science or refused to stand up and call for action upon that science,

    A lot of it is not “good” science. Calls for action are more often than not based upon the more extreme and unjustifiable projections of climate sensitivity such as in the Roe and Baker paper.

    Al Gore has been left alone on that bully pulpit. The result: An Oscar, a Nobel Peace Prize and about 8% odds according to Vegas bookies that he’ll one day be president.

    I’ll take those odds. He’ll never be president.

    The reasons for Al Gore’s Oscar and Nobel Peace Prize are a matter of opinion. I’ve given mine. You clearly differ. It’s not something that particularly concerns me.

    So, my suggestion, while you are here, take some time and learn something about the science. You may not be convinced, but at least your arguments will be more on point

    Puhlease. I understand the science just fine. I have read a ton of the original papers. [edit]

    [Response: Dealing with uncertainties in the forcings is necessary - otherwise you can't deal with the true spread of uncertainties. However, the range of 20th C trends ranges from 0.3 to 1.1 deg C in the different AR4 models, so the idea that they were all tuned to get exactly the right answer is just wrong. For those that had trends about the right level, the range of aerosol forcings, sensitivity and internal variability provide a consistent simulation of what might have happened. - gavin]

    Comment by mugwump — 10 Sep 2008 @ 10:31 PM

  180. Al Tekhasski #167:

    In fact, the emission boundary in 700 cm-1 band extends above tropopause, into stratosphere, where higher = warmer = more OLR! As result it is supposed to cause stratospheric cooling, a well admitted effect. Shouldn’t the cooling over a 10x-20x wider spectral window at least counter-balance the effect of mid-troposphere sideband forcing? That would easily explain the lack of observed “warming”.

    You can look this up, as Hank would say. Go to David Archer’s on-line model:

    http://forecast.uchicago.edu/Projects/modtran.html

    Remove, for simplicity, everything but CO2. See the little “counter-peak” in the middle of the 600-800 cm^-1 band. That’s stratospheric cooling: CO2 high up, radiatively cooling the surrounding air.

    Double CO2. See the “flanks” of the band move outward. That’s the tropospheric greenhouse effect. See also how the “counter-peak” strengthens. But it’s still a lot smaller than the movement of the flanks.
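
    If you want rough numbers rather than the applet, a Planck-function sketch with representative round-number temperatures (nothing here is tuned to a real sounding):

    import numpy as np

    # Why the "counter-peak" appears: the band centre is so opaque that it emits
    # from the stratosphere, where temperature rises with height, so its emission
    # exceeds that of the colder tropopause level setting the band "floor".
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23

    def planck_wavenumber(nu_cm, T):
        """Planck radiance in W m-2 sr-1 per cm-1 at wavenumber nu_cm (cm-1)."""
        nu = nu_cm * 100.0                                  # cm-1 -> m-1
        return 100.0 * 2 * h * c**2 * nu**3 / np.expm1(h * c * nu / (k * T))

    for label, T in [("surface, ~288 K", 288.0),
                     ("tropopause, ~220 K", 220.0),
                     ("mid-stratosphere, ~250 K", 250.0)]:
        print(f"{label}: B(667 cm-1) ~ {1e3 * planck_wavenumber(667.0, T):.0f} mW m-2 sr-1 / cm-1")

    The flanks emit from near the ~220 K tropopause; the saturated centre emits from the warmer stratosphere above it and so pokes back up – that is the counter-peak.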

    But you’re definitely on the right track.

    Of course, there are thousands of climate reseachers around the world, they should not make this kind of omission, right?

    Right…

    Comment by Martin Vermeer — 11 Sep 2008 @ 3:30 AM

  181. Al #167:
    “But does it really “saturated” and therefore does not contribute anymore?”

    But the saturation layer is thicker.

    Like putting another blanket on your bed: the lower blanket blocks the same way as it did on its own. So why are you warmer?

    Your 2 degrees per doubling is a good *approximation* and you DID say “order of magnitude”, not “accurate”, so why do you say it can’t be 4? That’s within your “order of magnitude”. If you wanted “within +/- 10%” then your process was inaccurate. And one reason is that the thicker layer blocks more, so ignoring it completely is incorrect and inaccurate.

    So which were you looking for? Order of Magnitude or Accurate?

    Comment by Mark — 11 Sep 2008 @ 3:39 AM

  182. mugwump, #160:

    “You can write it as a limit if you want. You still get no upper bound on the expected climate sensitivity from Roe and Baker’s model.”

    Only under Richard’s assumption of how to model it.

    And you can’t have infinities; they cause a lot of problems. But the demand from Richard is that the weighting results in an integral of 1/(1-f) when f goes to 1. Why?

    “Roe and Baker (RB) model the “total feedback factor” f as a gaussian. Climate sensitivity is proportional to 1 / (1-f). Obviously, we should bound f to be between 0 and 1, which a gaussian doesn’t do, but we can just truncate and rescale to get a valid probability density function for f, which will still have the functional form of a gaussian for f between 0 and 1.”

    So RB has a factor F as a gaussian, but bounding f to be 0-1 is not gaussian. A demand brought in for no apparent reason. Why can it not be between 0.1 and 0.4? No infinities there. After all, it isn’t a Gaussian for your purposes either.

    The proportionality only works if you normalise it, but that requires you to divide the values by the area under the curve. Your and Richard’s requirement gives a divisor of infinity. So it only has value at f=1. Everywhere else it is zero.

    Given that there are so many issues with what you brought forward as “truth”, where are the resolutions to this?

    Heck, ignore all that and what does this problem (if it did exist as you portray it) have to do with climatology and AGW? Nowt. The models don’t use f=1. So it breaking down at f=1 is irrelevant.

    Harmonic oscillation treats a pendulum as being forced back by a gravitational restoring force proportional to the angular displacement from vertical.

    That is the assumption, and all the maths uses this approximation. However, the force is really proportional to the sine of the angle, which expands as an infinite series, and this makes the linear form inaccurate. But the angle is taken to be so near 0 that the linear approximation is good, and the equations simply use “theta” in place of its sine. If you take theta to a 50-degree angle, the motion is no longer harmonic. Does this mean we cannot use harmonic motion to describe a pendulum’s movement? No. We still have pendulum clocks; they just drift slightly.

    Same here.
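
    To put a number on the pendulum analogy (a trivial sketch, nothing climate-specific):

    import math

    # Error of the small-angle (harmonic) approximation sin(theta) ~ theta.
    for deg in (5, 20, 50):
        theta = math.radians(deg)
        err = (theta - math.sin(theta)) / math.sin(theta)
        print(f"{deg:2d} degrees: theta overestimates sin(theta) by {100 * err:.1f}%")

    Fine at 5 degrees, tolerable at 20, poor at 50 – the same sense in which a formula can be perfectly serviceable inside its intended range and misleading outside it.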

    Within a range of accepted sensitivities, it will approximate to 1/(1-f). Outside this range, the ratio becomes less and less accurate, because 1/(1-f) is an approximation.

    Yet you demand we take this equation to areas it doesn’t hold and surprise surprise, we get an answer that is incorrect and invalid.

    Whoop de do.

    I guess approximating a normal distribution to a selection of 10,000 respondents must be dropped because it breaks down when you have 3 respondents.

    Comment by Mark — 11 Sep 2008 @ 3:58 AM

  183. rutger posts:

    we actually need this warming.. nobody likes glacials

    If you actually do the matrix math that governs Milankovic cycles, you find that the next significant stades are at 20,000 and 50,000 years from now. Ice ages can wait; right now we have to deal with global warming.

    Comment by Barton Paul Levenson — 11 Sep 2008 @ 5:25 AM

  184. greg posts:

    Here’s a little tip for those of you lounging in comfort in the ivory towers of academia – abundant fossil fuels underpin our entire civilisation and your entire way of life.

    No kidding, really? I’m glad you told me that. I didn’t know.

    You eat well, live well and have the life expectancy and health that you do entirely because of free and available energy.

    Another fact I just wasn’t aware of!

    The burden is on you to provide the proof.

    Proof is for formal logic or mathematics, not science. Science can only disprove things.

    And if I hear one more fool hand-wave the issue away with some comment about ’solar and wind’, I will lose it.

    Gosh, we wouldn’t want that to happen.

    To answer Ray Ladbury’s “Doctor, Doctor, it hurts when I raise CO2 levels.”, I say it doesn’t hurt one whit. He can go back to the 1700s and then we’ll see how much it ‘hurts’. Unless he defines ‘hurt’ as ‘comfort, wealth and prosperity’.

    Am I worried about global warming? A little, if at all. One hundred and fifty years ago, no cars, no planes, and oil was a black ooze that came out of the ground, not an energy source. Sixty years ago, uranium was a funny rock that glowed in the dark. A hundred years from now, who knows? And who cares? Do you think that if some genius modelled the greenhouse effect in 1750, then advised everyone not to bother with the industrial revolution in order to ’save their children’, we’d be better off now?

    Interesting question. If we had built our present civilization on solar and wind and biomass energy, history might well have been a little easier on great numbers of people. For example, if Londoners weren’t burning coal to heat their homes in the period 1750-1875, a lot fewer little kids would have been enslaved as chimney sweeps, to work in soot for no pay, get scrotal cancer, and be kicked out when they got too big to crawl through the chimneys. For that matter, we wouldn’t have had the mass death of the 1952 London inversion, or the deaths in Donora, PA in 1948. Interesting speculation indeed.

    Personally, I advise you all to grow a pair,

    I think some of the posters here may be girls, greg. Better have something available to deal with the cooties!

    and have some faith in the human race.

    Personally, I have faith in God. It’s hard to work up much enthusiasm for faith in the species that brought us Auschwitz, Kolyma, “rape camps,” slavery, child molesting and Darfur.

    Full steam ahead.

    And over anyone who gets in our way, eh?

    Let us grow wealthier, stronger, and smarter, and *if* this turns out to be a problem we’ll solve it.

    It has already turned out to be a problem. Ask the Australians. And we know how to solve it — switch to renewable sources of energy.

    Comment by Barton Paul Levenson — 11 Sep 2008 @ 5:39 AM

  185. Ray Ladbury posts:

    if Al Gore had been standing alongside John McCain or Jim Baker or even Pat Robertson, do you think he would have an Oscar and a Nobel Peace Prize now?

    Point of information — Pat Robertson accepts global warming and urged his congregation to act on it. He has even appeared in an ad on the subject with Al Sharpton, of all people.

    Comment by Barton Paul Levenson — 11 Sep 2008 @ 6:00 AM

  186. # 175, Ike,

    Nice quote. What about this one?

    “Manabe is a man who feels pretty sure of what he knows but who also keenly feels the limits to knowledge. Both aspects of his intellectual character contribute to another irony – his skepticism about just how serious a problem global warming really is. Though Manabe was the first to treat the greenhouse effect just right, and though he is credited by lifelong colleagues with always having kept his eye sharply focused on the key roles of the greenhouse gases in climate, he does not exactly share their acute concern about where rising carbon dioxide levels are taking us.”

    From William Sweet, ‘Kicking the Carbon Habit’, Columbia University Press, 2006, p. 108 (emphasis added).

    Comment by Dave Andrews — 11 Sep 2008 @ 6:10 AM

    Off topic, but I thought you would like to hear what Jack Koenig, of the anti-AGW group Mysterious Climate Project, has to say about evolution by natural selection. I don’t know how you can hold a scientific debate if you reject one of the firmest theories we have!

    “Could you please list some of the evidence? As far as I knew, evolution is still just a theory, a theory which may have substance on one hand, and not on another. You also have to remember the evolution theory was hammered into everyones’ skulls by the Lamestream Media (ABCNNBCBS) in much the same manner as they are hammering AGW into everyones’ skulls.”

    The whole of this link from ‘Watts up with that’ is here http://wattsupwiththat.wordpress.com/2008/09/08/uah-global-temperature-dips-in-august/#comment-38350
    Enjoy!

    Comment by Mary Hinge — 11 Sep 2008 @ 6:46 AM

  188. RE Mark #182:

    So RB has a factor F as a gaussian, but bounding f to be 0-1 is not gaussian. A demand brought in for no apparent reason.

    Mark, the climate sensitivity is proportional to 1 / (1-f). Negative f would imply negative feedback, which is pretty unlikely. [By negative feedback I mean that the temperature rise from increasing CO2 is less than what you'd get if there was no feedback]. Hence the lower bound on f of 0. [NB: you don't actually need this lower bound for the calculations I gave because the tail of Roe and Baker's distribution for f less than zero is negligible]

    For the upper bound, climate sensitivity increases to infinity as f approaches 1. If you allow f to go above 1, the sign flips and you get a sensitivity of -infinity ranging up to zero as f increases from 1 to infinity. Negative values of sensitivity are even more unphysical than negative feedback, since they imply temperature actually decreases as CO2 increases. Hence the upper bound of 1.
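
    For concreteness, quick numbers (S0 below is estimated crudely from Stefan-Boltzmann; more careful Planck-response calculations give roughly 1.1-1.2 K rather than the ~1 K this crude version returns):

    # S = S0 / (1 - f) for a few values of f, with S0 from F_2x / (4*sigma*T^3).
    sigma = 5.67e-8                        # W m-2 K-4
    S0 = 3.7 / (4 * sigma * 255.0**3)      # ~1 K, crude no-feedback warming for 2xCO2
    print(f"no-feedback S0 ~ {S0:.2f} K")

    for f in (0.0, 0.5, 0.9, 0.99, 1.05):
        print(f"f = {f:5.2f}: S = {S0 / (1.0 - f):8.1f} K")

    The last line is the sign flip above f = 1 that makes that region unphysical, and the steepness just below f = 1 is where all the weight in the long tail comes from.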

    Given that there are so many issues with what you brought forward as “truth”, where are the resolutions to this?

    Mark, the issues are in your head. Read the paper, and ask yourself honestly if you really understand it. [edit]

    Comment by mugwump — 11 Sep 2008 @ 8:20 AM

  189. Barton, Yeah, I know about Robertson and the video with Sharpton. However, Gore is still the “lone crusader,” and this has helped Al even as it’s hurt climate science (Al is a whipping boy for the political right–they won’t even agree the Sun rises in the East if he asserts it.). Had anyone on the right been up there with Al, it would have de-emphasized him and made the reality of climate change more acceptable to the right. As it is, they will oppose it to their dying day.

    Comment by Ray Ladbury — 11 Sep 2008 @ 8:23 AM

  190. The comment about Angstrom’s saturation experiment (70) came from Raypierre (Uncertainty, noise and the art of model-data comparison), not you, Gavin. Sorry.

    “On the other hand, the carbonate chemistry used by R&S is standard high-school stuff, and there’s a good chance it would have been discovered much earlier if people had paid more attention to Arrhenius, and if Ångström hadn’t set back the field by a highly publicized but botched experiment. –raypierre”

    However, to reject Angstrom’s results out of hand you should repeat the experiment. Here are the results from a crude high-school experiment reported on the internet. Air, and 100% CO2 at 22 degrees, were subjected to identical infra-red irradiation.
    After 5 minutes, the CO2 warmed 15 degrees C and the air 10 degrees C.
    But in successive 5-minute intervals the temperature increases were:
    Interval         Air          100% CO2
    2nd 5 minutes    3 degrees    3 degrees
    3rd 5 minutes    5 degrees    5 degrees
    4th 5 minutes    4 degrees    5 degrees
    The experimentalist thought that this demonstrated that CO2 acted as a greenhouse gas. I think it demonstrated that Angstrom’s saturation effect was correct.
    Has this experiment ever been done properly, and for more than 20 minutes?

    [Response: I told you before - the HITRAN database is based on much more sophisticated measurements of these effects for individual lines of the spectrum and at multitudes of different temperatures, pressures and mixtures of gases. Experiments in high school are all very interesting, but hardly what NIST and others spend their time doing. - gavin]

    As for your second comment, the second law of thermodynamics is a statement about the difference between energy and heat. Energy can transfer from a colder body – heat can’t. If that were made clear in all comments (the colder atmosphere cannot warm the earth directly), simplistic explanations of AGW would not be repeated.

    [Response: This is nonsense. Downwelling LW radiation from the atmosphere can be measured even by high schoolers. But apparently this doesn't count in the energy budget of the surface? Energy=heat in this context. - gavin]

    You reject Essenhigh’s comments because he neglects the effect of absorption in the upper atmosphere. This is the point of Barton’s piece (62) explaining surface warming via relative absorption in the upper atmosphere. If you express the surface temperature increase (via Stefan-Boltzmann) as a function of the upper atmosphere absorption relative to the lower atmosphere, the effect is significant only if the upper layer absorption is more than 10% of the lower.
    But the H2O absorption is absent (at least a factor of 10) and the pressure is lower by a further order of magnitude. At a relative absorption of .01, the effect of doubling the CO2 in the upper atmosphere is negligible.

    [Response: No, it just isn't. (And for these purposes, the upper atmosphere is 5 to 6 km above the surface). We are not talking about the stratosphere. - gavin]

    As for the CO2/H2O spectral overlap, Chuck Wiese (on Dr Pielke’s web site) used HITRAN to calculate the impact of doubling CO2. His results gave an increased flux of about 2 watts per square metre, half of which radiates down. The temperature increase to compensate (via Stefan-Boltzmann) is 0.16 degrees C.

    [Response: I have no idea what he's done, but it's wrong. The forcing at the tropopause (after stratospheric adjustment) from 2xCO2 is around 4 W/m2 using HITRAN (Myhre et al, 1998, Collins et al 2006 etc.). When Wiese gets his result past peer review, let me know. - gavin]

    You do not comment on the UAH lower and mid troposphere temperature trends, which, since 1978, are obviously not compatible with the “higher is colder” AGW theory.

    [Response: Huh? higher is colder is from the adiabat which has nothing to do with trends since 1978. - gavin]

    As for the Little Ice Age, it was not visible in the original hockey-stick (George Monbiot abolished both it and the Medieval Warm Period in a contemporary Guardian article). Now it is visible, and it gets a mention in the latest post – “Progress in reconstructing climate in recent millennia”.

    [Response: Huh? (again!). What on earth are you talking about? The cooler period in the 17th-18th century (and also in the 15th) is visible in all reconstructions. I've even written papers about it. - gavin]

    The non-scientific community is heavily influenced by the certainty expressed on this web-site. Is it justified?

    [Response: The only certainty is that the number of people who keep posting completely rubbish half-digested contrarian nonsense seems to be a constant. - gavin]

    Comment by Fred Staples — 11 Sep 2008 @ 8:42 AM

  191. RE gavin #179:

    However, the range of 20th C trends ranges from 0.3 to 1.1 deg C in the different AR4 models, so the idea that they were all tuned to get exactly the right answer is just wrong.

    I never said they were all tuned to get exactly the right answer. In fact, I was quite careful in #159 to point out that the process does not even have to involve explicit tuning:

    Note also that “adjusting the parameters to fit the 20th century record” does not have to involve heavy duty parameter tweaking and re-estimation. It can be as benign as picking one parameter value over another because it gives a roughly more plausible shape to the model outputs.

    If, as you claim, modelers choose aerosol forcings without a view to the impact on 20th century fidelity, then you’d expect no correlation between the choice of aerosol forcing and model climate sensitivity. OTOH, you’d expect an inverse relationship if there is any feedback from 20th century fidelity to aerosol choice in the modeling process. Kiehl shows there is a strong inverse relationship.

    Comment by mugwump — 11 Sep 2008 @ 8:48 AM

  192. Mugwump, re our conversation:

    RL: One can equally look at this as a matter of skill of the model in predicting the right aerosol forcing.

    M: No, one cannot. The models use a wide range of differing values for the aerosol forcing. They can’t all be correct.

    Actually, one can. If the model has to assume an improbable value for aerosol forcing to achieve agreement, it has less skill.

    M: Puhlease. I understand the science just fine. I have read a ton of the original papers. [edit]

    Uh, given that you don’t seem to understand the difference between statistical modeling and dynamical modeling, I would beg to differ. Look, this is not simple science. I have a PhD in physics and had to devote a year of my spare time to understand the basics of the subject. Have you, for instance, perused Raypierre’s book?

    Comment by Ray Ladbury — 11 Sep 2008 @ 9:02 AM

  193. Uh, given that you don’t seem to understand the difference between statistical modeling and dynamical modeling, I would beg to differ.

    Come on Ray. That’s a false dichotomy. GCMs are a combination of both, whichever way you cut it.

    Look, this is not simple science. I have a PhD in physics and had to devote a year of my spare time to understand the basics of the subject.

    There’s no way of saying this without sounding obnoxious, but since you showed me yours, I’ll show you mine. I have a PhD in mathematical statistics and computer science. I did two years of a PhD in theoretical physics before I got interested in my present field and switched topics. I was the top undergraduate student in theoretical physics and pure mathematics at my university, and did a third undergraduate major in computer science. I worked as an academic for 5 years before moving into industry, and published about 30 peer-reviewed papers with an H-index of 18.

    This is actually pretty simple science compared to my first loves: quantum field theory and general relativity. That’s not to say modeling the climate is easy. Or answering questions like “what is climate sensitivity” is easy. But the basic tools are familiar to anyone with a strong background in physics, statistics, mathematics and computer science.

    Comment by mugwump — 11 Sep 2008 @ 9:49 AM

  194. [Response: The only certainty is that the number of people who keep posting completely rubbish half-digested contrarian nonsense seems to be a constant. - gavin]

    Gavin – you have the patience of a saint, but most people would have run out around 50 posts ago. You must be pulling your hair out (that’s if you have any left, of course ;) )

    Comment by pete best — 11 Sep 2008 @ 10:09 AM

  195. mugwump writes:

    I don’t believe in this idea of “consensus science”.

    Then you don’t believe in science. Modern science runs on peer review and the scientific consensus. And it is so enormously productive we’d be crazy to change it.

    Comment by Barton Paul Levenson — 11 Sep 2008 @ 10:16 AM

  196. Do the models have any prediction for NH?
    We had torrential, lengthy rainstorms all summer. Driving on a freeway all I could see was a flickering tail light in front of me. It was not possible to change lanes. This is not traditional NH rain, and was not necessarily associated with a thunderstorm. The amount of rainfall (and winter snowfall) is up about 50% this year.

    I know that global warming predicts in general more intense, extreme rainfall. Has anyone looked at satellite data to see that this is happening? Do the models predict NH might get this? What can you say about what’s happening here, if anything?
    Thanks.

    Comment by veritas36 — 11 Sep 2008 @ 10:26 AM

    Mark, a clarification: one does not linearly continue to get warmer as more and more blankets are added to your bed. At some point adding another blanket will not make any discernible difference.

    [Response: But you only need look to Venus to know that this is a long way off. - gavin]

    Comment by Rod B — 11 Sep 2008 @ 10:44 AM

  198. Mugwump, Great, that answers part of my question: you are educated and intelligent. Now to the other part: Have you read Raypierre’s text or some equivalent?
    Would you expect to understand QED or GR with only a brief effort? Would you expect to catch Steve Weinberg in an error, for instance? Do you think statistical mechanics (which is closer to what we have in climate science) is any easier than these subjects?

    Barton brings up a good point: Perhaps you would tell us what you mean by “consensus science”.

    Comment by Ray Ladbury — 11 Sep 2008 @ 11:00 AM

  199. Mugwump:

    How many angels can you fit on this pin? For one whose skepticism is based largely on the behavior of others (see your comment 54) you’ve become rather vehement yourself.

    Given the complexities (the subject of the original post), do you not find the temperature trends over the last two decades not only to be persuasive indicators of the models’ validity, but of a certain utility as well? Or am I missing your point?

    Comment by walter pearce — 11 Sep 2008 @ 11:16 AM

  200. This is kinda related to part of Fred’s and Gavin’s, et al discussion, but something that quietly disturbs me (from a knowledge perspective). Is all of the downwelling IR radiation emitted from the internal energy stores of greenhouse gases? The vast majority? (That standard IPCC Radiation Budget image implies that it does.) If this is the case would this not be an example of energy transfer but not heat (per se) transfer, even though the absorbed downwelling radiation energy turns into heat?

    [Response: Downwelling LW (also called IR or thermal radiation!) comes from clouds, water vapour, trace GHGs and aerosols. Whatever you call it, it still occurs and does not violate any of thermodynamics. - gavin]

    On the other hand, if the downwelling radiation comes from non-GHGs doesn’t this emission decrease heat (temperature) of the emitting gas? Wouldn’t this mean that heat (along with energy…) is being transferred from a low temperature to a higher temperature in violation of Thermodynamics? Is the answer here (if there need be an answer) that one can (theoretically) find a certain portion of an atmosphere layer (a “slice of the Boltzmann distribution” if you will) with a higher temperature (heat content) than the layer as a whole?

    [The basic question I have is where does all of that downwelling IR radiation (~85% of the surface IR emission) come from and how does it manage to get back to the surface.]

    Comment by Rod B — 11 Sep 2008 @ 11:38 AM

  201. veritas36, I pasted a chunk of your question in here; click the link:
    http://scholar.google.com/scholar?q=more+intense%2C+extreme+rainfall.+Has+anyone+looked+at+satellite+data

    Comment by Hank Roberts — 11 Sep 2008 @ 11:54 AM

    Gavin, if one added one more blanket to the 300-400,000 (relative Venus to Earth CO2??) already on, how much warmer would one get? Are you already more than 98.6? How many blankets did that take? If you double CO2 on Venus, what would the temperature be? (Serious curiosity question.) As an aside, how did Venus, at 0.85 the mass of Earth, get about 100bars of pure CO2 anyway?

    Comment by Rod B — 11 Sep 2008 @ 12:08 PM

    Gavin; Venus? If I remember correctly, the entire atmosphere of Venus in the lower 2.5 kilometers is all supercritical fluids (that get very little insolation anyway), and various other gases are in that state at various altitudes up to 45 kilometers, mixed with the rest in some manner. And there are sulphur clouds from something like 60-90 kilometers that block 60% of the sun, and the planet has no magnetic field and no tectonic plates. Not so much like putting on 20 blankets and having the 21st give no additional help at reducing convection.

    BP Levenson: “If we had built our present civilization on solar and wind and biomass energy, history might well have been a little easier on great numbers of people.”

    If pigs had wings, they wouldn’t bump their butt on the ground when they skip down the street.

    Perhaps you need to re-read the history of the world since the fall of the Roman Empire from around 200 to 500 AD. Pay particular attention to 476-1000 and 1000-1500. Follow that up with short review of the technology available around 1700 and steam engines, 1800-1900 with railroads, the 1930s with diesel-electric locomotives. Then take a look at China in 300 AD, the Middle East in 800, Western Europe in 1100, 1846 in Nova Scotia, and various other developments centering around petroleum, coal and the internal combustion engine in the latter part of the 1800s and early 1900s. Don’t forget to include a deep look into the materials and technology available in 1806 on, after de Rivaz designed and implemented the first internal combustion engine. Which interestingly enough ran on hydrogen and oxygen.

    Here’s a little jumpstart:

    http://inventors.about.com/library/weekly/aacarsgasa.htm

    Comment by Thomas Hunter — 11 Sep 2008 @ 12:15 PM

  204. Rod B: “As an aside, how did Venus, at 0.85 the mass of Earth, get about 100bars of pure CO2 anyway?”

    I am not sure if the lack of its own internal magnetic field allowing the solar wind to blow away lighter elements like hydrogen, the basic absence of water vapor, and the basic lack of non-IR reactive substances in the atmosphere has much to do with it. But as I mentioned, the inability of Venus to lose heat due to sulphur clouds high up in the extensive troposphere, no tectonic plate activity and plenty of volcanos et al, should account for all that heat and pressure. The planet doesn’t reach semi-Earthlike conditions until about 50 kilometers. There is also the lack of axial tilt, and 240+ days each side faces the sun. And no oceans or moon.

    Quite a curious sister, no?

    [Response: It is, but not for the reasons you state. It is not hot because of volcanism, but because of CO2, as Sagan suggested 30 years ago and as was borne out by the Pioneer and later probes. - gavin]

    Comment by Thomas Hunter — 11 Sep 2008 @ 12:23 PM

  205. This may be a naive question but I seem to be getting mixed messages on this from all over the place (including here).
    Are the models actually published? Can one inspect the actual equations? Are the methods by which each equation is derived documented? Are the methods used to determine each parameter documented? Are these all public? Or are the models just black boxes with the inner workings only known to their developers?

    [Response: Some models are completely open (NCAR CCSM, GISS ModelE, EdGCM etc.), others are not for a variety of reasons. All of the results are available either through PCMDI, or through secondary gateways like Climate Explorer. Documentation for each model varies in quality (NCAR's is probably the best). - gavin]

    Comment by Chris Maddigan — 11 Sep 2008 @ 12:24 PM

  206. Please can we have mugwump and Ray Ladbury debating on television.
    Now!

    Comment by japes — 11 Sep 2008 @ 12:47 PM

  207. Are people expressing doubts about the possibility of properly modeling climate trends concerned about the models or about the computer programs that evaluate them?

    If it is the mathematical models themselves, then I would expect that their concerns about misspecification are misplaced. These sorts of physical models are likely to be fairly robust. Slight misspecification should not lead to huge errors in a system so constrained by physical laws and where effects are the sum of many simultaneous effects. We can do simple ballpark estimates of the effect on global average temperatures of the greenhouse gases by themselves. These simple estimates are not that far from what comes out of the more elaborate GCMs. What probably increases CO2 sensitivity by approximately half an order of magnitude is the cloud cover effects. We need the GCMs for these, and even then our estimates are not precise. But physics does say there has to be a temperature increase, and unless we have a so-far-undemonstrated negative feedback, this temperature increase will be enough for us to be worried.
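
    The kind of ballpark estimate I mean, for what it’s worth (textbook zero-dimensional energy balance with round numbers for the solar constant and albedo):

    # Effective emission temperature versus the observed ~288 K surface.
    sigma = 5.67e-8          # W m-2 K-4
    solar = 1361.0           # W m-2, solar constant
    albedo = 0.30

    T_eff = (solar * (1 - albedo) / (4 * sigma)) ** 0.25
    print(f"effective emission temperature ~ {T_eff:.0f} K")   # ~255 K
    print(f"greenhouse effect ~ {288 - T_eff:.0f} K")          # ~33 K

    That 33 K is the greenhouse effect we already sit in; the feedbacks on top of the CO2 increase (clouds especially) are what the GCMs are needed for.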

    Generally we know that a scientific model is on the right path when it is useful in explaining phenomena other than what it was created to explain, when it becomes part of a coherent overall story. We trust it more when it becomes useful as a framework suggesting further avenues of investigation. Current climate models are useful in helping to understand the paleoclimatic record. A high CO2 sensitivity explains so much that has happened in the past.

    Greenhouse skeptics seem mostly to focus on so-far-unexplained details, prematurely claiming that they cannot be explained by an extension of current models. Their arguments remind me of similar ones from creationists: because we don’t know everything, we must know nothing. Scientific hypotheses that explain most of what is going on better than the alternatives get tinkered with to explain the discordant data rather than simply thrown out. A useful theory will usually be supported by multiple lines of evidence, and is confirmed by having the alternatives disproved rather than being directly proved itself. Theories like evolution are confirmed by the convergence of evidence. I think this applies to high greenhouse gas sensitivities too.

    Now if the concern is about the computer programs, is it about the low-level mathematical routines or about the way these routines are put together into a program? I wouldn’t worry about the low-level mathematical routines. These will be things like the routines for solving partial differential equations. These routines can be analyzed and tested thoroughly. Mistakes are not likely here, and if they were made, then because of the ubiquity of these routines the results would usually be spectacular and unmistakable.

    Now, will mistakes be made in assembling these low-level modules into a program, so that the program does not actually calculate the outcome of the model it is intended to? Possibly, but if that occurred we would expect different errors in each model implementation. We would not expect different people programming different models to come up with results of the same order of magnitude. You would not expect a coherent picture to emerge from these simulations unless you were on the right track. All these models are simplifications but are adequate approximations for our purposes. There are many adequate ways to model the same phenomena that will give similar results.

    Also, these are programs that give results for all the intermediate stages when we are simulating what will happen over a long time period. If they are not actually modeling what they are meant to be modeling this will usually become apparent early in the simulation. The programs that break down drastically are usually the ones that produce a few pieces of output after a long time and do not give any output before that. The problems can bubble along out of sight in such programs.

    Comment by Lloyd Flack — 11 Sep 2008 @ 12:57 PM

  208. Re Thomas Hunter @203: “Perhaps you need to re-read the history of the world since the fall of the Roman Empire…”

    Perhaps not. If it wasn’t clear to you, Barton’s quoted comment [from 184] was only a rhetorical device. Do you need a jumpstart on rhetoric?

    Captcha: goes gasoline

    Comment by Jim Eager — 11 Sep 2008 @ 12:57 PM

  209. > not hot because of volcanism, but because of CO2

    And the CO2 did not get tied up in carbonate minerals instead because hydrogen (and water) were lost early on?

    http://dx.doi.org/10.1006/icar.1997.5677

    Comment by Hank Roberts — 11 Sep 2008 @ 12:58 PM

  210. > emitted from the internal energy stores of greenhouse gases

    Rod, no. We’ve gone ’round this many, many times.

    Greenhouse gases can absorb and emit infrared photons.
    All gases can transfer heat by collisions.
    The average temperature is evened out by collisions between GHGs and non-GHGs very, very quickly.

    Heat is added to or removed from the average amount in the surrounding air only by infrared photons, and only those molecules that can interact with the infrared photons catch and emit them.

    Think of the heat as a baseball. It’s tossed onto the field at the beginning of play, it gets moved around among all the players on the field, but only the batter can knock it out of the park, removing it from play.

    Er, except, for the analogy in which the batter is the GHG, postulate that only the batter can also catch the next baseball when it’s tossed onto the field.

    This is why they use calculus instead of poetry in the major leagues of climatology.
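
    Or, halfway between the poetry and the calculus, the textbook single-slab version (one layer that absorbs all surface longwave and re-emits equally up and down; purely illustrative, not how a GCM does it):

    sigma = 5.67e-8          # W m-2 K-4
    F_abs = 238.0            # W m-2, absorbed solar averaged over the globe

    T_layer = (F_abs / sigma) ** 0.25        # the layer must emit F_abs to space
    back_radiation = sigma * T_layer**4      # ...and the same amount downward
    T_surface = ((F_abs + back_radiation) / sigma) ** 0.25

    print(f"layer temperature   ~ {T_layer:.0f} K")            # ~255 K
    print(f"downwelling LW      ~ {back_radiation:.0f} W m-2")
    print(f"surface temperature ~ {T_surface:.0f} K")          # ~303 K

    It overshoots the real ~288 K because the atmosphere is not one opaque slab, but it shows where the big downwelling term in the budget diagrams comes from: the layer radiates both up and down, and nothing in thermodynamics is violated, since the net heat flow is still from the warmer surface to the colder layer.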

    Comment by Hank Roberts — 11 Sep 2008 @ 1:04 PM

  211. Ray Ladbury #198:

    Now to the other part: Have you read Raypierre’s text or some equivalent?

    It’s been a long time since I read an entire textbook. They tend to be aimed at undergraduates or graduate students and are rather slow-going. That said, since Raypierre’s text is online I will give it a whirl. I usually read the original papers and if there’s some background I don’t have I google it. If that doesn’t yield an answer I go over to google books and query the textbooks there.
    I have probably read around 70 climate science papers this way.

    Would you expect to understand QED or GR with only a brief effort?

    Of course not. It took quite a concerted effort as an undergraduate for me to understand them. But there’s a hierarchy in the hard sciences. Once you can do QFT, GR, and harder areas of Mathematics, the rest is pretty easy.

    Do you think statistical mechanics (which is closer to what we have in climate science) is any easier than these subjects?

    In terms of the abstractions involved, yes. But I don’t think statistical mechanics is close to climate science. Thermodynamics and fluid mechanics are.

    Perhaps you would tell us what you mean by “consensus science”.

    I mean that creature that is invoked to justify all kinds of crazy alarmist nonsense.

    [Response: Then discuss what you term the 'alarmist' nonsense, not the complete red herring of consensus. By avoiding generalities and focusing on specifics, you'll find a great deal more willingness to engage. -gavin]

    Comment by mugwump — 11 Sep 2008 @ 1:14 PM

  212. RE gavin #211:

    By avoiding generalities and focusing on specifics, you’ll find a great deal more willingness to engage. -gavin

    What, like you? Eg, #190: “[Response: The only certainty is that the number of people who keep posting completely rubbish half-digested contrarian nonsense seems to be a constant. - gavin]“

    Besides, I do focus on specifics. Apart from my original generic remarks at #54, the rest of this conversation – aside from the idle banter with Ray concerning his unrequited love affair with Al Gore – has mostly been about the specific problems with Roe and Baker.

    [And I did try to bury the "GCMs are not statistical models" chestnut, since I think it is just a matter of terminology, but people (including you) kept bringing it up]

    [Response: I know. My comment was a reminder. However, while Fred Staples is just regurgitating nonsense, your comments are of a much more reflective nature. If you avoid the bad habits of the trolls, you'll get a better response and less of a knee-jerk response. That's all I was trying to say. - gavin]

    Comment by mugwump — 11 Sep 2008 @ 1:52 PM

  213. Mugwump,
    I have a pretty good book on foundations of stat mech that you might enjoy–the issues are subtle indeed. Try “Physics and Chance,” by Lawrence Sklar. I would dispute that being conversant with one particular field of physics provides adequate preparation for others. You still need to put in the time to understand the basics of a field.

    I’m afraid I agree with Gavin–your definition of consensus science is both vague and pejorative. How about I try a definition: We have consensus on an issue/fact/theory when it becomes indispensable to future progress in the field–that is, when you cannot increase understanding in the field without the concept, when nobody publishes without it, you have consensus. By that definition, there is consensus that climate sensitivity is greater than 2.

    Comment by Ray Ladbury — 11 Sep 2008 @ 2:05 PM

  214. Mugwump, FWIW, I’m not a big Al Gore fan. I thought he ran a lousy campaign. He owes you much more thanks than me. Folks like you have let him have the bully pulpit all to himself.

    Comment by Ray Ladbury — 11 Sep 2008 @ 2:10 PM

  215. Ray,

    We have consensus on an issue/fact/theory … when you cannot increase understanding in the field without the concept. By that definition, there is consensus that climate sensitivity is greater than 2.

    Actually, the opposite is true: demanding that everyone agree that sensitivity is at least 2C guarantees understanding won’t increase. For example, you easily get a sensitivity less than 2C if you take the weaker end of aerosol forcings, and the stronger values for negative cloud feedbacks, all within the range of plausible values given our current best knowledge. A dictate of 2C requires the plausible ranges to be narrowed by fiat, not by science.

    [Response: No. Because there are other constraints - in particular, the LGM. Explain that away using a sensitivity of less than 2 C and then we can talk. - gavin]

    Comment by mugwump — 11 Sep 2008 @ 3:19 PM
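
    A note on the arithmetic behind the exchange above: the standard linear-feedback relation underlying both the low-sensitivity argument and the Roe and Baker “fat tail” discussion is S = S0 / (1 − f), where S0 ≈ 1.2 K per doubling is the no-feedback (Planck-only) response and f is the sum of the feedback factors. The sketch below is purely illustrative; the feedback factors are placeholders, not values from any particular study.

        # Minimal sketch: how assumed feedback factors map to equilibrium
        # sensitivity via S = S0 / (1 - f).  S0 is the no-feedback response;
        # the feedback factors below are illustrative placeholders only.

        S0 = 1.2  # K per CO2 doubling, no-feedback (Planck-only) response

        feedback_cases = {
            "moderately positive net feedbacks":  0.30 + 0.15 + 0.10,
            "weaker net (negative cloud) case":   0.30 + 0.15 - 0.15,
            "f approaching 1 (fat-tail regime)":  0.85,
        }

        for label, f in feedback_cases.items():
            S = S0 / (1.0 - f)
            print(f"{label:38s}  f = {f:+.2f}  ->  S = {S:.1f} K per doubling")

    The same relation is why the upper tail is fat: as f approaches 1, a small uncertainty in f becomes a very large uncertainty in S, which is the core of the Roe and Baker argument referred to in this thread.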

  216. To Gavin, David Benson re my posts on 9th Sept (#102 and 108) Thanks for your patient responses. I followed your leads, including reading the IPCC tech summary report, plus Spencer’s explanations, including how CO2 works etc – I am satisfied that global warming is real, that CO2 is the major cause, that the models are based on sound science. Continue your patient responses focussed on the science (there are likely to be other newbies like myself just starting to try and get to grips with the subject) and you will win those who have eyes to see and ears to hear. Good work. Question now is what action to take – and carbon credit trading is unfortunately not working the way it was intended; too open to abuse and impossible to monitor realistically. But I guess that debate is for some other site.

    Comment by Hugh Laue — 11 Sep 2008 @ 3:46 PM

  217. … the LGM. Explain that away using a sensitivity of less than 2 C and then we can talk. – gavin]

    We’ll see. I’ve not had time to look into the sensitivity estimates from the LGM yet.

    However, given the uncertainties associated with deriving sensitivity from GCMs, I have recently been looking at much simpler models, such as those used by Douglass and Knox and Schwartz. Those models point to lower estimates of climate sensitivity – eg Schwartz comes up with 1.9K +- 1K, and I think without the fat tail on the upper end (I just read it in work breaks today so I haven’t had time to verify the upper tail yet).

    [Actually, Douglass and Knox addresses volcanic sensitivity rather than CO2 sensitivity, but their approach is similar to Schwartz and is interesting for the way in which it directly tackles the problem without relying on heavily parameterized models].

    [Response: Neither of these approaches is robust and neither correctly diagnose the sensitivity of models with known sensitivity (ranging from simple energy balance models to GCMs). Both were heavily commented upon for just those reasons. It's worth reading those comments before going too deep. For the LGM discussion, start here. - gavin]

    Comment by mugwump — 11 Sep 2008 @ 4:05 PM
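
    For readers unfamiliar with the “direct” approach being discussed, the underlying model is a single-box energy balance, C dT/dt = F − λT, whose relaxation time is τ = C/λ, so that estimates of the effective heat capacity C and of τ give a sensitivity S = F_2x·τ/C. The sketch below uses numbers chosen only to reproduce the ~1.9K figure quoted above; it is not an independent derivation.

        # Minimal sketch of the single-box energy-balance reasoning behind
        # Schwartz-style "direct" sensitivity estimates.
        #   C dT/dt = F - lam*T   =>   relaxation time tau = C/lam
        #   Equilibrium sensitivity per doubling: S = F_2x/lam = F_2x*tau/C
        # tau and C are assumed values picked to give ~1.9 K, nothing more.

        F_2x = 3.7    # W/m^2, standard forcing for doubled CO2
        tau  = 8.5    # yr, assumed climate relaxation time
        C    = 17.0   # W yr m^-2 K^-1, assumed effective heat capacity

        lam = C / tau          # W m^-2 K^-1, effective feedback parameter
        S   = F_2x / lam       # K per CO2 doubling
        print(f"lambda = {lam:.2f} W/m^2/K  ->  S = {S:.1f} K per doubling")

    The objection in the response above is precisely that when this recipe is applied to synthetic output from a model whose relaxation time and sensitivity are already known, it does not recover them.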

  218. Hugh Laue (216) — What to do properly belongs on other sites. ClimateProgress is one such.

    But as general advice for everyone, plant more biomass! Trees, perennials, annuals, flowers, foods, whatever. Plant lots; we are going to need it.

    Comment by David B. Benson — 11 Sep 2008 @ 5:00 PM

  219. Mugwump, If your simplification strategy involves ignoring data (e.g. the LGM and other paleoclimate data), your chances of constructing a model with low sensitivity increase substantially. I would not expect it to resemble Earth, but you can try since you do not seem the type to learn from the experiences of others. Positing a model that is too simple (e.g. Schwartz) can work to yield non-Earth-like models as well,

    http://www.realclimate.org/index.php/archives/2007/09/climate-insensitivity/

    as can under-constrained/overfit models (e.g. M&M2007)

    http://www.realclimate.org/index.php/archives/2007/12/are-temperature-trends-affected-by-economic-activity-ii/langswitch_lang/zh

    As to the consensus, where has anyone imposed by fiat or any other method that climate scientists only consider sensitivity more than 2 degrees per doubling? Climate models have such sensitivities because they work, while those lower do not. No heavy-handed tactics. No coercion. Not even a vote. Just scientists motivated by self-interest to advance their careers and curiosity to understand their subject matter. That’s how consensus works. A marketplace of ideas.

    Comment by Ray Ladbury — 11 Sep 2008 @ 5:14 PM

  220. Both were heavily commented upon for just those reasons. It’s worth reading those comments before going too deep.

    I have read the Robock and Wigley comments on the Douglass and Knox (DK) paper, plus DK’s response. I think DK are correct: a significant contribution from ocean lag is not justified by the physics or the data. I had a discussion with James Annan on his blog about this, in the context of his 3C estimate. He deleted my final comment which I thought very much clarified the issue. Anyway, I have not seen a convincing counter to the DK paper.

    Judging from the writing, the Schwartz paper I linked to was written after at least one round of comments, but I will read those comments as well.

    [BTW, papers that deviate from the consensus position tend to be heavily commented upon. Whereas much worse papers that go along with the consensus (eg Roe and Baker) seem to get the kid gloves treatment. That's part of the selection bias I was talking about.]

    [Response: My mistake with the Schwartz paper. However, it still suffers from the same flaws as the original one - if given output from a model that is exactly the same as the model he is using, his methodology is not robust, i.e. it does not derive the coefficients (i.e. the timescale or the sensitivity) in a statistically unbiased way. The criticisms of DK by Wigley and Robock, in contrast to your claim, completely undermine their analysis. If a method doesn't even work in a simpler situation where you know the answer, why do you think it will work in the real world?

    As for why bad papers on one side of the fence get more comments than bad papers on the other, it's based on impact. Most bad papers that agree with the vast majority of the work won't have any impact and they will sink quickly into obscurity. Bad papers that contradict the majority of the work stand out much more strongly, and that can activate people to bother to write a proper reply. This is a non-trivial undertaking, that is largely a waste of time in terms of career progression or community appreciation, and yet is one of those necessary community services we are all expected to do without reward. This is not particularly mysterious. - gavin]

    Comment by mugwump — 11 Sep 2008 @ 5:16 PM

  221. The RealClimate comments section is a wonderful gathering of engaged scientists sometimes meeting motivated skeptics, and even a few professional deniers.

    I still cannot appreciate the enormity of the AGW problem; and to me the ramifications just work to stress the mind and hinder thinking.

    I suspect that those suffering in their denial are desperately seeking validation for any flaw that could possibly support and nurture their disbelief. Hoping that chaotic systems will somehow accept a human bias.

    Thank you for your civil persistence.

    Comment by Richard Pauli — 11 Sep 2008 @ 5:26 PM

  222. The criticisms of DK by Wigley and Robock, in contrast to your claim, completely undermine their analysis. If a method doesn’t even work in a simpler situation where you know the answer, why do you think it will work in the real world?

    I don’t see where they offered a “known situation” independent of more complicated modeling. They did claim a peak 2W/m2 ocean flux during Pinatubo from data (Levitus), but I don’t see where they got that number. The best I could do from the data in Levitus was to calculate the average flux over the past 50 years (or whatever the length of the data series is) and I got 0.2W/m2. DK say they get 1W/m2 – I assume they mean during Pinatubo. In reality we need the flux due to the change in temperature from the eruption, which would require more than just reading from a graph, and DK’s derivation from first principles seems sound.

    given output from a model that is exactly the same as the model he is using, his methodology is not robust. i.e. it does not derive the coefficients (i.e. the timescale or the sensitivity) in an statistically unbiased way.

    That doesn’t necessarily invalidate the method. Sometimes biased estimators are better (they can have lower mean squared error). E.g., for the mean of a multi-dimensional Gaussian in three or more dimensions, the unbiased sample mean is beaten by a biased estimator (the James–Stein estimator). The usual maximum-likelihood formula for the sample variance is biased too. Nevertheless, it’s an interesting point. And right up my alley. I will look into it. Thanks.

    Comment by mugwump — 11 Sep 2008 @ 7:04 PM
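
    The James–Stein point made above is easy to check numerically: in three or more dimensions, the biased shrinkage estimator of a Gaussian mean has lower mean squared error than the unbiased estimate (the raw observation). A small Monte Carlo sketch, with an arbitrary true mean chosen only for illustration:

        # Monte Carlo check: for d >= 3 the (biased) James-Stein estimator of a
        # Gaussian mean beats the unbiased maximum-likelihood estimate (the raw
        # observation) in mean squared error.
        import numpy as np

        rng = np.random.default_rng(0)
        d, sigma, n_trials = 10, 1.0, 20000
        theta = rng.normal(0, 1, size=d)       # arbitrary true mean (illustrative)

        mse_mle = mse_js = 0.0
        for _ in range(n_trials):
            x = theta + rng.normal(0, sigma, size=d)      # one noisy observation
            shrink = max(0.0, 1.0 - (d - 2) * sigma**2 / np.dot(x, x))
            x_js = shrink * x                             # positive-part James-Stein
            mse_mle += np.sum((x - theta) ** 2)
            mse_js  += np.sum((x_js - theta) ** 2)

        print("MSE, unbiased MLE:", mse_mle / n_trials)
        print("MSE, biased J-S:  ", mse_js / n_trials)    # smaller, despite the bias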

  223. We know that the measured temperature increase has not kept pace with the predictions expected from a climate sensitivity of 2.0C to 4.5C per doubling.

    An increase of 0.7C since CO2 started increasing points to a figure of 1.5C or so. The deep paleoclimate temperature and CO2 estimates similarly point to a sensitivity figure of 1.5C or so.

    Gavin states that there is a lag in the temperature response, that the oceans are absorbing some (half perhaps) of this temperature response.

    How can we tell that the oceans ARE absorbing some of the expected increase? Why would we not expect this to be a permanent storage mechanism? How long is the lag before the oceans permanently release this energy so that the surface and the atmosphere heat up according to the sensitivity range of 2.0C to 4.5C per doubling?

    [Response: The warming in the oceans was demonstrated most recently by Domingues et al., 2008 (discussed here). The oceans aren't going to release any of that energy. Instead, they just need time to warm up. Once they are warm the atmosphere will warm until the radiation is in balance again (of course, all these things are happening together). But the heat going into the ocean (really the deep ocean) just keeps the surface cooler than it would otherwise be. - gavin ]

    Comment by John Lang — 11 Sep 2008 @ 7:54 PM
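
    A worked version of the arithmetic in this exchange may help: the observed warming only constrains the equilibrium sensitivity once the heat still flowing into the ocean (the planetary energy imbalance) is subtracted from the forcing. The numbers below are round, order-of-magnitude values chosen for illustration, not a formal estimate.

        # Rough sketch of why ~0.7 K of observed warming does not by itself
        # imply an equilibrium sensitivity of ~1.5 K per doubling:
        #   S ~ F_2x * dT / (dF - Q)
        # All values are illustrative round numbers.

        F_2x = 3.7   # W/m^2, forcing per CO2 doubling
        dT   = 0.7   # K, observed surface warming to date
        dF   = 1.7   # W/m^2, assumed net forcing to date
        Q    = 0.7   # W/m^2, assumed ocean heat uptake (energy imbalance)

        print("Ignoring the ocean (Q = 0):", round(F_2x * dT / dF, 1), "K per doubling")
        print("Accounting for the ocean:  ", round(F_2x * dT / (dF - Q), 1), "K per doubling")

    The larger the fraction of the forcing still being absorbed by the deep ocean, the more warming remains “in the pipeline”, which is the point of the response above.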

  224. As to the consensus, where has anyone imposed by fiat or any other method that climate scientists only consider sensitivity more than 2 degrees per doubling?

    No one has, Ray. But you apparently wanted to. There are plenty of respected scientists who do not buy into your consensus.

    As for the motivation behind using simpler models (actually, I would call them “direct approaches” – the idea is to finesse the use of models altogether): it’s clear we’re not going to get a good bound on climate sensitivity from GCMs anytime soon. The current best guess is not really any different from Charney’s. So it makes sense to look for other ways to constrain the sensitivity. You might want to read the Schwartz paper I linked to above. He puts the case very well.

    Comment by mugwump — 11 Sep 2008 @ 8:05 PM

  225. Richard Pauli, # 221.

    That is a beautiful comment.

    You prompted me to muse on some of the less eloquent outpourings here and therefore on what a powerful intoxicant grandiosity can be. It has no doubt been a steadfast driver for more than a few students throughout their school years (and beyond if you believe the claims). You also reminded me that barring some organic disability, social grace and self-examination are forms of intelligence that can be developed with a little effort without regard to age or background.

    Comment by Jonick Radge — 11 Sep 2008 @ 10:43 PM

  226. I was looking for errors in:
    http://www.weatherquestions.com/Climate-Sensitivity-Holy-Grail.htm
    Did I do a good job here?: http://blogs.abcnews.com/scienceandsociety/2008/09/nature-is-not-a.html?cid=130321742#comment-130321742
    (start just after: “Sep 11, 2008 2:47:53 PM”)

    Are there other points to be made about that?
    Is there a good example where the kind of data/comparison Spencer is using is used correctly?

    Comment by Patrick 027 — 11 Sep 2008 @ 11:54 PM

  227. It would be great if you folks would help me explain the flaws in skeptic Richard Lindzen’s paper “Taking Greenhouse Warming Seriously” which can be found here: http://www-eaps.mit.edu/faculty/lindzen/230_TakingGr.pdf

    He uses 4 leading GCMs to calculate the equilibrium temperature distribution in the atmosphere with respect to both latitude and altitude that would result from 2X CO2 increase. The temperature peaks at the equator at an altitude corresponding to about 200 mbar. He then compares the difference between this peak and the surface temperature rise to the weak temperature change with altitude that has been actually observed. He then somehow concludes from this that only 1/3 of observed GW can be attributed to the CO2 rise. He states:
    “Contrary to the iconic statement of the latest IPCC Summary for
    Policymakers, this is only on the order of a third of the observed trend at the
    surface, and suggests a warming of about 0.4° over a century. It should be added
    that this is a bound more than an estimate.”

    I know there was trouble at first with this altitude data but he seems to be using the corrected data. How can he say it is a bound?

    [Response: With all due respect to Prof. Lindzen, that argument is a crock. He knows full well that the expected amplification with height is expected purely from the moist adiabat responding to any temperature change at the surface (as we have discussed). He should also be aware that the tropospheric temperature records in the tropics are rather uncertain, and different methodologies give very widely varying trends - the most recent of which match the expected pattern rather well. - gavin]

    Comment by Thomas Bleakney — 12 Sep 2008 @ 12:42 AM

  228. Rod writes:

    Gavin, if one added one more blanket to the 300-400,000 (relative Venus to Earth CO2??) already on, how much warmer would one get? Are you already more than 98.6? How many blankets did that take? If you double CO2 on Venus, what would the temperature be? (Serious curiosity question.) As an aside, how did Venus, at 0.85 the mass of Earth, get about 100 bars of pure CO2 anyway?

    Venus is believed to have undergone a runaway greenhouse effect early in its geological history. The carbon dioxide that, on Earth, is in carbonate rocks, is in the atmosphere on Venus. Add our carbonate rock CO2 to the atmosphere and we’d have something like 60 bars of it.

    According to Bullock and Grinspoon, who I think are the foremost Venus atmosphere modelers at the moment, Venus may undergo periods where the surface temperature is up to 900 K, not the present 735 K.

    Comment by Barton Paul Levenson — 12 Sep 2008 @ 3:59 AM

  229. Thomas Hunter writes:

    Perhaps you need to re-read the history of the world since the fall of the Roman Empire

    Perhaps you need to stop assuming that people who disagree with you haven’t studied the subject.

    Comment by Barton Paul Levenson — 12 Sep 2008 @ 4:01 AM

  230. mugwump writes:

    No one has Ray. But you apparently wanted to. There are plenty of respected scientists that do not buy into your consensus.

    There are probably geologists who still don’t accept continental drift, and astronomers who still don’t accept the Big Bang. You can always find someone to agree with any position; what matters is how seriously they are taken by their colleagues.

    Comment by Barton Paul Levenson — 12 Sep 2008 @ 4:09 AM

  231. Mugwump: “There are plenty of respected scientists that do not buy into your consensus.”

    And what have they published in peer-reviewed journals of late? How many times have they been cited by others? Answer to both: Not much. Voila, scientific consensus.

    As to Schwartz, I did read the paper when it came out. I was unimpressed, since it greatly underestimates the complexity of warming in the oceans. At most it represents a lower bound on the sensitivity. Did it ever occur to you that the reason the central tendency of the “guess” of climate sensitivity is not changing is that it could be right?

    Comment by Ray Ladbury — 12 Sep 2008 @ 7:05 AM

  232. RE #231:

    And what have they published in peer-reviewed journals of late? How many times have they been cited by others? Answer to both: Not much. Voila, scientific consensus.

    I see a surprising number of papers published with lower sensitivities, given the consensus you speak of. Definitely in the minority, but non-negligible.

    As to Schwartz, I did read the paper when it came out. I was unimpressed, since it greatly underestimates the complexity of warming in the oceans.

    Seriously, there may be problems with Schwartz, and other papers of this ilk, but they are a worthy attempt to measure sensitivity without relying on models, and hence obtain a less uncertain answer.

    If you’re interested, check out Nicola Scafetta’s comment on Schwartz’s original paper, which has been absorbed into Schwartz’s latest version that I linked to above. It seems to show clear justification for 2 lags in the climate – one of a few months and one of around 8 years. I say “seems” because I am curious how sensitive the autocorrelation method is – I sure can’t see those lags by eyeballing the data so I am suspicious of this approach. I’ve been in this game too long not to trust my lyin’ eyes before I trust statistics or models.

    Did it ever occur to you that the reason the central tendency of the “guess” of climate sensitivity is not changing is that it could be right?

    The unchanging mean is not the problem. It’s the unchanging uncertainty. The range of plausible values has barely narrowed in 30 years leading to suggestions recently that we not even bother trying to estimate climate sensitivity (Allen and Frame).

    The policy implications of climate sensitivity and the mixing time are enormous. If it is 2C or less, or very slowly mixing (so it takes us many centuries to equilibrate), we probably have very little to worry about. If it is 4C or more and rapidly mixing (so we’ll see most of the warming in the next 100 years or so), we probably have a lot to worry about. Rather than throwing in the towel, we should be examining all available avenues for better bounding the climate sensitivity.

    Comment by mugwump — 12 Sep 2008 @ 8:36 AM
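
    On whether the autocorrelation method can recover lags that are invisible to the eye, one way to build intuition is to generate a synthetic series with a known relaxation time and see how well the lag-1 autocorrelation recovers it. The sketch below uses purely synthetic data; it tests only the recipe, not Schwartz or Scafetta.

        # Sanity check of the autocorrelation recipe on synthetic data with a
        # known relaxation time: monthly AR(1) noise with tau = 8 yr.
        import numpy as np

        rng = np.random.default_rng(1)
        dt, tau_true, n = 1.0 / 12.0, 8.0, 12 * 150      # monthly steps, 150 yr
        phi = np.exp(-dt / tau_true)

        x = np.zeros(n)
        for t in range(1, n):
            x[t] = phi * x[t - 1] + rng.normal()

        x = x - x.mean()
        r1 = np.dot(x[:-1], x[1:]) / np.dot(x, x)        # lag-1 autocorrelation
        tau_hat = -dt / np.log(r1)
        # For persistent series of this length the estimate is typically biased
        # low -- consistent with the bias criticism noted in the responses above.
        print(f"true tau = {tau_true} yr, estimated tau = {tau_hat:.1f} yr")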

  233. Mugwump, you are dodging the second part of my criterion–how often do these papers get cited in subsequent published work. That is an excellent measure of the degree to which they advance the state of understanding. The problem with the approaches of Schwartz, Douglass, etc. is that they do not provide a way forward because their focus is so narrow, they make it virtually impossible to explain paleoclimatic data.
    What avenues are you suggesting that you think aren’t being pursued? The problem is that low sensitivity doesn’t explain the data given the known physics. How do we approach this without ignoring the physics?

    Comment by Ray Ladbury — 12 Sep 2008 @ 10:38 AM

  234. Ya know, Mug, you can look up these tired old talking points rather than raising each one as though you’d just thought of it for yourself or read it at the other place — use the search box at the top of the page.

    If you’d make the effort, read the prior discussion, give some sense you’re not just posting what seem like clever opposition statements without bothering to read up on them, you could do better.

    Try Eric Raymond’s recommended approach? You’ll feel smarter immediately if you’ve made more effort before posting old ideas:

    How To Ask Questions The Smart Way:
    http://www.catb.org/~esr/faqs/smart-questions.html

    Comment by Hank Roberts — 12 Sep 2008 @ 10:48 AM

  235. Mugwump, you are dodging the second part of my criterion–how often do these papers get cited in subsequent published work.

    Ray, they get cited. That’s how I find them.

    The problem with the approaches of Schwartz, Douglas, etc. is that they do not provide a way forward because their focus is so narrow, they make it virtually impossible to explain paleoclimatic data.

    They do provide a way forward: it’s just a different way of looking at the problem. Maybe it will turn out to be a dead-end, but after 30 years of failing to narrow the uncertainty, it ain’t like GCMs are exactly pointing the way forwards either.

    Why are you so desperate to rule out this novel avenue of attack? Why not just see where it leads? I am not suggesting we drop GCMs to focus on direct approaches to estimating the sensitivity.

    What avenues are you suggesting that you think aren’t being pursued?

    I am suggesting that if we just want to know what climate sensitivity is, maybe there’s an easier way to get that than modeling the entire climate. Don’t solve a harder problem than you need to. [I am not claiming any novelty here - obviously people have been thinking along these lines for 100 years or more.]

    Comment by mugwump — 12 Sep 2008 @ 11:52 AM

  236. RE: #178, 205

    Patrick, I have books and I have papers; several books and many many papers. To know what makes up a specific model we need the continuous equations for that model. The same goes for the discrete approximations, numerical solution methods, and the actual coding.

    Not even addressing the parameterizations of sub-grid processes, it is highly unlikely that any of the codes utilize the basic fundamental equations of fluid motions; let’s call them the Navier-Stokes equations. For one thing, the very difficult problem of turbulence must be addressed. For another, the spatial resolution used in the calculations cannot begin to resolve the gradients of driving potentials for mass, momentum, and energy exchanges both within the solution domain and at its boundaries. For a third, the extremely difficult issues associated with multi-phase flows must be addressed. The list goes on and on.

    Thus all the codes utilize model equations developed from the basic equations by adopting idealizations, appropriate assumptions, and associated approximations. Simply pointing to a book written about the Navier-Stokes equations as a source of information for what is used in a specific model and code is not a correct specification of the answer.

    Let’s take as an example the momentum balance model for the vertical (radial) direction for the atmosphere. The number of possible formulations for this single equation is somewhat large. Consider the following possibilities:

    (1) no equation at all
    (2) an equation expressing the hydro-static balance between the pressure gradient and the gravitational force
    (3) a form in which only a few terms have been dropped from the fundamental formulation
    (4) the complete un-modified form of the fundamental statement of the momentum balance
    (5) various modifications applied to the above (as applicable) to include different approaches to modeling of turbulence.

    And this list is only a zeroth-order cut and I’m sure many others can be listed.

    Why are the actual continuous equations so important, beyond providing an indication of what phenomena and processes can and cannot be described? The system of PDEs plus ODEs contains critical information relative to the characteristics of the model equations, well-posed or ill-posed, boundary condition specifications (where, how many, and what), propagation of information within the solution domain and at the boundaries, and the proper approach for solving the discrete approximations to the continuous equations. Ad hoc specification of boundary conditions based solely on the discrete approximations is a well-known source of difficulties in numerical solution methods, for one example. The correct representation of discrete approximations for integrals of div and curl in non-orthogonal coordinate systems, for another. Some model equation systems for the basic hydrodynamics of atmospheric fluid flows are known to not be well-posed, as another. There are many other critical aspects that are set by the system of continuous equations.

    I and several others have attempted to find in the published literature a summary of the actual final form of the continuous equations used in, for example, the GISS/NASA ModelE model and code. None of us have been successful. We have been directed to the papers cited on the ModelE Web pages. The information is not in those papers. Recently, Gavin Schmidt directed me to a paper from 1983. The vertical-direction momentum balance model in that paper is the hydro-static version listed as (2) above.

    So, somewhere between papers published in 1983 and those on the ModelE Web pages published in 2006 there might be a specific statement of the equation for the vertical momentum balance model that is actually used in the GISS/NASA ModelE code. We have been told several times that it’s there, yet we can’t be directed to the specific paper, page number, and equation.

    Several people have attempted to find that specific statement and none have been successful. I hope you will accept a challenge and start a search for that equation.

    Comment by Dan Hughes — 12 Sep 2008 @ 12:05 PM
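
    For readers trying to place the options listed above, the two end members of the vertical momentum balance can be written schematically as follows (height coordinates; the exact form in any given model will differ). A later response (#252) notes that the free troposphere in these models is treated with the primitive equations, which use the hydrostatic form (2).

        % Option (2): hydrostatic balance
        \frac{\partial p}{\partial z} = -\rho g

        % Option (4), schematically: the full vertical momentum equation, of
        % which option (2) keeps only the first two terms on the right-hand side
        \rho \frac{Dw}{Dt} = -\frac{\partial p}{\partial z} - \rho g
            + 2\Omega \rho\, u \cos\varphi + F_{\mathrm{visc}}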

  237. Ya know, Mug

    Ya know Hanky babes, when someone starts a post with a deliberately dismissive diminutive (I wish I could say I thought of that wonderful alliteration, but it belongs to William Connolley) of my nom de plume, followed by an ill-informed rant, I usually don’t bother responding.

    Comment by mugwump — 12 Sep 2008 @ 12:05 PM

  238. “To get reliable predictions for climate on regions as small as a US state will take more computing power and understanding of climate processes than we have at present.” Now if we hadn’t spent all that money on the super collider and studying grizzly bears in Montana we could get to working on this.

    Jeff Davis is correct on this problem however. When we start by telling the common person and non-climate scientists (~99.9999% of voters) that they are too ill-informed to understand the problem, but they need to give up their job (or hope of being the first Hawaiian to become POTUS), then it is back to drill, drill, drill, drive, drive, burn, burn, burn.

    Comment by neil pelkey — 12 Sep 2008 @ 12:38 PM

  239. > the unchanging uncertainty
    How many models have incorporated plankton ecology, so far?

    We know many areas in which we don’t yet know the magnitude of the forcing, and others like plankton where we barely have begun to know how a complicated part of the system has been working, let alone how it may be changing. That’s known uncertainty and it’s not going to change until the research is done.

    Of course this uncertainty remains uncertain. This is not evidence of a problem. This is how science works.
    ______________
    ReCaptcha: is continuous

    Comment by Hank Roberts — 12 Sep 2008 @ 12:49 PM

  240. One for Gavin et al., relevant to simple answers –
    while looking up this earlier article here
    http://www.realclimate.org/index.php?p=160

    I came across this current presentation:
    http://www.debate-central.org/ask/corinne-le-quere

    The sponsor is
    http://www.sourcewatch.org/index.php?title=National_Center_for_Policy_Analysis The site is meant to provide information to high school debaters.

    I wonder how asking climate scientists questions via this site works.

    Comment by Hank Roberts — 12 Sep 2008 @ 12:58 PM

  241. Analogous to renormalization group methods used in statistical mechanics, isn’t it possible to somehow “integrate out” all these hundreds of effects that are important but in which you are not interested, and obtain a model in which you only have the physical quantities of interest (like the average temperature) plus a whole bunch of effective constants that do not exist at the level of the fundamental equations?

    Comment by Count Iblis — 12 Sep 2008 @ 1:19 PM

  242. Ah, Gavin, yes. I merely mentioned the volcanoes because they are a factor, not because it was a large or main factor. Of course, it is the CO2, although I myself would put it more so as the CO2 being unable to escape or be absorbed. On a related note, do you have any opinion on the behavior of CO2 and the rest as a supercritical fluid in the bottom 5% or so of the roughly 60 km Venusian troposphere?

    I am aware that it was a rhetorical device, Jim. A very impossible what-if line of questioning not really worth considering. Might as well ask what if the Roman Empire had never fallen, then taken over the entire planet, destroying all civilizations not going along with them using neutron and particle-beam weapons, and then implementing a commune-like utopia for the survivors. As in something like they’d developed and built supercomputers by 900 and cold fusion devices by 1200. Make up some other stuff about the economics, politics, religion, social structures and history along the way.

    I wasn’t aware you were disagreeing with me or I with you, Barton. Nor am I accusing you of being unaware of the subject. I’m simply saying, history being what it is, we have the current level of technology and sources of fuels that we have. Sure, it would be nice to have giant satellite mirror thingys beaming all our energy supplies from space, super-efficient wind turbines, hydrogen and fuel cell cars and the like. Unfortunately, none of this stuff was able to be developed, implemented and refined a few hundred years ago. I am optimistic that current solar technologies will continue to come down in cost and new technologies developed, much like they are and have been doing, seemingly hand in hand with computer and TV flat panels.

    Comment by Thomas Hunter — 12 Sep 2008 @ 4:31 PM

  243. re: #230 BPL

    Hmmm, are there any serious geologists who don’t accept continental drift?
    (As you say, there must be, I just don’t know any names offhand.)

    On Big bang, for sure, Hoyle, and to this day, Halton Arp comes to mind, i.e. astronomer who has done great observational work, but got fixated on an idea decades ago and won’t let go, no matter what evidence piles up.

    If someone got exposed to this, and didn’t have the background to sort it out, they’d easily get convinced by this Caltech PhD that science is really broken, and almost everybody else is wrong. If you can find a copy of “Seeing Red” in a library, try it out, as a good exercise in critical thinking, and the photos are beautiful.

    Obviously, for climate parallels, this is more like “It’s all cosmic rays” than like “Anything but CO2”, i.e., someone seems to have a strong belief in a particular idea that happens to conflict with the mainstream, as opposed to not wanting the mainstream, and being happy to take anything else.

    Comment by John Mashey — 12 Sep 2008 @ 4:33 PM

  244. Is there any long term data regarding the behavior of the night sky radiation effect?

    For those who do not know, the night sky radiation effect describes the process when a surface exposed to the night sky cools below the surrounding temperature.

    In itself, a trend towards reduction in the temperature differential would be pretty close to proof for increases in the Greenhouse effect.

    Comment by Lawrence McLean — 12 Sep 2008 @ 8:11 PM

  245. Mugwump:

    “If it is 2C or less, or very slowly mixing (so it takes us many centuries to equilibriate), we probably have very little to worry about.”

    If the last ten or twenty years of climate that we have experienced in Australia is truly indicative of what the current degree of warming gives us, then there is indeed a great deal to worry about.
    Southern Australian agricultural and natural systems are in a dire situation due to the ongoing drought conditions, and our scientists predict little reprieve.

    Comment by Craig Allen — 12 Sep 2008 @ 8:41 PM

  246. John Mashey and BPL, I had a prof. in grad school (fairly well known for his work in E&M) who didn’t believe in quarks and much of modern particle physics–he had his own theories. The department made it clear that he was to stick to the curriculum, although occasionally, he’d give a “special” lecture on his own special brand of “particle physics”. In all the time I followed goings on in the department, he attracted one student, who subsequently went nowhere. Nobody was interested in what he produced because it didn’t advance the state of understanding–that’s consensus at work.

    Comment by Ray Ladbury — 12 Sep 2008 @ 9:12 PM

  247. I had a prof. in grad school … who didn’t believe in quarks

    Nobody was interested in what he produced because it didn’t advance the state of understanding–that’s consensus at work.

    I had an undergrad prof who designed “aura” detectors. No kidding.

    That’s tenure at work.

    I have a serious question for gavin. What’s the justification for linear climate sensitivity? By that I mean: if sensitivity from LGM to the Holocene is, say, 1K/W/m2, would we expect it to be the same today even though the Earth looks very different (it’s a lot hotter and darker for starters, with very different precipitation patterns). Or are the differences from such changes small enough that we can assume linearity?

    Comment by mugwump — 12 Sep 2008 @ 9:48 PM

    Hank (210), you are using the term “heat” too loosely. “Heat” is pretty much a nebulous term and can mean different things in different contexts and by different folks. I’m using it in the context that heat does not equal energy (meaning not precisely the same thing). Photons add energy to the (GH) gases. The atmosphere gets “heated” when that absorbed radiative energy gets collisionally transferred. This aside, I think I understand and agree with all you say in 210.

    Comment by Rod B — 12 Sep 2008 @ 11:47 PM

  249. Ray (213), a quick observation: I think your definition of consensus is way too complicated, nor do I think consensus entails a prohibition on any questioning.

    Comment by Rod B — 13 Sep 2008 @ 12:00 AM

  250. Barton, I’m getting OT but I appreciate the helpful info re Venus. Why didn’t Venus’ CO2 get sequestered into carbonate rock?

    Comment by Rod B — 13 Sep 2008 @ 12:24 AM

  251. mugwump writes:

    I am suggesting that if we just want to know what climate sensitivity is, maybe there’s an easier way to get that than modeling the entire climate.

    Yo! Mugwump! For about the fifth time, the paleoclimate data doesn’t depend on GCMs, and it gives a climate sensitivity around 3 K per doubling of CO2! For Lindzen’s estimate to be right, you have to explain not only why the models are wrong, but why the paleoclimate data are also wrong. Good luck with that.

    Comment by Barton Paul Levenson — 13 Sep 2008 @ 5:15 AM

  252. Dan Hughes writes:

    it is highly unlikely that any of the codes utilize the basic fundamental equations of fluid motions; let’s call them the Navier-Stokes equations.

    Actually, I think they do.

    [Response: In the free troposphere, they actually use what are called the primitive equations which don't include viscosity (since that term is very small). But Dan knows all this. - gavin]

    Comment by Barton Paul Levenson — 13 Sep 2008 @ 5:19 AM

  253. Lawrence McLean writes:

    Is there any long term data regarding the behavior of the night sky radiation effect?

    For those who do not know, the night sky radiation effect describes the process when a surface exposed to the night sky cools below the surrounding temperature.

    In itself, a trend towards reduction in the temperature differential would be pretty close to proof for increases in the Greenhouse effect.

    The diurnal temperature difference has decreased. Nighttime temperatures are rising faster than daytime — one more bit of proof that it’s greenhouse gases and not the sun.

    Comment by Barton Paul Levenson — 13 Sep 2008 @ 5:23 AM

  254. Gavin chooses to reply to #56 and #57, but not #54:

    “I know how easy it is to overfit when you snoop the test data. In fact, we don’t consider a model validated until we’ve tested it against completely unseen data. Climate modelers have spent years tweaking heavily parameterized models against a very limited set of data. They are almost guaranteed to have overfit.”

    The true test of overfit is: do we see evidence of a divergence over time between predicted and observed? Consider:

    -When tropical tropospheric temperatures are observed to not fit the model’s predictions it’s suggested that it’s because wind shear was neglected and the tropical data are sparse.
    -When global mean temperature fails to rise from 2001-2008 it’s because the capacity of the oceans to absorb that heat was underestimated.

    I’m not saying all the ad hoc revisionism isn’t justified. Just that I wish it weren’t so necessary. It looks bad. For every divergence, there’s an easy fix. When you assert the physics are “known”, and then are forced to make these post-hoc adjustments to the models, it only adds to the weight of #54.

    Maybe it is better to just admit that the physics are not fully “known”, that many parameters are necessarily guessed at?

    Comment by Richard Sycamore — 13 Sep 2008 @ 7:02 AM
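
    For readers unfamiliar with the overfitting test being invoked, the standard check is the one described: hold back data the fitting never saw and look for divergence. A toy illustration on synthetic data follows; it has nothing to do with any actual climate model and is only meant to show what the test looks like.

        # Toy out-of-sample test: fit a modest and an over-flexible model to the
        # first 70% of a noisy synthetic trend, then compare errors on the later
        # data neither fit has seen.
        import numpy as np

        rng = np.random.default_rng(2)
        t = np.linspace(-1.0, 1.0, 100)                  # scaled "time" axis
        y = 0.5 * t + rng.normal(0, 0.1, size=t.size)    # synthetic trend + noise

        train, test = slice(0, 70), slice(70, 100)

        for degree in (1, 9):                            # simple vs over-flexible
            coeffs = np.polyfit(t[train], y[train], degree)
            pred = np.polyval(coeffs, t)
            rmse_in  = np.sqrt(np.mean((pred[train] - y[train]) ** 2))
            rmse_out = np.sqrt(np.mean((pred[test]  - y[test])  ** 2))
            print(f"degree {degree}: in-sample RMSE {rmse_in:.2f}, "
                  f"out-of-sample RMSE {rmse_out:.2f}")

    Whether that analogy is fair to tuned GCMs is exactly what the responses that follow dispute.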

  255. Rod, Venus:
    http://www.google.com/search?q=pierrehumbert+venus+atmosphere

    Comment by Hank Roberts — 13 Sep 2008 @ 8:35 AM

  256. John Mashey (#243),

    The value of the big bang theory is that it is fruitful. Hoyle, who has passed away now, was able to present explanations for the evidence that is often taken in support of the theory, so he was certainly not rejecting evidence. He was rejecting the big bang on philosophical grounds, grounds that have been very useful in the development of philosophy. Those who feel that existence implies a creator are in some sense anti-Copernican. The way for Hoyle to exclude the need for a creator, and to preserve the very useful vantage he preferred – that we exist in no special circumstance – was to have the universe have no beginning and thus no creator. The fruitfulness of the big bang theory, that it made testable predictions which were supported by subsequent observation, was not persuasive for Hoyle because the big bang seemed like it implied magic and thus was not science.

    Hoyle was mistaken there. While it could well be the case that God requires science of us, science cannot require God of the universe no matter how the science is framed. The big bang theory cannot be proof of the existence of a creator because science seeks natural explanations. So, Hoyle’s fundamental error, one shared with Newton and perhaps Einstein, was the idea that the existence of a creator could be inferred from His work through science. Had he realized that neither big bang nor steady state theories could speak to the issue and still be science, then he might have come down another way. After all, his work had shown that Democritus’ atoms were not, in general, eternal but rather built up in stars through a subtle resonance he discovered in the nuclear physics of carbon.

    While the field of cosmology today is becoming more quantitative, its origins are deeply entwined with philosophical (pure reason) enquiries and Hoyle’s approach seems tied to this. Anthropogenic global warming deniers have a more difficult task since they need to deny long settled physics to make their case. They thus tend to lack the intellectual rigor or honesty or both that Hoyle required of himself. Comparing them to Hoyle seems an insult to the man.

    Comment by Chris Dudley — 13 Sep 2008 @ 9:40 AM

  257. If you are interested in climate change, you are invited to contribute to

    http://beautifulplanetearth.wikispaces.com/

    Together we want to come up with a coherent theory of climate change incorporating all scientific findings concerning the issue.

    Comment by AdvocateOfLife — 13 Sep 2008 @ 9:58 AM

  258. Barton (251) your certainty of conclusion significantly exceeds the certainty of the paleoclimate data. I’m not saying you’re wrong, but that you can’t totally exclude other conclusions with a wave of the hand.

    Comment by Rod B — 13 Sep 2008 @ 10:08 AM

  259. When tropical tropospheric temperatures are observed to not fit the model’s predicitons it’s suggested that it’s because windshear was neglected and the tropical data are sparse.

    Well, both are true. That’s not “revisionism” – the lack of data has been known forever, and that which exists is known to have problems.

    You’re suggesting that a model’s failing to fit bad data is a problem with the model? How quaint.

    -When global mean temperature fails to rise from 2001-2008 it’s because the capacity of the oceans to absorb that heat was underestimated.

    No one serious says this. Congrats on building and burning that strawman.

    You need to read this. Seriously. And also go bone up on noise vs. signal, etc.

    Comment by dhogaza — 13 Sep 2008 @ 10:43 AM

  260. Rod B. How do you have “heat” without having “energy”?

    WRT scientific consensus–this is based on my experience. I know from my experience and my study of science and history of science that consensus is real and important. I also know that scientists rarely if ever really talk about it or talk about what there is and is not consensus on. In effect, they vote by deciding what is and is not worth their effort–what has the best chance of advancing their career, and so the science.

    Oracle of ReCAPTCHA: stress police–must be a CA thing.

    Comment by Ray Ladbury — 13 Sep 2008 @ 10:47 AM

  261. Hank (255), thanks for the links. It still seems funny (actually meaning I don’t yet fully understand and need to do more study; anyway….). One Pierrehumbert article says CO2 (then dry ice) came first (on Mars), causing the surface to heat enough to support water. He implies it was first water vapor on Venus that got things near the Komabayashi–Ingersoll limit, which created the near-runaway temperature increase, which in turn destroyed the water vapor (exactly how I haven’t gleaned yet), which then cut out the CO2 sequestering through weathering. CO2 increases were last in the chain — a result of runaway warming, not a cause.

    Comment by Rod B — 13 Sep 2008 @ 11:11 AM

  262. Chris D (256): Let’s see. AGW sceptics lack intellectual rigor and honesty because we “just believe” something other than the consensus, as opposed to Hoyle, who (presumably) had intellectual rigor and honesty because he philosophically believed something other than the consensus. One must contemplate this a bit….

    The original contention was that there are respected scientists who have disagreed with accepted dogma. Hoyle vis-à-vis the Big Bang (though as I understand it, he coined that term) is a salient example. (He still may be correct. The fact that a theory, like the Big Bang, is helpful and fruitful doesn’t ipso facto make it true.)

    Comment by Rod B — 13 Sep 2008 @ 11:35 AM

  263. Ray, true, you can not have “heat” without “energy”. But you can have “energy” without “heat.”

    I fully agree with your assessment of “consensus” in #260. Doesn’t mean that it is automatically correct. And shouldn’t be cause for prohibiting contrary views, which you implied in #213 though backed off from in #219.

    Comment by Rod B — 13 Sep 2008 @ 11:46 AM

  264. BPL@251,

    For about the fifth time, the paleoclimate data doesn’t depend on GCMs, and it gives a climate sensitivity around 3 K per doubling of CO2! For Lindzen’s estimate to be right, you have to explain not only why the models are wrong, but why the paleoclimate data are also wrong. Good luck with that.

    I will note this passage from Dr. Rind,

    “Ironically, when we look at paleoclimates, both the cold climates of the Ice Ages, and the warm climates of the Tertiary (from 65 to 2.5 million years ago), the same uncertainty exists. We do not know how cold/warm the tropics were at these times, nor can we properly reproduce the high latitude responses from these climates in our models. The tropical paleoclimate proxies are conflicting and may be misinterpreted; the high latitude responses may be arising under different circumstances. So we cannot use paleo-observations to determine which, if any, of our models has the proper sensitivity in these regions — and in fact, models cannot reproduce what at face value seem to be the extreme changes in low-to-high latitudinal gradients suggested by paleo-data.”

    [Response: Note that he is talking about latitudinal gradients and of periods for which forcings and responses are ill-defined. None of that is relevant to the LGM - and that's the paleo period that simply isn't consistent with a negligible sensitivity. (You might care to note that Rind and Peteet (1985) was one of the key papers demonstrating that tropical ocean temperatures must have indeed changed substantially at the LGM). - gavin]

    Comment by Ellis — 13 Sep 2008 @ 12:09 PM

  265. Mugwump:

    “I usually don’t bother responding.”

    Yes you do. This response tells me at least more than I want to know.

    “There’s no way of saying this without sounding obnoxious, but since you showed me yours, I’ll show you mine.”

    Ugh! You’re right. But if you’re going to open that door you might as well commit to it and give your name so that we can begin to examine your credentials in detail. Otherwise, please lay off the privates.

    “…Al Gore (and the alarmist industry in general)…”
    “…Unfortunately, those pushing the alarmist agenda are using the higher sensitivity estimates from the models to further their political goals…”
    “…I don’t believe in this idea of ‘consensus science’…”
    “…invoked to justify all kinds of crazy alarmist nonsense…”
    “…concerning his unrequited love affair with Al Gore…”
    “..There are plenty of respected scientists that do not buy into your consensus…”

    Well OK. Since a theme here seems to be how professions outside of the climate community approach the science, I can’t help wondering about the language. This is rhetoric straight from the stocks of partisan hacks. Why would anyone seriously trying to pass as an iconoclast lean on this kind of down scale (‘down scale’ for this forum) verbiage?

    There are all kinds of activity embedded in these phrases, including a fuzzing of the boundary issues between science, policy making, and politics. If you’re concerned about ‘firewall’ issues, then why not discuss them openly without all the thinly veiled condescension? It is ironic since this is precisely the kind of framing language that’s designed to provide cover for the nihilistic corrosion of systems informing public policy.

    Just some general thoughts about studying natural science for whomever.

    You need to be firing on all cylinders to begin to grasp the breadth of any of the earth sciences. While analysis is certainly the bedrock, strong visualization skills are also required to properly contextualize information about large natural systems — literally to keep odd scales in perspective, proportion, and interconnected. It’s a little like the medieval scholars who performed prodigious feats of memory by visualizing a castle and carefully stocking it with information.

    Want to test yourself? For inspiration try out this extreme exercise: Some of the old masters’ knowledge of anatomy was so strong that they could, with great veracity and subtlety, draw the human form in any position without using a model. Most of us can barely manage stick figures. Yet stick-figure science is the stock-in-trade of so many skeptics, a hodgepodge of poorly related, highly embellished elements.

    Climate science is big science. It’s a group effort. People skills count. Being brilliant is a necessary but, after a point, not a sufficient condition if you’re too prone to being a destructive influence.

    Comment by Jonick Radge — 13 Sep 2008 @ 12:58 PM

  266. BPL #251:

    Yo! Mugwump! For about the fifth time, the paleoclimate data doesn’t depend on GCMs, and it gives a climate sensitivity around 3 K per doubling of CO2

    In which case the best estimate for sensitivity should be 3 +- a small number. It’s not, so the certainty you express is not a “consensus” view.

    As for the paleo stuff, I asked this question above but it bears repeating: what’s the justification for assuming the sensitivity is linear?

    To elaborate: the paleo results are roughly that we saw a 6K rise for a total 6W/m2 increase in forcing from the LGM to the Holocene, therefore climate sensitivity is 1K/W/m2 (= 6K / 6W/m2). But the climate today is very different from the climate at the last LGM. We would not necessarily expect a 1W/m2 increase in forcing to have the same impact today as it had back then. Or in mathematical terms, climate sensitivity may be a nonlinear function of forcing and temperature.

    To elaborate even more, consider a completely hypothetical case: that of a snowball earth well below the “thaw” point. Assume an increase of 1W/m2 of forcing would not be sufficient to melt the ice, and hence there would be no albedo feedback. Maybe a snowball earth has very little cloud to speak of either, so the cloud feedbacks are negligible. Since the oceans are mostly covered with ice, increasing their temperature is not going to result in much CO2 outgassing, so we won’t see a CO2 feedback either. So the snowball earth may have a climate sensitivity close to the no-feedback case of about 0.3K/W/m2.

    Now, what if the Earth gets close enough to the sun to increase forcing above the snowball earth thaw limit? Maybe that’s as little as 3W/m2. All the feedbacks will start to come into play, and we’ll see a rapid (in geological terms) temperature rise. If the temperature rises 6K (remember, this is all hypothetical), and we attribute an extra 3W/m2 forcing to the albedo and CO2 changes (and whatever else – aerosols etc) then using the paleo sensitivity argument we’d say the climate sensitivity is 1K/W/m2 ( 6K / [3W/m2 + 3W/m2] ).

    Finally, once the earth has heated up again, cloud dynamics may yield negative feedbacks, which bring the sensitivity back down again.

    So depending on the state of the earth, sensitivity to a given amount of forcing may vary. The sensitivity you get by looking at changes from the LGM to the present may not be the same as the climate sensitivity due to increased CO2 today. Or at least if we think it is, I am curious as to why.

    Comment by mugwump — 13 Sep 2008 @ 1:59 PM
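
    To connect the two sets of units used in this exchange: a sensitivity quoted in K per W/m2 is converted to the conventional per-doubling figure by multiplying by the forcing of a CO2 doubling, roughly 3.7 W/m2. Below, the arithmetic is shown first with the round numbers used in the comment above, then with the slightly different round numbers more often quoted for the LGM; both are illustrative, not published estimates.

        % Per-doubling sensitivity from a sensitivity in K per W m^-2:
        %   S_{2x} = \lambda \, F_{2x}, with F_{2x} \approx 3.7 W m^-2.
        % Using the comment's round numbers:
        \lambda \approx \frac{6\ \mathrm{K}}{6\ \mathrm{W\,m^{-2}}} = 1\ \mathrm{K\,(W\,m^{-2})^{-1}}
        \quad\Rightarrow\quad S_{2\times} \approx 1 \times 3.7 \approx 3.7\ \mathrm{K}
        % Using round numbers more often quoted for the LGM (about 5 K colder
        % for roughly 6.5 W m^-2 less forcing):
        \lambda \approx \frac{5\ \mathrm{K}}{6.5\ \mathrm{W\,m^{-2}}} \approx 0.77\ \mathrm{K\,(W\,m^{-2})^{-1}}
        \quad\Rightarrow\quad S_{2\times} \approx 2.8\ \mathrm{K}

    Whether that proportionality carries over from the LGM to today is the linearity question raised in the comment; the state dependence it describes would show up as a drift in λ, not in the unit conversion itself.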

  267. RE #265:

    Why would anyone seriously trying to pass as a iconoclast lean on this kind of down scale (’down scale’ for this forum) verbiage?

    A) it’s no more downscale than the verbiage used to describe sceptics;

    B) it’s true: high sensitivities are used to push alarmist agendas;

    C) I don’t lean on it.

    Comment by mugwump — 13 Sep 2008 @ 2:17 PM

  268. gavin: “…the LGM. Explain that away using a sensitivity of less than 2 C and then we can talk.”

    As was mentioned in 247, how do we know climate sensitivity is the same now as during the LGM?

    Would we not expect the amount of albedo feedback to have been greater when snow/ice were at lower latitudes? Maybe the LGM value should be considered an upper bound?

    Comment by Steve Reynolds — 13 Sep 2008 @ 2:53 PM

  269. JR,

    not a sufficient condition if you’re too prone to being a destructive influence

    What exactly do you mean by this?

    Comment by Dave Andrews — 13 Sep 2008 @ 3:07 PM

  270. Would we not expect the amount of albedo feedback to have been greater when snow/ice were at lower latitudes?

    Obviously, the scientific community has never thought about this. As evidence of their idiocy I submit this abstract:

    The contributions of expanded continental ice, reduced atmospheric CO2, and changes in land albedo to the maintenance of the climate of the last glacial maximum (LGM) are examined…

    You could’ve spent 30 seconds in google scholar yourself.

    Then I could’ve spent that 30 seconds doing something else.

    Mugwump’s thinking is equally original, new to the scientific community …

    Comment by dhogaza — 13 Sep 2008 @ 3:13 PM

  271. 267:

    “A) it’s no more downscale than the verbiage used to describe sceptics;”

    My point is it’s stock — as in hackneyed. Why would YOU stoop to using language that works against your own credibility? And your answer seems to be that if you thought Al Gore set his hair on fire and jumped off a bridge, you’d want to do it too.

    “B) it’s true: high sensitivities are used to push alarmist agendas;”

    The key phrase here is “alarmist agendas.” That is purely a political judgment, and since it’s left hanging there, it doesn’t begin to address the way policy actually gets formed. You could have as easily said, “It’s true: low sensitivities are used to push dismissive, industry agendas.” You’d then no doubt be spending your time here trying to finagle higher sensitivities out of the discussion.

    “C) I don’t lean on it.”
    You don’t need to lean on it.

    Comment by Jonick Radge — 13 Sep 2008 @ 5:25 PM

  272. DA,

    Sorry. That was poorly worded. I was referring to trolling type behaviors mostly. There are probably quite a number of things that cross the line of what’s tolerable. Though it’s an extreme example, I’m not sure I’d want to work with someone who was prone to violence, for instance. It’s a judgment call and depends on the situation. You can see it at work here in the way editors handle comments.

    Comment by Jonick Radge — 13 Sep 2008 @ 5:50 PM

  273. RE #270:

    It looks like that paper models the LGM with a (very old) GCM. All well and good, but this discussion is about GCM-free climate sensitivity estimates.

    Comment by mugwump — 13 Sep 2008 @ 5:50 PM

  274. There have been several comments on the lines of how do we know the climate sensitivity is the same as in the past?

    The GCMs give sensitivities of the form:
    change in temperature is proportional to change in logarithm of CO2 concentration.

    Now if we vary the CO2 concentration over a wide enough range we expect that this relationship will eventually break down. But we are not varying it over a wide range. We do expect sensitivity to have this form over the ranges of CO2 concentration that we have to deal with. We have no reason to expect any discontinuities in the functional relationship between temperature and concentration. We know of nothing that could cause such a discontinuity and we have no evidence of it occurring.

    Now could the constant of proportionality above be different in the past? Well, with a different continental configuration and different vegetation I expect that it would be somewhat different though I don’t know by how much. However we are making comparisons with the Pleistocene when we had a continental configuration and vegetation similar to today. There is no reason to expect more than a minor difference in the constant of proportionality.

    Could the constant of proportionality change with temperature and concentration? I expect that it does change. However I expect that it changes smoothly and if there is only a small change in temperature or concentration I would expect only a small change in the constant of proportionality.

    For sensitivities to be extremely different in the Pleistocene or in the expected future warmer period we would need the physical processes driving temperature changes to be different in ways that we have no reason to believe that they are.

    Comment by Lloyd Flack — 13 Sep 2008 @ 6:15 PM
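
    The logarithmic relationship described above can be written ΔT = S · log2(C/C0): each doubling of CO2 adds the same increment S. A minimal sketch, with S set to a placeholder value rather than an endorsement of any particular number:

        # Minimal sketch of the logarithmic forcing relationship described above:
        #   dT = S * log2(C / C0), so each doubling adds the same increment S.
        # S = 3.0 K is a placeholder value; C0 is taken as pre-industrial CO2.
        import math

        S, C0 = 3.0, 280.0      # K per doubling; ppm
        for C in (280, 380, 560, 1120):
            dT = S * math.log(C / C0, 2)
            print(f"CO2 = {C:4d} ppm  ->  equilibrium dT = {dT:.1f} K")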

  275. re: 256 Chris Dudley

    “Anthropogenic global warming deniers have a more difficult task since they need to deny long settled physics to make their case. They thus tend to lack the intellectual rigor or honesty or both that Hoyle required of himself. Comparing them to Hoyle seems an insult to the man.”

    I’d said:
    “Obviously, for climate parallels, this is more like “It’s all cosmic rays” than like “Anything but CO2”, i.e., someone seems to have a strong belief in a particular idea that happens to conflict with the mainstream, as opposed to not wanting the mainstream, and being happy to take anything else.”

    I always thought Hoyle was a fine scientist, and thought he might fairly have gotten a Nobel. As a kid, I enjoyed a few of his books.

    I thought my wording was clear, but I guess not.

    Even great scientists sometimes fasten on a hypothesis during that period when there are multiple competing hypotheses, and then spend the rest of their lives defending that hypothesis even as overpowering evidence mounts for one of the alternatives. That doesn’t mean they are wrong, but it’s a recognizable pattern. Of course, most scientists change their minds when the evidence has piled up, sooner or later.

    Although there is a vast difference in contribution, from my reading, Hoyle would seem modestly akin to Svensmark (cosmic rays do it) rather than Fred Singer (anything except CO2). Perhaps they’re different in that at one point, steady-state was a more plausible hypothesis than big-bang. I don’t think cosmic rays have ever been a strong candidate to explain GW, although it hasn’t been that long since its potential influence was bounded as low as it’s been.

    Put another way, Hoyle’s steady-state and big-bang are mutually contradictory.
    Claiming there is *some* influence from cosmic rays is not mutually contradictory to AGW, but claiming it’s the major influence is.

    Anyway, there was no insult intended to Hoyle.

    Comment by John Mashey — 13 Sep 2008 @ 9:04 PM

  276. It looks like that paper models the LGM with a (very old) GCM. All well and good, but this discussion is about GCM-free climate sensitivity estimates.

    I went and re-read the OP and see no evidence for your claim.

    And I went and re-read the post I responded to, and see no evidence that the question was asked in the context you suggest.

    Comment by dhogaza — 13 Sep 2008 @ 10:24 PM

  277. Rob #262,

    I used the term rigor in a fairly technical sense. Hoyle understood the big bang theory very well. Who better? His alternative theory attempted to explain the same observations.

    This situation differs from the situation with deniers. Those who deny CO2 has an effect, if they understood radiative transfer, would realize that they are barking up the wrong tree. That is not the sort of physics it makes much sense to challenge. But, though they don’t understand it, that is essentially their position: that there is something wrong with radiative transfer.

    There are also those who deny the data: that the temperature is increasing. These are even further from Hoyle.

    Fruitful theories gain support because they lead to deeper questions. Questions about grand unification in particle physics stem in part from consideration of the big bang theory as an example. The large hadron collider was recently commissioned to explore some of these questions. So, it remains worthwhile to continue to test the big bang theory whereas less fruitful theories don’t get much further testing. The big bang theory could be wrong in interesting ways while alternatives don’t lead to such a possibility.

    Comment by Chris Dudley — 13 Sep 2008 @ 10:58 PM

  278. Rod writes:

    Barton, I’m getting OT but I appreciate the helpful info re Venus. Why didn’t Venus’ CO2 get sequestered into carbonate rock?

    Venus, closer to the Sun than Earth, was hot enough for a runaway greenhouse effect to start. Water evaporated from the (hypothetical) early Venus ocean, water vapor is a greenhouse gas, more heat led to more evaporation, until the whole ocean was vaporized. Sunlight dissociated the water vapor, the hydrogen escaped, leaving the oxygen, which combined with the rocks. The temperatures were high enough to cook carbon dioxide out of the rocks.

    On Earth and Mars, the water vapor pressure curve isn’t high enough to go into a runaway state. Google “runaway greenhouse effect” Earth Venus Mars to see some web pages about this, complete with diagrams.

    Comment by Barton Paul Levenson — 14 Sep 2008 @ 3:52 AM

  279. Chris Dudley writes:

    Anthropogenic global warming deniers have a more difficult task since they need to deny long settled physics to make their case. They thus tend to lack the intellectual rigor or honesty or both that Hoyle required of himself. Comparing them to Hoyle seems an insult to the man.

    Hoyle was a great astrophysicist, at least in the ’40s and ’50s, and a pretty decent SF writer. But he was also a raving crackpot. In addition to denying the Big Bang long past when everybody else had been forced to accept it, he also believed in creationist arguments about combinatorics and abiogenesis, that interstellar dust grains were bacteria, and that the flu was brought to Earth from space.

    Comment by Barton Paul Levenson — 14 Sep 2008 @ 3:57 AM

  280. Rod B #258.

    Additionally you can’t invoke remote possibilities with just handwaving arguments. That’s been used to “confirm” that the LHC is dangerous: we can’t KNOW that it won’t destroy the earth and all the inhabitants, so therefore it is too dangerous to start.

    So why are values outside the range of “reasonable based on limits of current measures and explanation” the REQUIRED resolution? And why should only the values outside those limits that prove YOUR point be the ones considered?

    No hand waving, please.

    Hard science.

    Do as you demand of others. To do otherwise is hypocrisy.

    Comment by Mark — 14 Sep 2008 @ 4:36 AM

  281. Mugwump, what if, instead of being 2 degrees or less (a figure someone came up with while ADMITTING they were ignoring some of the observed changes [so they admitted leaving out something that would increase the sensitivity], and which you have taken as “true”), it is 6 degrees or more?

    Why is it that denialists, even when they admit GW, pick and choose figures on the “we’re A-OK, people!” side rather than the “ohshitohshitoshit we’re all going to die!” side? If the models are wrong, why are they wrong in a way that makes it OK for humans to continue? If measurements are wrong, why are they wrong in a way that means business as usual? If models are deficient, why do they only miss out effects that mean there’s no problem?

    Comment by Mark — 14 Sep 2008 @ 4:46 AM

  282. This seems frustrating. A few of us work on the analysis of Peak Oil depletion and have been able to derive several of the empirically observed relations through relatively simple math derivations. These relations have been known for around 50 years, described by King Hubbert using heuristics.

    Comment by wht — 14 Sep 2008 @ 5:58 AM

  283. Steve Reynolds #268:

    As was mentioned in 247, how do we know climate sensitivity is the same now as during the LGM?

    God of the Gaps, Steve. You’re on a hopeless errand.

    You see, the climate system doesn’t know where the warming came from. So, if it is from astronomical variations in Solar influx, or from CO2 keeping the warmth from escaping, it will respond in largely the same way.

    See the first picture in

    http://www.realclimate.org/index.php/archives/2007/12/tropical-troposphere-trends/

    It shows the response to either an increase in solar intensity, or in CO2 concentration. See how similar they are? (There is an important difference in the stratosphere; it knows CO2 when it sees it and uses it for radiative cooling. But this is inconsequential for the global energy balance.)

    Would we not expect the amount of albedo feedback to have been greater when snow/ice were at lower latitudes? Maybe the LGM value should be considered an upper bound?

    The doubling equilibrium sensitivity is defined by convention not to include ice sheet albedo, for good reasons. It is indeed highly nonlinear to temperature.

    The remaining sensitivity by this definition is assumed to be linear. Of course this assumption could be wrong — but nobody has presented a good reason that it would be… there are several independent observational lines of evidence pointing to a 3C sensitivity, and they are pretty unlikely to be all wrong.

    The Snowball Earth situation is so far out as to be irrelevant for the precise sensitivity of today. Almost like Venus…

    BTW what about the Eemian warming and sea level rise? Those temperatures were similar to our century, though astronomically forced. Why am I wrong to conclude that you cannot pump heat into the climate system by whatever means without getting your feet wet?

    Comment by Martin Vermeer — 14 Sep 2008 @ 7:36 AM

  284. RE #274 Lloyd Flack:

    The relationship between CO2 concentration and forcing is unlikely to change, since that depends only on basic physics. But that doesn’t necessarily mean the relationship between forcing and temperature change is constant, since that depends upon the state of the climate itself (see my hypothetical example in #266).

    Comment by mugwump — 14 Sep 2008 @ 7:52 AM

  285. Barton Paul Levenson (#279),

    I find Greenberg and coworkers’ EURECA-B sample of material that was irradiated on orbit to be a better laboratory analog to the interstellar 3.4 um absorption feature than the Hoyle and Wickramasinghe proposed identification. In fact I use it in Figure 1 here: http://arxiv.org/abs/0802.1666 which should be out next month. On the other hand, investigating the concept of panspermia is something that receives a great deal of NASA attention and funding. The possibility that life on Earth was seeded from Mars or the other way around is under active investigation. Should we say that the Alpher-Bethe-Gamow paper was written by crackpots because it turned out that neutron capture can’t build all of the elements?

    Multiple independent origins of life seem to me to be more likely than galaxy-wide panspermia because highly ordered systems can arise spontaneously in physics. But, since we are working without even a second datum so far, it seems premature to rule out their view yet.

    The case that global warming deniers are crackpots or, often more likely, dishonest stooges, is not supported by calling peer reviewed science, such as that produced by Hoyle, crackpot.

    Comment by Chris Dudley — 14 Sep 2008 @ 8:01 AM

  286. Mugwump, #266: It would appear that your objection to the models (and presumably to the laws of physics, too) is that they constrain your creativity. Positing hypothetical situations unconstrained by any data or model (and yes models are an inherent part of science) may be fun, but it ain’t science.

    What evidence do you have that the climate sensitivity of the LGM is significantly different from that of the current climate? Volcanic eruptions provide a sensitivity in the same range, as do other sources of data. There aren’t too many forcers that persist on a timescale of decades to millennia other than CO2.

    Comment by Ray Ladbury — 14 Sep 2008 @ 9:17 AM

  287. RE Martin #283:

    The doubling equilibrium sensitivity is defined by convention not to include ice sheet albedo, for good reasons. It is indeed highly nonlinear to temperature.

    The paleo argument for doubling CO2 sensitivity is agnostic with respect to forcing source: you simply divide the temperature change by the sum of the forcings to get the expected temperature change per W/m2 of additional forcing. So in that sense it does include ice sheet albedo.
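
    As a rough worked version of that division (the numbers below are the commonly quoted round figures for the LGM, used purely for illustration, not a calculation from any particular paper):

        delta_T_lgm   = -5.0   # assumed global cooling at the LGM, deg C
        forcing_ghg   = -2.8   # assumed forcing from lower CO2/CH4/N2O, W/m2
        forcing_ice   = -3.5   # assumed forcing from ice sheet and vegetation albedo, W/m2
        forcing_other = -0.5   # assumed forcing from dust etc., W/m2

        total_forcing = forcing_ghg + forcing_ice + forcing_other  # about -6.8 W/m2
        sensitivity = delta_T_lgm / total_forcing                  # about 0.74 C per W/m2
        print(sensitivity * 3.7)   # times ~3.7 W/m2 for doubled CO2 -> roughly 2.7 C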

    This article is an easy-to-read overview. Page 147 mentions the possibility that climate sensitivity may differ between the LGM and today, but that

    “our analysis minimizes that factor by specifying ice sheet area and atmospheric composition as boundary forcings”.

    I am unsure what that means, but hopefully the original source (Lorius et al.) will illuminate it.

    It is also interesting to note that in the same paragraph the possibility of long-term random climate fluctuations is mentioned:

    “There are many uncertainties and limitations in paleoclimate analyses. For example, chaotic long-term fluctuations in ocean heat transport can contribute to observed global temperature change”

    I wonder how large the authors think those fluctuations can be? MWP anyone?

    Comment by mugwump — 14 Sep 2008 @ 10:53 AM

  288. Re #279 (BPL on Fred Hoyle)
    Hoyle also claimed that the best-known Archaeopteryx fossils (the London and Berlin specimens) were faked, the feathers being stuck on with glue. He ignored the fact that other specimens with feathers are known, and showed considerable ignorance of the processes of lithification. However, it does seem unfair that he did not get a Nobel for his work on stellar chemistry, when two close colleagues did.

    Comment by Nick Gotts — 14 Sep 2008 @ 11:25 AM

  289. Martin: “…the climate system doesn’t know where the warming came from.”

    I’m not sure why you state that in reply to me, since I wasn’t disputing it; also not sure why you want to bring religion (God of the Gaps) into this.

    “The doubling equilibrium sensitivity is defined by convention not to include ice sheet albedo…”

    I understand that it does not include the long term ice sheet effects, but that does not eliminate albedo effects from seasonal snow at different latitudes with different temperatures.

    As far as the Annan paper that you reference, it is interesting that the LGM evidence he uses has the same or higher probability for climate sensitivity of 1C as for 4.5C!

    Comment by Steve Reynolds — 14 Sep 2008 @ 12:11 PM

  290. Let’s try and make Spencer Weart’s point again, using a different system and a different model.

    For familiarity’s sake, let’s pick a Boeing 747, which is actually a good deal less complicated than the global climate system. They are similar in that they both rely on a steady source of energy – fuel tanks for the Boeing, the Sun for the climate system.

    So, let’s say we build a complex computer model of a Boeing in flight, and we test that model using wind tunnels and model planes. We only work with the actual Boeing 747 – say a 747-8:

    http://www.boeing.com/commercial/747family/747-8_background.html

    Now, someone comes along and reduces the wing span by eight feet. Let someone come up with a simple, logical, one-page mathematical proof that the airplane will still be able to take off. Or, find the maximum wing-reduction length that will still allow the airplane to take off from a normal runway. Give me a simple, logical answer that doesn’t involve complex computer models.

    One other thing – you only can extrapolate from the 747-8. You can’t go build a new physical model – a wind tunnel system – nor can you build a 747 with shorter wings and try and get it to takeoff by remote control. That meets the climate system limitations – we don’t have an identical planet with no fossil fuel CO2 emissions to compare ours to.

    Obviously, it is ludicrous to suggest that anyone could answer that Boeing question without having access to state-of-the-art supercomputer models. There is no simple “lift-feedback analysis” that “proves the effect of shortening the wings would be minor and ignorable.” Anyone making such a claim would be met with incredulous hilarity.

    However, all the fossil fuel PR lackeys on this thread would have you believe otherwise. It’s also interesting how they try and convert scientific issues into personal ones – as if it is the “expert” that matters most, not the logical scientific argument.

    See for example, my comment #175, in which I pointed out that the basic conclusions about global warming arrived at in the 1970s still hold up today, with Dave Andrew’s response in #186, where he claims that “the expert” is “skeptical.” He doesn’t quote the expert, he quotes one William Sweet who is talking about the expert… by the way, William Sweet is mostly trying to promote nuclear energy in his book, as “the only climate solution.” See http://www.nytimes.com/2006/04/26/opinion/26sweet.html

    That’s really very illustrative, as it shows how PR flacks are unable to use clear scientific arguments to back up their claims, but instead must rely on innuendo, personal attacks and the like.

    Comment by Ike Solem — 14 Sep 2008 @ 12:59 PM

  291. BPL, then doesn’t that make H2O the high temperature culprit (cause) with Venus, not CO2 as is often said?

    Comment by Rod B — 14 Sep 2008 @ 3:31 PM

  292. Mark (280), my only point was that I thought the vagaries of paleoclimate measurements (by proxy) are sufficient to make conclusions about that period less than absolutely certain. I don’t fully comprehend your post; it seems to rebut something I didn’t say. Am I missing it?

    Comment by Rod B — 14 Sep 2008 @ 3:42 PM

  293. I believe that the climate sensitivity factor is non-linear by definition. To go back to an old cliche: a cinch by the inch, hard by the yard. Looking at Gavin’s “Learning from a simple model” post, the multiplication factor, under the heading Climate Sensitivity, is given as 0.25/(sT^3) (where s is the sigma in Stefan’s law), which looks suspiciously like the inverse of the derivative of F(top) = LsT^4 (where L is lambda, the emissivity). Therefore dT/dF = 1/(4sT^3).
    http://www.realclimate.org/index.php/archives/2007/04/learning-from-a-simple-model/

    The sensitivity, defined in this manner, isn’t linear but is inversely proportional to T to the third power. This example is simplified and feedbacks aren’t taken into account, but the basic definition should be the same unless I’m making some misinterpretation.
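
    Plugging in numbers (a quick sketch; the 255 K effective emission temperature and the ~3.7 W/m2 forcing for doubled CO2 are the usual textbook round figures):

        sigma = 5.67e-8   # Stefan-Boltzmann constant, W/(m2 K^4)
        T = 255.0         # effective emission temperature, K (textbook round figure)

        dT_dF = 1.0 / (4.0 * sigma * T**3)   # no-feedback sensitivity, K per (W/m2)
        print(dT_dF)          # about 0.27 K per W/m2
        print(dT_dF * 3.7)    # about 1 K for doubled CO2, before feedbacks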

    Comment by Lawrence Brown — 14 Sep 2008 @ 3:46 PM

  294. > http://beautifulplanetearth
    Bzzzt. Blogspamming?

    > Venus
    Rod, lots of ultraviolet or low gravity are both ways a planet can lose much of its hydrogen early on. H2O gets broken up by ultraviolet into hydrogen and oxygen; hydrogen escapes rapidly into space. Thus Venus (closer to the sun, slightly lower gravity) early on lost much of its hydrogen, precluding the long-term presence of much water. It’s in the links above.

    Comment by Hank Roberts — 14 Sep 2008 @ 6:34 PM

  295. One problem with using paleoclimate to derive figures for climate sensitivity is that this does not help us work out what will happen in the future. Assuming that the system will respond in the same way as when warming out of an ice age seems to me to be unjustified. The response of the Arctic – the probable (near) future loss of summer sea ice – pushes the climate system into new territory – one we have very few analogues for in paleoclimate.

    Being confident that a tightly defined “sensitivity” falls into a certain range is not the same thing as saying that’s how the climate will respond in the coming century. We really are out into the unknown unknowns – at least until the GCMs can accurately reflect current changes in the Arctic, and we can quantify potential carbon cycle feedbacks.

    Comment by Gareth — 14 Sep 2008 @ 6:35 PM

  296. Re BPL in 253. Lawrence McLean said, in effect, that a decrease in the diurnal temperature range would pretty much verify an increase in AGW. You said that the range has been decreasing, with no further comment.

    I have been trying to follow up on that and my reading indicates that the GCMs predict an increase in the DTR. Maybe I have missed something. I too had always believed that a decrease in the DTR would indicate the presence of AGW, or at least nullify the idea that the source of the warming could be solar. So which is it – is a decrease in DTR an indication of AGW, or not?

    Comment by Ron Taylor — 14 Sep 2008 @ 9:23 PM

  297. Mugwump’s search for a one-page derivation of climate sensitivity independent of the GCMs is interesting in its Quixotic fervor. He has a little knowledge, as demonstrated when he “showed us his. . . top undergraduate student in theoretical physics and pure mathematics at my university,” and he has read over 70 climate-related peer-reviewed papers. His questions are perceptive enough to get reasonable responses from Gavin and only a mild slap of the wrist when he gets a little too fervent.

    I thought it would be interesting to go to the source for the even handed estimation of climate sensitivity, James Annan. What should I find upon re-reading his 2006 blog entry but an exchange with our own mugwump that took place in the last few weeks.

    http://julesandjames.blogspot.com/2006/03/climate-sensitivity-is-3c.html

    The discussion seems to have terminated abruptly on September 6, after which Mugs took up his quest on Real Climate. Mugs refers to this exchange in #220. James was a little less than gentle. Perhaps that’s what Mugs was referring to above in #54 when he notes disdainful treatment from mainstream bloggers. Read the exchange and see if you think James was unduly harsh. Mugs says his last entry, which was deleted, clarified his position.

    On Sept 9 Mugs was directed to the literature on the LGM. Two hours later he was back with a diversion to a discussion of Douglass, the subject of his exchange with Annan. Two days later he was referred to Pierrehumbert’s online text, but we got a handwave about how hard it is to concentrate on a textbook written for undergrads. Mugs notes that he hasn’t had time yet to look into the LGM. Since then we’ve been treated to more handwaving.

    Until he can address the evidence of the LGM in some substantive way, perhaps he might desist from hectoring the grownups.

    Comment by Paul Middents — 14 Sep 2008 @ 9:59 PM

  298. No model, whether mathematical, graphical or conceptual, can possibly take into account every variable and synergistic scenario. Still, more often than not, the IPCC predictions in the median range have proven very accurate, with acceptable margins for error. Of course the 2007 report does take new data into account, and this is good science. We always want to be able to say where accurate predictions have been made and where there is room for improvement, and when empirical data is assessed we must adjust accordingly.

    The biological evidence for global warming, and not just regional fluctuations, is enormous at this time.

    Comment by Jacob Mack — 14 Sep 2008 @ 10:03 PM

  299. Nick Gotts #288,

    I wonder too if his strange love for nuclear power is not just a continuation of his disagreement with Sir Martin Ryle. Ryle presented source number counts for radio sources which suggested time evolution in radio source brightness or space density that did not fit well with the steady state theory. The argument appears to have gotten personal. Ryle argued against nuclear power on non-proliferation grounds. Hoyle seems to have taken the opposite position on nuclear power without any clear consideration of the practical difficulties involved in attempting to rely on it.

    Comment by Chris Dudley — 14 Sep 2008 @ 11:19 PM

  300. The difficulty in forecasting time-frames for tipping points is that every new report that comes to light shows that we have less and less time than we originally thought. Case in point: the WWF is now calling this year the worst on record for the Arctic in terms of melt area and the quality of ice remaining. I saw that coming months ago despite a relatively cool northern hemisphere this summer. The IPCC scientists had better factor in the worst-case scenario for the Arctic when it comes to predicting effects of CC, as that will be much closer to the mark I think.
    The North Atlantic current will begin slowing, if it hasn’t already, due to the lack of cooling ice and the warming of the ocean at the Arctic latitudes, throwing the entire world’s climate on its head. I cannot see a complete stopping of the current, because not enough transient fresh water is being released, but a definite slowing down I think is inevitable. How will a slower North Atlantic current affect the world’s climate? By 10%, 20%, 30%?

    Comment by Lawrence Coleman — 15 Sep 2008 @ 2:30 AM

  301. Ike at #290:

    As with most aircraft, the lift will be approximately proportional to the wing area. So the effect of the 8′ wingspan reduction is readily calculable, by hand. Your reduction is small, so it will certainly take off – but maybe not if the temperature is hot, the runway short, the load heavy, and one engine is down.

    The point is well made though. I find it fantastic that one can design and build an A380 from scratch, without prototyping, and have the damn thing fly first time, near perfectly. That while the first production models are nearing completion! Those who think that complex computer models are inherently unreliable should never step aboard one.

    Comment by GlenFergus — 15 Sep 2008 @ 2:54 AM

  302. Re #289:
    “As far as the Annan paper that you reference, it is interesting that the LGM evidence he uses has the same or higher probability for climate sensitivity of 1C as for 4.5C!”

    But what is the upper bound? The lower bound is more than 0; is the upper bound less than 5.5? If not, the uncertainties are more likely to take us to higher values than to lower ones, whatever the probability density at any single point.

    Also, the result we care about (how expensive the losses are under a given warming scenario) does not scale linearly but with a power higher than 1, so you should weight the probabilities by that relationship to see where the greatest expected cost lies.
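
    A toy sketch of why that matters (the skewed spread of sensitivities and the quadratic cost are invented purely for illustration):

        import random
        random.seed(0)

        # A right-skewed spread of possible sensitivities (lognormal, median near 3 C)
        # and a cost that grows as the square of the warming. Both are made up.
        samples = [random.lognormvariate(1.1, 0.4) for _ in range(100000)]

        def cost(dT):
            return dT ** 2

        mean_S = sum(samples) / len(samples)
        mean_cost = sum(cost(s) for s in samples) / len(samples)

        print(mean_S, cost(mean_S), mean_cost)
        # The expected cost comes out larger than the cost of the expected
        # sensitivity: the high tail dominates once losses are nonlinear.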

    Comment by Mark — 15 Sep 2008 @ 3:46 AM

  303. Chris Dudley writes:

    The case that global warming deniers are crackpots or, often more likely, dishonest stooges, is not supported by calling peer reviewed science, such as that produced by Hoyle, crackpot.

    You don’t think it’s crackpot to believe that interstellar dust grains are bacteria, or that flu is delivered to the Earth from space? Sorry, I call ‘em as I see ‘em.

    Comment by Barton Paul Levenson — 15 Sep 2008 @ 3:47 AM

  304. Rod B, 292, again you miss answering.

    Why do you maintain that the uncertainties make the system safer not worse?

    You say in 292 that you were merely saying the paleo data was uncertain, but that uncertainty gives an answer from 2 to 4.5. Yet you maintain that the system must lie only on the 2-or-lower side of this uncertainty.

    The errors are already taken into account, so saying “your models are no good because the input has huge errors” is irrelevant: the models are run with the inputs varied within those error bars.

    But answer the question asked: Why does this uncertainty only make the likely outcome of GW unimportant?

    Comment by Mark — 15 Sep 2008 @ 3:50 AM

  305. Rod writes:

    BPL, then doesn’t that make H2O the high temperature culprit (cause) with Venus, not CO2 as is often said?

    No, the large majority of the warming is still from CO2. Venus has about half Earth’s atmospheric water vapor. It has three million times as much carbon dioxide.

    Comment by Barton Paul Levenson — 15 Sep 2008 @ 3:50 AM

  306. Ron Taylor — yes, lowered diurnal temperature range is evidence for AGW.

    Comment by Barton Paul Levenson — 15 Sep 2008 @ 3:53 AM

  307. mugwump #287:

    RE Martin #283:

    The doubling equilibrium sensitivity is defined by convention not to include ice sheet albedo, for good reasons. It is indeed highly nonlinear to temperature.

    The paleo argument for doubling CO2 sensitivity is agnostic with respect to forcing source: you simply divide the temperature change by the sum of the forcings to get the expected temperature change per W/m2 of additional forcing. So in that sense it does include ice sheet albedo.

    Yes, precisely. You make separate attributions: part of the delta-T comes from ice sheet albedo, part from CO2 change (and part from Milankovich). You can do this because ice sheet area is (more or less) known.

    What I meant is that the albedo feedback caused by CO2 warming — part of the independently known ice sheet area change — is directly accounted for, and not attributed to CO2 when defining doubling sensitivity, as a matter of convention.

    Comment by Martin Vermeer — 15 Sep 2008 @ 5:17 AM

  308. Paul Middents #297:

    …and you don’t even bring up the interesting question why Mugs and friends are so preoccupied with the lower bound.

    As I see it, there are three interesting empirical values for doubling sensitivity that we’re trying to get a handle on: the most probable, the worst case, and the best case value. The first two are of obvious interest to responsible policy makers; the latter can only interest ostriches, not humans :-)

    (Actually, ostriches sticking their heads in the sand is urban legend.)

    Comment by Martin Vermeer — 15 Sep 2008 @ 5:38 AM

  309. Gavin, thank you for your reply. Is it your contention that the LGM is not a part of the “ice ages” referred to by Dr. Rind? He clearly speaks to the LGM in Rind 2008:

    Tropical climate sensitivity has been a big uncertainty from the paleoperspective ever since the CLIMAP (1981) production of the Last Glacial Maximum dataset. The Climate: Long-Range Investigation, Mapping, and Prediction’s (CLIMAP’s) significant finding was that the tropics were relatively invariant, with a cooling of some 1.5°C, compared with the high-latitude change of greater than 10°C (and in the vicinity of the continental ice sheets, much larger than that).

    Subsequent work by other modelers reached the same conclusion. Different ocean datasets (isotopes, alkenones, Mg/Ca ratios) now give somewhat different answers, but each has its own set of assumptions; no clear quantitative picture has emerged, there is just a general tendency to think that the land is somewhat less ambiguous, and so, the tropics probably had more significant cooling than was originally thought.

    I grant you in the abstract of Rind 1998, he states,

    It is estimated that low-latitude temperature gradients similar to today’s may have occurred in the Mesozoic and in the Little Ice Age; reduced gradients were more likely in the Pliocene, Eocene, Younger Dryas, and Last Glacial Maximum.

    However, the complete paper is not available – for free – so I am unsure how he arrived at this conclusion or, to be honest, exactly what it means. As to Rind and Peteet (1985), what I take from the 1985 paper –

    The LGM model experiment, with specified sea-surface temperatures, was not in radiation balance; if the sea-surface temperatures had been allowed to change, they would have cooled by an additional 1.2C (the cooling effect of water vapor decrease and cloud cover change was mitigated by fixing the sea-surface temperatures; Hansen et al., 1984). This cannot be considered proof of a warm bias in the CLIMAP temperatures because of the uncertainty involved in modeling cloud cover; however, if the CO2 had been reduced to 240 ppm, as suggested by ice-core data (Shackleton et al., 1983), and the vegetation changed, the resulting climate (without specified sea-surface temperatures) would have been 5-6C colder than at present. There is a substantial difference between the climate sensitivity necessary to provide a 5-6C cooling and the 3.6C change which results from using the CLIMAP sea-surface temperatures. As the questions of climate sensitivity and of the relative sensitivity of low and high latitudes are central to the estimation of the rapidity and size of future climate changes, the topic deserves especially close scrutiny.

    is that, over 20 years later Dr. Rind is asking the same questions.

    Comment by Ellis — 15 Sep 2008 @ 6:14 AM

  310. RE #297 Paul Middents:

    Mugs notes that he hasn’t had time yet to look into the LGM. Since then we’ve been treated to more handwaving.

    Rubbish. I’ve been looking into the LGM for the past two days. My main question is still outstanding: why is it believed that climate sensitivity to 2XCO2 is linear? Note: this is just a question. I am not saying that it should or should not be linear, but at face value linearity needs to be justified (as Hansen himself notes – see the first quote in #287).

    And if by “handwaving” you are referring to my snowball earth example, I knew at the time it was a risk with this crowd to include a hypothetical situation by way of illustration only. Such rhetorical devices help to clarify the point for those who are genuinely interested in the problem, but can easily be exploited for ridicule by those who are not.

    Until he can address in some substantive way, the evidence of the LGM, perhaps he might desist in his hectoring the grownups.

    All I am doing is reading the original papers and asking questions. That’s the way science works. Religion of course works differently: it requires unquestioning faith from its believers, who usually feel very threatened when outsiders start asking difficult questions.

    Comment by mugwump — 15 Sep 2008 @ 6:18 AM

  311. RE #297 Paul Middents:

    I thought it would be interesting to go to the source for the even handed estimation of climate sensitivity, James Annan. What should I find upon re-reading his 2006 blog entry but an exchange with our own mugwump that took place in the last few weeks.

    Mugs says his last entry, which was deleted, clarified his position.

    It certainly did. Which is why I don’t understand why it was deleted. Since you have linked to the original exchange, judge for yourself whether the last post was worthy of deletion:

    “But a relaxation time scale of a few months requires a thin mixed layer because a thick one takes longer to warm up or cool down.”

    Only if you suppose a large fraction of the flux change goes into the ocean rather than the atmosphere. If the flux to the ocean is small – < 10% of the peak flux according to DK – it doesn’t matter if it takes a long time for the ocean to cool down because more than 90% of the temperature response is in the atmosphere and that has a short time constant of a few months. You just won’t be able to see the ocean response in the data (it will be swamped by the atmosphere response + noise).

    OTOH, if the ocean flux is significant (more than half according to Wigley), then yes, you will see that in the data. But as DK point out, a long lag is just not there (or it is small enough to be swamped by the noise), so the data are telling you that either the ocean flux is small, or that the ocean flux is large but the mixing time is short (of the same order as the atmospheric mixing time). DK argue the former and provide a model supporting their number. You are arguing the latter: that such large, rapid ocean mixing is unphysical and therefore invalidates their whole approach.

    But DK’s point is 1) the ocean flux is actually small so any effect will not be measurable in the data, and 2) the data do not support a large lag; whence their paragraph [9]:

    “[9] WAST refer to ‘‘a more realistic model’’ [Wigley et al., 2005b] that yields even larger response times (values greater than 15 months). The data show smaller response times and should invalidate the applicability of that model.”

    Comment by mugwump — 15 Sep 2008 @ 6:30 AM

  312. RE #296 Ron Taylor:

    You are correct. The basic (no-feedback) sensitivity is nonlinear in T, but over the range we are talking about (a few degrees K), it is close to linear.

    However, the feedbacks are a function of the climate state, which is a highly nonlinear function of T (only a few degrees separate us from the last ice age – a very different climate), hence the contribution to climate sensitivity from the feedback processes may be very different today than it was in the LGM.
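
    For the no-feedback part, the near-linearity is easy to check numerically with the dT/dF = 1/(4sT^3) expression mentioned in #293 (a quick sketch, nothing more):

        sigma = 5.67e-8   # Stefan-Boltzmann constant, W/(m2 K^4)
        for T in (252.0, 255.0, 258.0):            # a few-degree spread around the usual 255 K
            print(T, 1.0 / (4.0 * sigma * T**3))   # changes by only ~3-4% across this range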

    Comment by mugwump — 15 Sep 2008 @ 6:58 AM

    Re #296 and #253, Ron Taylor and Barton. Thanks, Barton, for your reply; however, your response did not precisely address what I was talking about.

    I suspect there is no data dealing with the issue that I raised. My question is: for a given ambient night-time temperature, what has been the trend in the lowest achievable temperatures due to the night sky radiation effect? That is not the same subject as the diurnal temperature range. Thanks…

    Comment by Lawrence McLean — 15 Sep 2008 @ 7:26 AM

  314. I just saw an interesting documentary on climate Sceptics. Michael Mann is a contributor:

    http://www.bbc.co.uk/iplayer/episode/b00dm7d5/b00dm7bf/

    (I’m not sure if the link works in all regions).

    Comment by Auste Redman — 15 Sep 2008 @ 7:34 AM

  315. Read this from Science Daily, digested from a report from an Oregon university, re: the correlation between CO2 and temp over the past 70K years. Briefly, there is an unequivocal correlation between cyclical fluctuations in CO2 concentrations in the atmosphere and a mirroring of temp in the northern and southern hemispheres. The level of CO2 now is twice that in the last ice age, which triggered a 15C change in temp. Another point mentioned, which partially answers my question on what effect changes to the North Atlantic current have on world climate, is that the main driver of global CO2 and temp was the relative speed of the N.A. current. As the current slows, Greenland and northern Europe get colder while the southern hemisphere and Antarctica get appreciably hotter.
    So the most likely outcome of the rapid Arctic melt should be a rapid slowing down of the N.A. current, with a resultant cooling of the temperate regions of the northern hemi while the southern hemi cooks. As alluded to in the article, the historic climatic changes took effect within 20 years, and that was with CO2 at half the current levels. Makes you think!!

    Comment by Lawrence Coleman — 15 Sep 2008 @ 8:20 AM

  316. the website for that report @ 315 is http://www.sciencedaily.com/releases/2008/09/080911150048.htm

    Comment by Lawrence Coleman — 15 Sep 2008 @ 8:24 AM

  317. Lawrence, #313.

    Uh, why is there a need for the difference? What if, because of GHG concentrations, there’s more dew-point moisture and therefore thicker water haze, causing more overcast skies even where there are no clouds per se?

    Your question would miss that.

    If it doesn’t happen, then what’s the difference between what you want and what actually happens as shown by measurements? As academic fluffery?

    Comment by Mark — 15 Sep 2008 @ 8:31 AM

  318. Mugwump, #312, but a non-linear change like you admit means that the possibility of a 4.5 degree change is more “effective” than the possibility of a 1 degree change.

    And a 1 degree change added to the current 1 degree over the long-term average before industrialisation makes 2 degrees, which was enough to turn interglacial into glacial. What will such a change do to an interglacial? Even at 1 degree.

    Comment by Mark — 15 Sep 2008 @ 8:35 AM

  319. Dan Hughes,

    See “An Introduction to Three-Dimensional Climate Modeling”. The answer you’re looking for is in equation 3.31 (page 56) and repeated for emphasis as equation 3.53 (page 62). As you state/guess, option (2) is used. The text describes reasonably carefully how this is constructed. There’s then a discussion of how this is transformed into a sigma coordinate (generally given a SIG moniker in modelE) on pages 63 through 68. I’d recommend the NCAR CCM as being more readable and better documented; however, it’s not too difficult to pick your way through the modelE stuff.

    If you’re interested in conditions describing where vertical hydrostatic balance is valid and invalid, this text refers me to J.R. Holton, “An Introduction to Dynamic Meteorology”, which it describes as having a discussion on this issue (which I’ve not read).

    Comment by Patrick Caldon — 15 Sep 2008 @ 9:03 AM

  320. Ron Taylor writes
    > my reading indicates that the GCMs predict

    Mugwump tells Ron
    > You are correct.

    Beware who you rely on for opinions.
    Mugwump’s got strong opinions but without citation, it’s not science.

    What reading, in each case, supports these statements?

    Comment by Hank Roberts — 15 Sep 2008 @ 10:14 AM

  321. Barton Paul Levenson #303,

    I think that the idea of viruses arriving from space has certainly received serious consideration. Humans returning from space have been isolated in the past to protect against possible infection. Viruses can survive in space very well. And the environment Hoyle and Wickramasinghe considered, a star-forming cloud, certainly seems to be wet enough to support life. So the idea has some firm foundations.

    I feel that it falls down on Occam’s razor: Given the extreme rapidity of viral evolution here on Earth and the efficiency of certain vectors such as birds, we don’t need a different global mechanism to spread viruses. That does not mean that the idea did not have sound components. Ideas can be wrong without being crackpot.

    I certainly recall the 3.4 um absorption profile published by Hoyle and Wickramasinghe reproduced in a review article not too long ago: Fig 9 here: http://ads.ari.uni-heidelberg.de/full/1994ApJ…437..683P That certainly would not happen if the idea were not taken seriously in an historical sense by the community. Presently, most people in the field associate the 3.4 um feature with the diffuse interstellar medium rather than with molecular clouds so that the mechanism proposed by Hoyle and Wickramasinghe to produce the feature would not seem to be available for that dust. I worry that ice mantles may mask the presence of the material responsible for the 3.4 um feature in molecular clouds so that associating the material only with the diffuse medium may not be correct. But the fact that UV irradiated ice residues seem to reproduce the main characteristics of the feature and such conditions pretty much have to arise at cloud edges makes me think that a better explanation for the feature is available.

    An umpire standing in another ballpark can call them as he sees them, but the calls won’t mean much since the balls and strikes would not really have been seen. I suggest that you want to look closely at the ideas and how they were presented and received at the time before passing judgement.

    Comment by Chris Dudley — 15 Sep 2008 @ 10:48 AM

  322. RE #311
    Mugs, commenting on a two-year-old blog post, can’t understand why James Annan deleted his comment. Perhaps it was James’ subtle way of directing him to the caveat at the bottom of his “Empty Blog”.

    “COMMENTS ARE WELCOME, BUT I DON’T EXPECT (OR WISH) SUBSTANTIAL DEBATE HERE. PLEASE GO TO GLOBALCHANGE WHERE ALL CAN CONTRIBUTE.”

    When we go to GLOBALCHANGE we find a one month old thread, “Estimating equilibrium climate sensitivity”.

    This thread was conducted on a scientific level by professionals. Mugs might try out his theories and questions on that thread. I would advise checking the attitude and political bias at the door.

    I think advancing science might involve just a little more than reading the original papers and asking questions. Maybe more than two days might be required to absorb the full impact of LGM studies on our overall understanding of the climate. Concentration in a relevant field, gathering data, conducting original analysis and peer reviewed publication just might be essential for the advancement of science.

    Comment by Paul Middents — 15 Sep 2008 @ 10:59 AM

  323. So why not say you are now not sure of the greenhouse effect?

    If you cannot produce hard numbers then you’re still in the pre-experimental phase. Not to say there is anything wrong with proposing this hypothesis, just that if you can’t make predictions you can’t lay claim to scientific truth as regards the cause of the recent warming and even more recent cooling.

    Comment by Mick — 15 Sep 2008 @ 11:27 AM

  324. re 290 Ike Solem – good point, bad example. The complexities of a 747 aren’t in the aerodynamics but in the control systems. Lift varies directly with wing area; with the same chord, wing section, takeoff weight, thrust, and air density (which varies a lot with temperature, altitude, and humidity), but eight feet less span, a 747 would require about 1.02 times the speed and 1.04 times the runway. Given that FAA braking-distance regs for aborted takeoffs result in liftoff at about 80% of runway length, you could saw off about 20 feet of each wing and just barely (but not legally) lift off, as long as that clipping didn’t affect critical control systems (flaps, ailerons, hydraulics, sensors, fuel, etc). A much smaller percentage change in the wing section could have a critical effect on the aerodynamics (duct-tape a 1×4 along the length of the wing just aft of the slat on the upper surface, and it would never get off the ground), and it would require extensive modelling to predict the behavior of seemingly subtle changes in camber, thickness, or form.

    see http://adg.stanford.edu/aa241/performance/takeoff.html, and http://www.mh-aerotools.de/airfoils/javafoil.htm
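
    The scaling above is just: lift proportional to wing area (at fixed chord), takeoff speed proportional to 1/sqrt(area), runway roughly proportional to speed squared. A quick sketch, with the ~212 ft span being an assumed round figure:

        import math

        span_ft = 212.0    # assumed 747-class wingspan; the exact figure barely matters here
        clipped_ft = 8.0   # the wingspan reduction proposed in #290

        area_ratio = (span_ft - clipped_ft) / span_ft   # lift scales with area at fixed chord
        speed_ratio = 1.0 / math.sqrt(area_ratio)       # keep V^2 * area constant for the same lift
        runway_ratio = speed_ratio ** 2                 # takeoff distance goes roughly as V^2

        print(round(speed_ratio, 2), round(runway_ratio, 2))   # about 1.02 and 1.04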

    Comment by Brian Dodge — 15 Sep 2008 @ 2:35 PM

  325. RE #322:

    Mugs, commenting on a two year old blog, can’t understand why James Annan deleted his comment. Perhaps it was James’ subtle way of directing him to the caveat at the bottom of his “Empty Blog”.

    Pauly pops apparently can’t understand why deliberately diminutive dismissals of commenters’ noms de plume are offensive. Nor can he understand why it is questionable for someone to delete a non-offensive remark that illustrates a significant error of interpretation at the end of a long technical discussion.

    Regardless, I am not interested in rehashing this. No more comments from me on the subject (unless you have a technical point to raise against my interpretation of the DK Pinatubo results).

    Comment by mugwump — 15 Sep 2008 @ 3:06 PM

  326. Mark #318:

    Mugwump, #312, but a non-linear change like you admit means that the possibility of a 4.5 degree change is more “effective” than the possibility of a 1 degree change.

    Not necessarily. It just means the climate response to an increased forcing today may not be the same as during the LGM, even after factoring out ice albedo effects.

    Comment by mugwump — 15 Sep 2008 @ 3:14 PM

  327. Re:320 from Hank

    “Ron Taylor writes
    > my reading indicates that the GCMs predict

    Mugwump tells Ron
    > You are correct.”

    Ron is referring to the diurnal temperature range, while mugwump is referring to the non-linearity of sensitivity. There seems to be a disconnect in the dialogue.

    Way back in an earlier post mugwump makes reference to Shakespeare in saying that proponents of AGW protest too much about contrarian opinions. I think the frustration and (sometimes) exasperation is more akin to Banquo’s ghost. Some of the same points of denial resurface more than once even after they’ve been vanquished.

    Comment by Lawrence Brown — 15 Sep 2008 @ 3:25 PM

  328. RE #320 referring to #312 referring to #296:

    What reading, in each case, supports these statements?

    In my case Hank, try Principia Mathematica

    [the bit about fluxions]

    Comment by mugwump — 15 Sep 2008 @ 3:31 PM

  329. Mick, ever wonder how a CO2 laser works?
    Read anything under the Start Here or Science links?

    Lasers use the same radiation physics basis — how CO2 behaves — as climate models, which address a more complicated physical system.

    Don’t be fooled. You’re reposting someone’s talking point off some PR site. You can get better information.

    Comment by Hank Roberts — 15 Sep 2008 @ 4:07 PM

  330. Sorry to go completely off-topic here but I have a question of my own that I would love to get an answer for… (Hopefully a lot simpler!)

    Regarding the Greenland ice sheet and the estimates that it contains enough ice to raise sea levels by nearly 7 metres, I need to find out how the researchers came to that figure. The best I can find is that it was put forward by Church et al. (2001), but I can only access the abstract of that report, which isn’t much help.

    Basically I am engaged in one of those pointless internet forum debates where some genius believes that it is a fraudulent figure, but I lack the understanding of physics and arithmetic to properly put him down.

    His argument goes something like this:

    “Your quoted figure is 2.85 m cubic Km for total ice associated with Greenland. I do not dispute that although it is only an estimation. It may well be more.

    The sea surface is approx 350 m sq km – This is a fact that I have not seen disputed and although I have seen figures up to 361 sq km I am working on approx figures only as that is all that is needed.

    Now the arithmetic.

    A 1 (one) metre rise in sea level requires 350 m cubic kilometres of water (or melted ice) That is 350 m sq by 1 metre deep.

    The Greenland ice you have quoted is only short by 347.15 m cubic kilometres.

    Please explain why this arithmetic is wrong.”

    If anyone can give me a quick rundown on why his assumptions are incorrect it would be a HUGE help. I can’t let a denialist win an argument, not on the internet!

    :)

    [Response: Area of the ocean is 3.5 x 10^14 m2, amount of ice 2.85 m cubic km = 2.85 x 10^6 km3 = 2.85 x 10^15 m3, therefore SLR= 2.85*10/3.5 = 8.1 m by these figures. Your interlocutor got the 1 km = 10^3 m conversion wrong. - gavin]
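
    For anyone who wants to check the unit conversion step by step, a quick Python version of the same arithmetic (same figures as in the response above):

        ice_volume_km3 = 2.85e6   # Greenland ice, about 2.85 million cubic km
        ocean_area_m2 = 3.5e14    # about 350 million square km = 3.5e14 m2

        ice_volume_m3 = ice_volume_km3 * (1.0e3) ** 3   # 1 km = 10^3 m, so 1 km3 = 10^9 m3
        sea_level_rise_m = ice_volume_m3 / ocean_area_m2

        print(sea_level_rise_m)   # about 8.1 m, before any correction for ice vs water density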

    Comment by Connor — 15 Sep 2008 @ 4:41 PM

  331. #3 – Head North. SW USA will fry and dry. Florida and the Gulf Coast will drown. Alaska will do grand, especially since they will profit from selling the rest of the country the oil which will improve Alaska’s climate! There will be a period of transition where humans will actively replace the current (dying) ecosystem, which means cheap wood in Alaska. North Dakota and other northern tier states have plenty of wind for use in a post-fossil world. Nebraska sits on the mother-lode of water. You’ve got plenty of options; just avoid the traditional retirement states like the plague, and stay 100 metres above sea level.

    #4 – not sure I agree. So very much of the USA’s wealth is tied to locations close to sea level. I’ve always (OK, for the last 15 years) said that by 2020 the Arctic ocean would be ice-free in summer, and sea level will then destabilize. Once saltwater gets past Greenland’s sill as the glaciers retreat, tides will ravage the entire ice sheet. (Greenland’s glaciers flow up to the sea – at least at their bottom. They start perhaps 800ft below sea level!) Four metres per century (and accelerating) is reasonable.

    #24 –Asking for a huge delay between predictions and policy changes is simply a call for massive atmospheric change. You guarantee a single outcome, and one which has a tiny probability of a success. Truly a self-serving argument. Why do you ask whether outdated models worked? (They didn’t) Since the data is still there, it’s better to ask how well the current versions “predict” past climate when fed past data. The data stream should be as long as possible and the techniques should be brand new. Besides, all the models underestimate the the results of GHG injection into the atmosphere. I’d like to see a plot of the models’ expectations over time, as in 1980′s predictions using real data through 2008, then 1990′s etc etc. Noting the trajectory of the evolution of predictions probably gives a good feel for the real future. As the Arctic darkens and methane skyrockets, the models will have a lot of catching up to reality to do. The models are still tremendously flawed. They are far too conservative, as the effects of feedbacks can’t be realistically modelled until after the fact. As each “Oops” happens, the models will improve, but by then it is too late! Heck, the IPCC predicted sea level rise based on ZERO, yes ZERO ice melt. They didn’t know how severe it would be, so they guessed ZERO! That’s insanity driven by the naturally conservative nature of science as steered by the politics of the right. Unfortunately, the mindset which served science so well when it was a “It doesn’t matter except to academics” field breaks down when it is thrust into a political and highly relevant to everyone arena. Scientists are used to cinderblock rooms and tend to screw up when thrust in spotlights. Unfortunately, the reality is that models are irrelevant from a practical standpoint. Arctic ice is giving us the real-world experimental data far faster than any computer could calculate it. As the permafrost gets up to speed, the answer will be clear: “Oops, we’re screwed.” After that is obvious to everyone, then the models will be able to tell us what we will already know.

    #43 – The cost of mitigating GHGs is negative. Alternative reality: The USA increased CAFE to 60MPG, including light trucks back in 1980. Oil remained a glutted commodity and prices stayed around $10 a barrel. Rules for mor_gages for cars and houses were modified to include fuel costs, so buyers would qualify for the *slightly* larger mor_gage (Stupid SP_M filter) a more efficient system would require. This reduced both energy usage and the buyer’s monthly total cost of ownership, while driving down energy prices and preventing the massive problems caused by the “oil curse” in the Mideast and other regions. Again, a NEGATIVE cost. The easiest way to increase wealth is to drive down energy prices by increasing efficiency. All it would take is a few small changes to start the process. The mor_gage change I just posited would be a fine start. Saving energy is about REDUCING costs.

    Comment by RichardC — 15 Sep 2008 @ 4:57 PM

  332. How can I convince my friend that global warming is REAL and not just a rise in temperature?

    Comment by Sanafaye — 15 Sep 2008 @ 5:28 PM

  333. mugwump needs credit for dealing with his/her skepticism in the right way – reading the papers and getting answers to the questions. It seems to me that the ~2 value for climate sensitivity comes primarily from GCMs, fits the observation records, fits the paleo-record, but till we do the experiment (doubling CO2, which is proceeding quite well), you can’t have certainty. This is science after all. Hope it’s lower and pray it isn’t higher. However, there are sufficient lines of evidence pointing at 2 for this to be used as a basis for policy till better data or models come along. It’s wildly irresponsible to assume that it is lower.

    Comment by Phil Scadden — 15 Sep 2008 @ 6:20 PM

    Been away, dealing with a niece who wouldn’t evacuate from the Houston-Galveston area (her sister finally got her to leave)… like so many contrarians who say, well, what if there’s no GW….

    Here’s something about a (perhaps) underestimation of the aerosol effect: “Earth already committed to 2.4-degree C rise from climate change [by 2030]”, a news story based on Ramanathan & Feng’s study in PNAS Online. See: http://www.climateark.org/shared/reader/welcome.aspx?linkid=106630

    Comment by Lynn Vincentnathan — 15 Sep 2008 @ 7:31 PM

  335. RE #333:

    However, there are sufficient lines of evidence pointing at 2 for this to be used as a basis for policy till better data or model come along.

    We’re probably fine if it is 2. We’ve had 1/3 of that already. My assessment, FWIW (which may not be much), is: 2 or less we can probably ignore; 4 or more (on a short time scale of the next century or so) we’d better do something about. In between those two bounds it is unclear which direction policy should go.

    I don’t think we should worry about effects more than 150 years hence, unless we are really really certain they are going to be very bad. By then technology will be insanely more advanced than it is now, and we’ll probably be able to just suck the CO2 out of the atmosphere if we need to. As justification, compare the technology of today with that of 150 years ago, and then consider that technological development is still on an exponential curve and is likely to be so for at least another few hundred years.

    Comment by mugwump — 15 Sep 2008 @ 7:53 PM

  336. Phil, what’s the source for your statement?

    > it seems to me that the ~2 value for climate
    > sensitivity comes primarily from GCMs

    I think you likely have a basis for this, from seeing other things you’ve written, but I don’t know where.

    Comment by Hank Roberts — 15 Sep 2008 @ 8:16 PM

  337. Mark: “But what is the upper bound? The lower bound is more than 0, is the upper bound less than 5.5?”

    Did you read Annan’s paper? The probability from LGM data is higher at 0 than at 5.5.

    The overall conclusion from all the data puts the upper bound at 4.5C.

    Comment by Steve Reynolds — 15 Sep 2008 @ 9:03 PM

  338. Mark, you have me confused with someone else, or we keep passing in the night. I never said anything about the “…uncertainties make the system safer not worse…” All I asserted was that the uncertainties of the data make the conclusions less certain.

    Comment by Rod B — 15 Sep 2008 @ 10:31 PM

  339. BPL, wait a minute. Water vapor made Venus so hot that CO2 could not be sequestered by the normal weathering process (which also requires H2O) and so then CO2 built up, but H2O is not the primary culprit??

    Comment by Rod B — 15 Sep 2008 @ 10:37 PM

    Rod, seriously, read some of the Venus papers; Google Scholar would help a lot at this point. You need to follow the description through from the very early planetary atmospheres for, say, Mars, Venus, and Earth. All three were rich in hydrogen; Mars and Venus lost most of theirs over a long stretch of time, for rather different reasons, but in both cases they couldn’t form and hang on to a whole lot of water as Earth did.

    This happened for different reasons — location, location, location, as they say in planetary real estate — there’s a habitable zone around a sun — not too hot, not too cold, more or less survivable.

    Comment by Hank Roberts — 15 Sep 2008 @ 11:03 PM

  341. Brian Dodge, have you run your idea past Boeing? You claim that you can do a back-of-the-envelope calculation and come up with an accurate answer on that?

    http://www.financialexpress.com/news/Boeing-to-test-Tata-supercomputer-in-India/298008/

    Supercomputers are integral to modern airplane design – it couldn’t be done without them. The same goes for climate predictions.

    In what Boeing describes as the world’s biggest and most sophisticated industrialized project ever, the 787 program has taken more than a million hours of supercomputer design work involving hundreds of aerospace workers from every corner of the globe.

    http://www.msnbc.msn.com/id/19421415/

    More to the point: grid size is also important, as in climate models:
    aero-comlab.stanford.edu/fatica/papers/jameson_fatica_hpc.pdf

    In external aerodynamics most of the flows to be simulated are steady, at least at the macroscopic scale. Computational costs vary drastically with the choice of mathematical model. Studies of the dependency of the result on mesh refinement, performed by this author and others, have demonstrated that inviscid transonic potential flow or Euler solutions for an airfoil can be accurately calculated on a mesh with 160 cells around the section, and 32 cells normal to the section.

    It is a very similar situation to the climate model – we know that shortening the wing will reduce lift, and we know that adding infrared-absorbing gases to the atmosphere will raise the surface temperature, but for quantitative estimates, supercomputer models are an absolute necessity in both cases.

    Comment by Ike Solem — 16 Sep 2008 @ 12:40 AM

  342. #337 – The bounds of CO2 direct forcing are irrelevant. Doubling CO2 can’t be done in a vacuum, with no CH4 change and no ice sheet (albedo) change. The whole debate seems similar to a focus on the explosive power of a fuse, while ignoring that a nuclear bomb is connected to the fuse. It is senseless and immaterial at best. Yep, CO2 is so wimpy that doubling its concentration in the atmosphere would lead to minor changes in global temperatures. However, ignoring the huge changes in temperature which would be unleashed by those minor changes is not rational. Excluding feedbacks from the equation is ludicrous and irresponsible. Scientists might understand that the caveats exist, but POLICY is being made based on the statements. Scientists need to learn to speak HUMAN. Insisting on retaining the arcane speech patterns of traditional science is a recipe for disaster. “Oh yes, if doubling CO2 results in a ten-fold increase in CH4, then ……..” is something that needs to be brought up first, not excluded. The public expects that all variables will be included in the initial statement, so even though it tears holes in your scientific heart, you must include all variables. Welcome to the spotlight.

    Comment by RichardC — 16 Sep 2008 @ 12:50 AM

  343. “By then technology will be insanely more advanced than it is now, and we’ll probably be able to just suck the CO2 out of the atmosphere if we need to.” – mugwump.

    Yes – and we’ll all live twice as long as we do now, and aliens will have landed and given us even more technology, and we’ll be flying to the stars on fusion reactors… nice pipe dream.

    Technological wish-fulfillment is not always possible… for example, in the 1950s many leading scientists believed that accurate weather control was just around the corner – they actually had an entire military program devoted to the concept, courtesy of John Von Neumann.

    “As a committed opponent of Communism and a key member of the WWII-era national security establishment, von Neumann hoped that weather modeling might lead to weather control, which might be used as a weapon of war. Soviet harvests, for example, might be ruined by a US-induced drought.”

    http://swiki.cs.colorado.edu:3232/phd-intro-2007/uploads/34/Grcar_talk_03.4.pdf

    Ed Lorenz showed that long-term weather forecasts were impossible, putting an end to that concept.

    That’s why appeals to futuristic technology are not to be trusted.

    You might just say what I think you really mean: “I don’t care what kind of world tomorrow’s children inherit, because I’ll be dead by then, and I’m certainly not going to make any sacrifices on their behalf.”

    Comment by Ike Solem — 16 Sep 2008 @ 1:04 AM

  344. Rod 338, you do however say “what if it’s less than 2?” and similar.

    Comment by Mark — 16 Sep 2008 @ 3:26 AM

  345. Steve 337, I really don’t think that the probability of the CO2 sensitivity being 0 is anything other than 0.0 recurring.

    Comment by Mark — 16 Sep 2008 @ 3:29 AM

  346. Chris Dudley writes:

    I suggest that you want to look closely at the ideas and how they were presented and received at the time before passing judgement.

    I suggest that I already have and I stand by what I said. No matter how you argue for them, believing that the flu comes from space and that interstellar dust grains are bacteria are crackpot ideas. Period.

    Has it occurred to you that the number of possible biochemicals is astronomical? How is it that all that flu forming in space just happens to have DNA or RNA for its genetic material, allowing it to infect human beings? And how does it survive raw ultraviolet radiation from the sun?

    And are you aware that space is filled with ionizing radiation? There sure are complex chemicals in space, especially in nebulae, but if dust grains are bacteria, what the hell do they eat? The interstellar medium in a nebula isn’t much more than a hard vacuum, you know. In real life, if you look at a bacterial colony, about 25% of the bacteria are reproducing by fission at any given time, because their life spans are very, very short. Doesn’t happen to interstellar dust grains.

    Your defense of Hoyle argues an irrational and emotional a priori attachment to his ideas that cannot be swayed by evidence or logic. So, if you don’t mind, I’m going to stop discussing this asinine controversy. You can go on if you want, of course.

    Comment by Barton Paul Levenson — 16 Sep 2008 @ 3:52 AM

  347. Rod B writes:

    BPL, wait a minute. Water vapor made Venus so hot that CO2 could not be sequestered by the normal weathering process (which also requires H2O) and so then CO2 built up, but H2O is not the primary culprit??

    When Venus had 300 bars of steam in its atmosphere water vapor was the major greenhouse gas there. Now that it has 89 bars of carbon dioxide and little else, CO2 is.

    Comment by Barton Paul Levenson — 16 Sep 2008 @ 3:58 AM

  348. Phil Scadden, A value of 2 for climate sensitivity is a lower limit. The most probable value remains 3. However, you need to consider the asymmetry of the probability distribution for sensitivity values. 2 and 4.5 constitute lower and upper bounds at the 90% CL. The question then becomes whether we want to bet the future of civilization on a 90% CL. If not, the asymmetry becomes a serious issue.
    Moreover, I would contend that Mugwump’s sanguinity about a warming of even 2 degrees may be misplaced. We don’t know when significant emissions of CO2 and CH4 from natural sources will kick in. True skeptics need to understand that uncertainty is not their friend here. If we cannot limit sensitivity, if we cannot better understand the onsets of natural tipping points, and if we cannot trust the models to guide our efforts, the only responsible course of action is to impose stringent limits even at the cost of economic health. The models and the climate science community are your best friends if you seek rational policies and healthy economies.

    Oracle of ReCAPTCHA: 162nd prayer

    Comment by Ray Ladbury — 16 Sep 2008 @ 7:31 AM

  349. RE #343 Ike:

    Technological wish-fulfillment is not always possible… for example, in the 1950s many leading scientists believed that accurate weather control was just around the corner – they actually had an entire military program devoted to the concept, courtesy of John Von Neumann.

    That’s a great deal more far-fetched than large-scale CO2 sequestration.

    You might just say what I think you really mean: “I don’t care what kind of world tomorrow’s children inherit, because I’ll be dead by then, and I’m certainly not going to make any sacrifices on their behalf.”

    I have young children. You think I don’t already make huge sacrifices for them? As I tell my wife, we are now genetically irrelevant. Our only biological purpose is to transfer our remaining life force to our children.

    Their living standard is vastly greater than mine was as a child. Mine was vastly greater than my parents’, which was vastly greater than my grandparents’. This is due in large part to virtually unfettered growth built on very cheap energy for the past 150 years.

    I know what is better for my descendants, and it is not halting growth on the basis of uncertain climate sensitivity projections.

    Comment by mugwump — 16 Sep 2008 @ 8:19 AM

  350. Mugwump says: “I know what is better for my descendants, and it is not halting growth on the basis of uncertain climate sensitivity projections.”

    Look, I don’t see much point in descending into a debate over who cares about their children more. However, I think you have to consider the implications of the uncertainty. You seem to be assuming that the sensitivity will shake out at 2 degrees per doubling rather than 3 or 4.5. This is not an evidence-based decision. Indeed, it’s far more likely that the sensitivity could be greater than 5.5 than below 1.5. Moreover, we know that at some point, natural sources of ghgs will swamp our own emissions. We don’t know how close we are to that tipping point.
    Uncertainty is not your friend and it is definitely not the friend of your progeny. We are at a turning point wrt energy. All climate concerns do is tell us that we should look to other sources than coal to meet future needs–and petroleum is becoming too precious to burn.

    Comment by Ray Ladbury — 16 Sep 2008 @ 9:02 AM

  351. GlenFergus (301), are you stating that the A380 was built and flown without any wind tunnel and similar tests on the equivalent shape and weight? I find that hard to believe, but don’t really know.

    Et al, since the supercomputer-run aerodynamic models have been thoroughly vetted and tested over and over again with millions of hours of physical lab type processes, which climate models haven’t and can’t, how can aerodynamic models be used as evidence per se of the validity of GCMs? Ala the argument that tells us if we don’t accept GCMs hook, line, and sinker, we’d better not ever board a plane.

    Comment by Rod B — 16 Sep 2008 @ 9:16 AM

  352. Ray #350:

    You seem to be assuming that the sensitivity will shake out at 2 degrees per doubling rather than 3 or 4.5.

    That’s my guess, but not an assumption. All I am saying is we should get better bounds on the sensitivity before making drastic changes to our economy. If sensitivity really is 4.5C and fast mixing (so we’ll be there by 2100), we should probably be doing a heck of a lot more than making mild cuts to CO2 in the west.

    But let’s be realistic. China and India are looking at catapulting their children from abject poverty to middle-class in a single generation. It’s very unlikely they’d be willing to jeopardize that for even a 4.5C rise, since the benefit to them far outweighs the cost.

    Indeed, it’s far more likely that the sensitivity could be greater than 5.5 than below 1.5.

    I doubt that.

    Eventually you do run into lower bounds that are hard – eg no-feedback sensitivity of 1C, or 0C sensitivity because of negative feedbacks, if you believe in the power of clouds to stabilize the climate.

    There are no similar hard physical upper bounds, so while it may be strictly correct, it is not useful to claim some high sensitivity is much more likely than a low sensitivity near the hard physical limits.

    Comment by mugwump — 16 Sep 2008 @ 9:28 AM

  353. Mark (344), no, sorry, but you have me confused with someone else; I didn’t say that.

    Comment by Rod B — 16 Sep 2008 @ 9:32 AM

  354. Barton Paul Levenson #346,

    I think that you are having trouble reading what I am saying. I am not defending those ideas as correct. I am saying they are not crackpot as they have some sound elements and were taken seriously at the time they were presented. You, who style yourself a wordsmith, should be thankful for the correction.

    Again, I do not say that all of Hoyle’s ideas were correct and I’m not attached to them. My advisor’s advisor was Ryle. But, I am interested in the proper use of the term crackpot because I think that there are ways to save time when it is used correctly. AGW deniers tend to be crackpots or stooges and in their case it may be possible to simply label them and ignore them going forward.

    I am sorry you are upset. Rereading what I wrote may help to calm you if you do it more carefully.

    Comment by Chris Dudley — 16 Sep 2008 @ 9:51 AM

  355. Ray: “Uncertainty is not your friend and it is definitely not the friend of your progeny.”

    For once I agree, and I think the amount of uncertainty has been greatly exaggerated. For example, your discussion of 5.5C sensitivity. Here is what James Annan says about the upper limit given in his paper:

    “As for the upper limit of 4.5C – as should be clear from the paper, I’m not really happy in assigning as high a value as 5% to the probability of exceeding that value. But there’s a limit to what we could realistically expect to get past the referees, at least without a lengthy battle.”

    Comment by Steve Reynolds — 16 Sep 2008 @ 10:02 AM

  356. #352 — “All I am saying is we should get better bounds on the sensitivity before making drastic changes to our economy.” That makes no sense. Why not save money and increase human prosperity by changing the economy? That it also would mitigate climate change can be considered a bonus. 100% of all near-term changes proposed will save money. Efficient lighting, insulation, efficient cars. You’re saying on one hand that we MUST retain 1900s-based inefficient technology, then on the other hand say that we need not be concerned for the future because Buck Rogers-style tech will save the day. Which is it? Do you want to keep the inefficient 1900s stuff you currently drive or go for the future?

    Then you say that hard lower bounds exist, yet no hard upper bound exists. Yep, so far so good. Your conclusion, that giving little weight to the unlimited upper bound is the best course, is ludicrous! The essential point is that one can’t do only one thing. Doubling CO2 was done many decades ago. That CO2 just hasn’t entered the atmosphere yet. Remember, CO2 lags thermal by perhaps 800 years, so you CAN’T measure CO2 this year and call it a conclusion. The real equation is CO2 –> temp rise —> CH4/CO2 rise (repeat until decay of CH4 into CO2 causes equilibrium)

    Anyone else noticed that oil is still a glutted commodity and always will be? There’s enough tar sands and shale to produce oil at $21 a barrel for hundreds of years. OPEC is having to restrict output again to prop up prices. Energy cost has nothing to do with supply and demand. It is oligopoly, a game played with atmosphere and people, both consumed for profit. (is this too political a comment for this board? If so, let me know and I’ll adjust)

    Comment by RichardC — 16 Sep 2008 @ 10:09 AM

  357. Re: 352

    As long as we’re being realistic, let’s note what science has to say about a 4.5C rise, and if that’s intolerable, then we need to undertake measures to avoid it. Prudence would dictate that we build in a margin of safety as well.

    It’s the only responsible course if we want the best for everyone’s progeny.

    Comment by walter pearce — 16 Sep 2008 @ 10:13 AM

  358. Mugwump: “There are no similar hard physical upper bounds, so while it may be strictly correct, it is not useful to claim some high sensitivity is much more likely than a low sensitivity near the hard physical limits.”

    I only know what the data allow me to say. They do not allow me to preclude high sensitivity, and that suggests caution in making big changes to our environment as well as to our economy.

    I have traveled in both China and India. I’ve also traveled extensively in Brazil and lived in Africa. You would probably be surprised to know how well people in these countries understand the situation. It is true they want their share. However, many are educated and realize that climate change poses serious risks. I remember very well a conversation with an old man who had planted a mango orchard. Not only was he aware the climate had changed, he was aware of the role of greenhouse gasses–this from a man with little formal education in a remote area of West Africa! If citizens of developing nations were presented with choices that allowed them to continue growing (even at a somewhat slower rate) without risking the environment, I believe many would take them.
    We in developed nations have to realize that development will occur. In the developing world they have to realize that climate change threatens their continued progress just as surely as any other threat.

    Comment by Ray Ladbury — 16 Sep 2008 @ 10:22 AM

  359. Mugwump, misrepresenting the science is doing a disservice to the future, and that is what you are doing. By refusing to take action now, you are helping to ensure that tomorrow’s children will have a very difficult time of it. I’m sure you love your children very much, but if you think you can sequester wealth away while the rest of the world burns, and so protect your children’s future that way – well, that’s the psychology of money for you. There are plenty of things that money can’t buy, after all.

    However, let’s stick to the science and consider your claims about “large-scale CO2 sequestration.” That’s already been covered at RC in some detail, for example here:

    http://www.realclimate.org/index.php/archives/2008/05/freeman-dysons-selective-vision/

    More specifically, look at the arithmetic:
    http://www.realclimate.org/index.php/archives/2008/05/freeman-dysons-selective-vision/#comment-88599
    And continued:

    # Petro Says: 30 May 2008 at 1:50 PM

    Just to continue the arithmetic started by Ike at 243
    “you are at a realistic estimate of 100 trees per person per year.”

    Assuming a tree needs an area of 10 m² to grow, 100 × 6.5 billion trees a year would need an area of 6,500 billion m², or 6.5 million km². The land surface area of Earth is 148,940,000 km². Earth’s land would be full of forest in 23 years. In 79 years the oceans would be filled with trees as well.

    Well, at least it would be hard to drive a car in that forest.

    Thus, the notion that carbon sequestration technology can be used to offset future emissions before the climatic effects of those emissions kick in is nonsense, practically speaking.

    It is an energy-entropy problem. After all, there is a lot of gold in seawater – why hasn’t someone figured out a cheap way to extract it? Reducing the disorder of a system takes energy. Where will that energy come from? How many cubic meters of air need to be processed to recover one ton of carbon?

    Burning one billion tons of carbon produces 3.67 billion tons of CO2.

    “The air currently contains 0.036% CO2, so one cubic meter of air is 1,000 liters, of which roughly 1000 × 0.00036 = 0.36 liters is CO2; divided by 22.4 liters per mole that is about 0.016 moles, times 44 grams/mole gives roughly 0.7 grams per cubic meter.”

    About 0.7 grams of CO2 per cubic meter… and we are injecting, globally, around 27 billion tons of CO2 per year. Let’s see – to pull a significant fraction of that back out, we would need devices capable of processing tens of millions of cubic kilometres of air each year, on the order of one percent of the entire atmosphere’s volume, and stripping a good share of the carbon dioxide from it.

    We already have such a system, but it is maxed out – the biosphere – and is not capable of sequestering the excess (and it is more of a steady-state flowthrough system, not a sequestration system).

    There is only one solution, and that is to voluntarily halt the use of fossil fuels, starting with coal, tar sands and shale oil. The stone age didn’t come to an end due to a lack of stones, did it?

    Comment by Ike Solem — 16 Sep 2008 @ 11:04 AM
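
    For reference, the per-cubic-metre figure and the volume of air involved can be reproduced in a few lines. This is a rough sketch only: the 0.036% CO2 fraction and the 27 billion tonnes per year of emissions are the round numbers used in the comment above, not precise values.

        # Rough check of the direct-air-capture arithmetic quoted above.
        co2_fraction   = 0.00036      # volume fraction of CO2 in air (round number)
        litres_per_m3  = 1000.0
        molar_volume   = 22.4         # litres per mole of an ideal gas at STP
        molar_mass_co2 = 44.0         # grams per mole of CO2

        grams_per_m3 = litres_per_m3 * co2_fraction / molar_volume * molar_mass_co2
        print(round(grams_per_m3, 2))            # ~0.71 g of CO2 per cubic metre

        emissions_kg  = 27e9 * 1000.0            # ~27 Gt of CO2 per year, in kg
        air_volume_m3 = emissions_kg / (grams_per_m3 / 1000.0)
        print(f"{air_volume_m3:.1e} m^3")        # ~3.8e16 m^3 of air holds a year's emissions

        # For scale, the whole atmosphere is very roughly 4e18 m^3 (5.1e14 m^2 of
        # surface times an ~8 km scale height), so that is on the order of one
        # percent of it, every year.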

  360. It’s very unlikely they’d be willing to jeopardize that for even a 4.5C rise, since the benefit to them far outweighs the cost.

    You state that with a certain absolutism. I assume you’ve calculated the relative costs of both possibilities and are willing to show your work?

    Comment by dhogaza — 16 Sep 2008 @ 11:27 AM

  361. > All I am saying is we should get better bounds

    Delay is the deadliest form of denial. — C.N. Parkinson

    > before making drastic changes to our economy

    Finally, the political basis for your persistent comments.

    Yes, that would have been wise. The drastic change, burning large amounts of coal, should be stopped until we have a better idea of the consequences.

    You don’t separate ecology and economy, do you? It’s the same thing.

    > But let’s be realistic.

    You mean like with actual reality, or with PR?
    Without good science you have only ideology and public relations, and the disasters those lead to. Nature isn’t “realistic” — it’s real.
    We don’t understand it, but we know we can screw up badly.

    > poverty to middle-class

    Emulating America? Not smart:
    http://www.quotedb.com/quotes/2688

    So look, you’ve finally stated your issue. Done yet?

    Comment by Hank Roberts — 16 Sep 2008 @ 11:42 AM

  362. mugwump: you are aware that growth can’t continue indefinitely, right?

    I mean, at least as long as the number of people on the planet goes up, you just can’t treat resources as infinite.

    Here’s an example. I did a little back of the envelope on oil supplies. I assumed that oil magically came out of the ground at whatever rate you wanted, ready for use. I further assumed a 1% increase in usage per year, leaving aside the effects on climate. The last assumption was that the whole earth was covered with a 1km thick layer of oil.

    How long does that last? until about 2100. Maybe a little longer – say 2120 if you cut usage a bit over time.

    I checked the numbers and found that the amount I had was remarkably close to the Energy Information Administration, with an important difference: They assume about half of that is recoverable. So using their numbers and making very generous assumptions about increasing usage, we get to what? 2070?

    You’re a physics guy, mugwump. You know that the reason we use oil is because nothing else provides the energy density it does. Nuclear? Yes, that’s good for electricity, but not for driving cars. Solar and wind and tides provide decent point sources, but they are going to cost a bit more for a while.

    We can’t assume that some new technology will save us in the future. And we certainly can’t develop that technology in a world with no petroleum, because we need it to make plastic among other things.

    Again I draw the analogy with smoking. You could keep on doing it and hope a new technology comes along to save you, or you could stop smoking and avoid the problem of lung cancer.

    With economies, you can do nothing, and hope the sensitivities aren’t as bad as we think, and that markets will develop the technologies to save us just in time. Might happen, might not.

    Or, you can take a hit to growth now, because that will preserve the standard of living for your kids. Rather like saving money — you could spend everything and think “I have skills, I’ll always find something” or you could not. Which is better? Remember, you lose immediate use of income when you save, so you don’t get to build a bigger house or buy a bigger car or maybe send your kid to the most expensive college.

    Now it’s possible that the precautions you take were unnecessary. But they might not be. And the stakes are high enough that we can’t treat them as idle speculation or “alarmist.”

    Back in 2001 Scientific American published a piece about New Orleans. It described what would happen in a category 3-4 storm. The study it was based on was ignored. “Why spend all the money on an event that isn’t likely in the next few years,” they said. That turned out well, right? I mean, the markets took care of everything, right? Or not.

    And it isn’t like hurricanes in the Southeast are a surprise. You know they are coming, but in the name of growth Florida has continued to encourage people to build on the beaches. Am I the only one between the two of us for whom that’s nonsensical?

    I come from a place where growth is pretty sluggish. But the town of Scituate is still standing. So is Hull, and Nahant. Those houses are built rather differently from those in Florida. We knew hurricanes would come, not this year, maybe not the next, but we knew they would come. We built accordingly. Who has more to leave to their children? But I’m from New England, and we’re weird that way, I guess.

    Comment by Jess — 16 Sep 2008 @ 12:13 PM

  363. why, mugwump (352)?

    Why should we WAIT for more change to see whether we’ll be at the top end of expectation or lower end of expectation? If we do something NOW based on “worst case” then we will not have to change much to reverse or halt the changes we wish to avoid. Ergo, the expense will be heavy in the short term and nil in the long term.

    If we wait, then we not only have to do more change in a shorter time, but we will have to put up with the changes we had to let take place to see the effects.

    So why?

    Comment by Mark — 16 Sep 2008 @ 12:34 PM

  364. A bit off topic, but I have another question with no simple answer on my blog about a “sceptic” paper.

    Comment by Magnus Westerstrand — 16 Sep 2008 @ 1:08 PM

  365. mugwump wrote: “I know what is better for my descendants, and it is not halting growth on the basis of uncertain climate sensitivity projections.”

    Unmitigated anthropogenic global warming will “halt growth” and it will “halt growth” permanently on any time scale that matters to the human species.

    On the other hand, phasing out costly, destructive, increasingly scarce fossil fuels and moving to an energy economy based on harvesting clean, abundant, ubiquitous, endless, free wind and solar energy will drive the economic growth of the New Industrial Revolution of the 21st century.

    Your children’s future economic well-being depends on a rapid transition from fossil fuels to clean, renewable energy — which just so happens to be the transition that’s needed to address global warming.

    Comment by SecularAnimist — 16 Sep 2008 @ 1:19 PM

  366. From comment 343:

    “By then technology will be insanely more advanced than it is now, and we’ll probably be able to just suck the CO2 out of the atmosphere if we need to.” – mugwump.

    Yes – and we’ll all live twice as long as we do now, and aliens will have landed and given us even more technology …

    Sucking CO2 out of the atmosphere is possible now with sanely unadvanced terrestrial technology.

    Comment by G.R.L. Cowan, H2 energy fan 'til ~1996 — 16 Sep 2008 @ 3:06 PM

  367. RE sensitivity, it may not be wise to analytically take (and keep) this out of the context of important variables, such as long term positive feedbacks. Skeptics so gleefully point out how warming precedes CO2, and I believe ‘em — warming causes nature to release carbon, which then causes all the more warming.

    It seems the warming situation is not a simple linear function of OUR carbon emissions with a sensitivity constant (tho I understand sensitivity is not actually a constant, but a log function effect that has it decreasing a bit with more CO2). It seems rather that we’re looking at something more like a quadratic or exponential or catastrophe (see http://en.wikipedia.org/wiki/Catastrophe_theory ) or geometrical progression type of function with OUR carbon emissions.

    That is, we warm the world a bit with our carbon emissions, then nature takes over, both in reducing albedo and in increasing releases of carbon from melting ocean hydrates and permafrost (which recent studies say we have grossly underestimated), and the further warming from that further increases the factors that cause more warming. In other words, carbon is itself both a causal and an effect variable, sort of like an infinite do-loop or a hall of mirrors, though with upper hysteresis bounds: after the end-Permian, when 90% of life died, the extreme warming eventually halted and got back down to life-friendly temps, or we wouldn’t be here, but watch out for the hydrogen sulfide outgassing and massive death meanwhile (another variable we’ve got to watch, tho perhaps not for 100s or 1000s of years).

    Sort of like those fission demonstrations — someone throws a ping-pong ball (our emissions) into a room full of mouse-traps (warming) with ping-pong balls (nature’s carbon emissions), and they all get released rather quickly helter skelter, even though the spring tension (sensitivity) is constant, and even if it’s a bit weak (low).

    A lower sensitivity would only delay a bit reaching the runaway tipping point of no return (for 1000s or 100,000s of years anyway). Assuming, of course, we haven’t already crossed that threshold.

    Comment by Lynn Vincentnathan — 16 Sep 2008 @ 3:20 PM

  368. Re #353. Yes, well, we are hardly likely to do much anyway, what with the need to replace 800 million internal combustion vehicles within 40 years for starters, which is nigh on impossible, especially considering we have at present no replacement technology for liquid fuels.

    Fortunately, for electricity production we can replace some gas and coal usage with sustainable sources, but by no means all; well, not politically or economically anyway, the will is not there presently.

    So let’s hope your guess of 2C is right, mugwump; was that for 450 or 550 ppmv CO2e?

    Comment by pete best — 16 Sep 2008 @ 3:39 PM

  369. A lower limit of say 2C on the overall average Earth temperature doesn’t take into account that some regions will warm quite a bit more. The polar regions have already warmed (about twice as fast thus far) faster than other regions and will continue to do so, due in part to loss of reflective sea ice. Furthermore there is more to global warming than temperature change. Melting ice will spill into the more saline oceans, diluting them, and potentially affect ocean circulation patterns. Melting land ice will raise sea level. Warmer temperatures could lead to the formation of higher intensity storms. Flora and fauna will be affected by dislocation and extinction. These are a few of the considerations other than a change in Earth’s average temperature.

    mugwump says in in comment #352:
    “That’s my guess(2C), but not an assumption. All I am saying is we should get better bounds on the sensitivity before making drastic changes to our economy.”

    This is a short-sighted view, in my opinion, and doesn’t take into account the potential catastrophic events that are possible, and even probable, at the lower limit of average change.

    Comment by Lawrence Brown — 16 Sep 2008 @ 6:14 PM

  370. Ike #356:

    Who says we have to use trees to sequester? Think exponential growth in technology: eg, imagine billions of tiny nano-sequesterers released into the atmosphere that fall as inert soot when they are done.

    Once we really get going on molecular and nano machines, existing biology is going to look as limited as the abacus does next to my quad-core pentium.

    Comment by mugwump — 16 Sep 2008 @ 7:50 PM

  371. Re Jess #359:

    mugwump: you are aware that growth can’t continue indefinitely, right?

    Sure it can. A lot of the growth in Western countries today comes from technological improvements. There’s no limit to technology.

    Comment by mugwump — 16 Sep 2008 @ 7:53 PM

  372. RE #359:

    We can’t assume that some new technology will save us in the future. And we certainly can’t develop that technology in a world with no petroleum, because we need it to make plastic among other things.

    You underestimate the power of markets Jess. Look what happened in the US when gas (finally) reached the point of price elasticity. SUV sales fell off a cliff. The market achieved in a matter of months what decades of hectoring by environmentalists failed to do.

    If plastic turns out to be too expensive to manufacture once petroleum runs low, investment in alternatives will surge, and we’ll very likely find adequate replacements.

    [BTW, I took advantage of the temporary glut in inventory to pick myself up a very nice, brand new SUV at a bargain basement price, and now gas prices are falling again :-) ]

    Comment by mugwump — 16 Sep 2008 @ 8:02 PM

  373. RE #364:

    The polar regions have already warmed(about twice as fast thus far)

    Antarctica has warmed very little, if at all.

    Comment by mugwump — 16 Sep 2008 @ 8:34 PM

  374. RE #363 & “Yes, well, we are hardly likely to do much anyway, what with the need to replace 800 million internal combustion vehicles within 40 years for starters, which is nigh on impossible.”

    It’s doable, since ICE vehicles can be converted into EVs. Regular people can even do the conversions themselves. I wish I had had time to do a conversion when I lived closer to an EV club, and now I’m far away….the closest is Houston or Austin (see http://www.eaaev.org/ to find a chapter near you). When I asked a guy from the Fox Valley EV Assoc if a woman could make a conversion, he said there was a guy there who had never held a screwdriver in his life who made a conversion. That the motor is sort of like a sewing machine motor — I could ID with that.

    So, what you do is either get a used car whose engine has blown, or sell the engine… then do the conversion. It won’t have regenerative braking or other nice stuff that a company-built EV has, but it can get people from point A to point B very cheaply, since maintenance and electricity are far cheaper than for ICE cars. I believe the typical conversion has a range of about 20 to 30 miles, but if you add more batteries (I’m talking lead acid), you can get a better range.

    And they say that even if fossil fuels are used to create electricity for the EV, it would still be much less polluting, involving much lower GHG emissions. And for me, with wind powered electricity, I’d be driving on the wind.

    Maybe if people spent less time in front of a computer, and more time making EV conversions, the problem would be half solved.

    Comment by Lynn Vincentnathan — 16 Sep 2008 @ 8:52 PM

  375. Mugwump,

    You seem to be discounting the risks of inaction for some reason. As far as we can tell the chance of a large temperature rise under business as usual is much greater than the chance of one that we can ignore. The costs of inaction if we do have a large temperature rise are greater than the costs of unnecessary action if we have a smaller temperature rise. Just think of the costs of relocating cities. Given what we know the prudent thing to do is to take action to mitigate the climate change. It is simply a matter of decision theory. And I mean effective actions, not displays of concern. You seem to have looked at the costs of mitigation and recoiled, saying “Too much”. You then seem to have not allowed yourself to react to the cost of inaction if action is necessary. Unfortunately we don’t have any good alternatives.

    If you want to wait for more certainty, well certainty will be expensive. This is a cumulative problem and generally the sooner we act the smaller the total cost will be. There are some exceptions mostly when a more effective mitigating technology is nearly but not quite ready. In these cases it might be best to wait a short while. But I am only talking about technologies that will be ready very soon as in the next few decades. I’m primarily thinking of biomass for fuel production and solar electricity.

    In general we cannot count on the necessary technology being ready when we need it. After all we have been working on thermonuclear fusion for more than 50 years. We cannot count on the new technology we require being ready in a hundred or even a hundred and fifty years.

    However in the very long term I do think we can count on the necessary technology to reverse global warming effects. In three hundred years I would be very surprised if we could not deal with these problems. But even then there will have been damage that cannot be reversed. I think we should focus on the consequences expected in the next 50 to 300 years. Inaction is gambler’s behaviour, focusing only on the best possible outcome even though that outcome is unlikely.

    Comment by Lloyd Flack — 16 Sep 2008 @ 8:59 PM

  376. > Phil, what’s the source for your statement?

    Sorry Hank and Ray. I was being careless. The number on the table was ~2. Nothing of what I have read (here, IPCC) gives me any confidence that sensitivity is less than 2. Colleagues working on marine methane hydrates scare the hell out of me with the possibility of sensitivity being a great deal higher. I wasn’t being clear – I was really arguing against the idea that uncertainty precluded action. Nothing is completely certain in science but we proceed anyway. It was here that I learnt sensitivity is an output of GCMs, not an input.

    I am also not convinced by mugwump’s assertion that the economy would be adversely affected by action – it would certainly change, and be bad for, say, coal miners and SUV manufacturers. However, impact studies suggest it would be a great deal worse for the economy in the long term if we do nothing, so even if a short term hit were required, it’s better in the long term.

    Comment by Phil Scadden — 16 Sep 2008 @ 9:38 PM

  377. 359 Jess:

    Your comments to Mugwump stole the words from my mouth. However, speaking as an investor-class parasite, how are my kind to employ our sucking financial mouth-parts in the absence of “growth”? “Growth” is the only mode of human existence that allows people like me to infinitely spend and spend again capital that was originally created with an actual measurable amount of work– preferably done by somebody else– merely by sitting in my easy chair and loafing. That’s why we’re so keen on “growth”, see? We parasites promote “growth” because without it we can’t survive.

    Who says perpetual motion can’t work? Or am I really just a dissipative structure? I’m certainly not conservative because I seem to be violating some fairly basic notions of conservation.

    Seriously, Mugwump should indeed follow “growth” to the logical conclusion and then try to imagine how things will work when further growth becomes impossibly constrained. Your true parasite does not really care, but those sincerely concerned about future generations should do.

    Comment by Doug Bostrom — 17 Sep 2008 @ 2:08 AM

  378. “The last assumption was that the whole earth was covered with a 1km thick layer of oil.”

    That would convert to 3,227,848,101,265,820,000 liters or 20,429,418,362,441,900 barrels. More than enough for everybody on earth to drive an SUV far beyond 2100. But your point is valid nevertheless, and I don’t think anybody assumes we’ll have oil forever.

    Comment by dagobert — 17 Sep 2008 @ 5:50 AM

    Sorry. Did the conversion to barrels twice. In fact it would be 3,227,848,101,265,820,000 (about 3.2 × 10^18) barrels.

    Comment by dagobert — 17 Sep 2008 @ 6:00 AM
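
    The corrected volume figure is easy to reproduce, as in the sketch below; it uses standard values for the Earth’s radius and the litres-per-barrel conversion, and the ~31 billion barrels per year of world consumption is an assumed round number, not a figure from the comments above.

        import math

        # Volume of a hypothetical 1 km thick layer of oil over the whole Earth.
        earth_radius_m  = 6.371e6
        surface_area_m2 = 4.0 * math.pi * earth_radius_m**2    # ~5.1e14 m^2
        layer_volume_m3 = surface_area_m2 * 1000.0              # 1 km thickness
        barrels         = layer_volume_m3 * 1000.0 / 158.987    # litres -> barrels

        print(f"{barrels:.2e} barrels")                         # ~3.2e18 barrels

        # At an assumed ~31 billion barrels/year, even with modest growth this
        # hypothetical reservoir would last far beyond 2100.
        print(f"{barrels / 31e9:.1e} years at constant consumption")   # ~1e8 years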

  380. mugwump writes:

    I know what is better for my descendants, and it is not halting growth on the basis of uncertain climate sensitivity projections.

    Who is proposing “halting growth?”

    Comment by Barton Paul Levenson — 17 Sep 2008 @ 7:18 AM

  381. mugwump 370,

    “Think exponential growth in technology: eg, imagine billions of tiny nano-sequesterers released into the atmosphere that fall as inert soot when they are done.
    Once we really get going on molecular and nano machines, existing biology is going to look as limited as the abacus does next to my quad-core pentium.”

    This sounds just like the blind faith of some fundamentalist religious folk. You seem to have merely switched the word God with the word Technology.

    Frankly, such hubris and contempt for existing biology scares me.
    Do you really think your children will actually want to live in this cyborgian fantasy that you foresee? I know I don’t.

    http://www.countercurrents.org/jensen010908.htm

    http://www.chrismartenson.com/three_beliefs

    Comment by CL — 17 Sep 2008 @ 8:43 AM

  382. The problem, mugwump, is that all that growth in wealth (and technology) ends up using a greater resource base each time to maintain it.

    Let me put it this way: we use resources more efficiently, but there’s a point where the returns fall off. For example, a hunter-gatherer society using stone tools needs a large area to operate in. But the impact is small, and the amount of stuff each person needs is small.

    A technological society that uses steel tools and agriculture uses a smaller area to live in, but they use more stuff. With metal tools, you have to mine the metal and move X tons of rock to get Y pounds of metal. You have to cut down a huge forest to provide however many acres to feed people. The population rises and the cycle goes again. Eventually you get resource constraints.

    Remember, new technologies don’t remove the need for resources a lot of the time–they just change which one you use. All are finite in that sense. We may not use stone tools anymore, but to supply metals we have mining operations that have permanently destroyed and damaged huge areas.

    Also, markets are powerful things, but they only develop stuff that’s profitable. Cars are more fuel efficient in Europe and Japan because the price of gas was kept expensive (you notice a price jump from $6 to $7 a lot less than from $2 to $3). But the markets didn’t do that, government policy did. Markets (as constructed now) treat the costs they impose as external – they just aren’t counted. The most egregious example is slavery. The cotton and sugar markets never accounted for the huge, gigantic economic costs to populations imposed by capturing people in Africa (and generating a market for such) and shipping them to the Americas. Even now such things aren’t considered relevant to market calculations.

    You can’t predict what new technologies will appear, it’s true. But it’s precisely for that reason that when you are faced with decisions to make, you can’t just ride off and hope, you have to deal with what you have now. You use seat belts, right? They were invented in response to regulations designed to protect people in accidents. Car companies complained they cost too much at the time. Would you have said “well, in the future we might develop force fields and seat belts won’t be necessary?” “Keep smoking, we’ll develop a way to cure cancer instantly someday.” “make toxic toys for my kids, I’m sure the technology to cure them will catch up.” C’mon, mugwump, do you use lead paint in your house and give your kids chips to chew on on the premise that the technology will save them someday, maybe?

    Markets are poor systems for certain things because they just aren’t designed for it. The basic assumption for market economists is that everyone has complete information and behaves economically rationally, and that resources are infinite. That is clearly not the case — if it were, bubbles would never happen and there would be no bid/ask spread on stocks (and the stock market would be static because everyone would know the proper price for everything already).

    Comment by Jess — 17 Sep 2008 @ 9:00 AM

  383. RE #375:

    You seem to be discounting the risks of inaction for some reason. As far as we can tell the chance of a large temperature rise under business as usual is much greater than the chance of one that we can ignore.

    That’s the million (quadrillion?) dollar question. From my reading we’re quite uncertain as to the probability of a large temperature rise. The problem is the two get conflated: an uncertain probability of a large temperature rise is not the same thing as a high probability of a large temperature rise.

    Anyway, I think I have worked out a way to tackle the nonlinearity question. I want to check the literature to see if anyone else has already done the analysis.

    Comment by mugwump — 17 Sep 2008 @ 9:13 AM

  384. There were some questions a while back in this thread on how we know that climate sensitivity is linear with temperature. I will give a quick derivation for radiative forcing, which has a well defined relationship between radiant flux and temperature. I apologize for the ASCII representation; latex rendering does not work currently.

    Start with the Stefan-Boltzmann equation, with ε representing emissivity.

    E = εσT^4
    dE/dT = 4εσT^3
    (dT/dE)_now = (T_now)^−3 / (4εσ)
    (dT/dE)_LGM = (T_now − ΔT)^−3 / (4εσ)

    where ΔT = T_now − T_LGM

    (dT/dE)_LGM = (1 − ΔT/T_now)^−3 · (T_now)^−3 / (4εσ)
    (dT/dE)_LGM = (dT/dE)_now · (1 − ΔT/T_now)^−3

    Since ΔT << T_now (5 K << 288 K), we can make a binomial approximation:

    (dT/dE)_LGM ≈ (dT/dE)_now · (1 + 3ΔT/T_now)

    and hence the climate sensitivity due to radiative forcing can be approximated as linear for small changes in temperature. The 5% difference between LGM and today is miniscule compared with the ranges in climate sensitivity predicted from models, for instance.

    Comment by Jeff — 17 Sep 2008 @ 9:34 AM
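
    As a quick numerical check of the size of that approximation error, the sketch below uses T_now = 288 K and ΔT = 5 K, the values in the comment above (Python is used purely for illustration; the factor 4εσ cancels in the ratio and so is omitted).

        from math import isclose

        T_now = 288.0   # present-day global mean surface temperature, K
        dT    = 5.0     # approximate cooling at the LGM, K

        # Exact ratio (dT/dE)_LGM / (dT/dE)_now versus the binomial approximation.
        exact_ratio  = (1.0 - dT / T_now) ** -3
        approx_ratio = 1.0 + 3.0 * dT / T_now

        print(exact_ratio, approx_ratio)   # ~1.054 vs ~1.052, about 0.2% apart
        assert isclose(exact_ratio, approx_ratio, rel_tol=5e-3)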

  385. Lynn, EV conversion sounds like a decent idea. But removing an engine (with hoists, special wrenches, disconnecting parts, reaching those pesky rear block and engine mount bolts, etc.) if you’ve never even held a screwdriver sounds like a need for a hefty dose of truth in marketing!

    Comment by Rod B — 17 Sep 2008 @ 9:42 AM

  386. > I know what is better for my descendants

    And this too is a political position. Not science.

    Comment by Hank Roberts — 17 Sep 2008 @ 10:17 AM

  387. Re #373. The lack of warming in Antarctica relative to the Arctic is well understood, I thought: being a continent surrounded not by other continents but by cold dark ocean makes it take a lot longer to warm up. This is well understood and known.

    People always use Antarctica as if it’s the skeptics’ ultimate weapon, but climate models do not show Antarctica warming up either, well not anytime soon anyway.

    Comment by pete best — 17 Sep 2008 @ 10:45 AM

  388. Mugwump, one thing we do agree upon is that in a sustainable economy, economic growth is driven primarily if not exclusively by technological advance. On the other hand, I think that it is a mistake to simply assume technology will be available when it is needed. Technology doesn’t just happen. It depends on there being a scientific infrastructure from which it can draw technical solutions, and our track record for investment in basic research has been deplorable since the ’60s. Technology takes time to develop, and the only way I know of to buy time is to cut ghg emissions.
    It is evident that you don’t reject the basic science. Given the fact that climate change is occurring and the way the uncertainties stack up, what would you consider to be reasonable mitigation?

    Comment by Ray Ladbury — 17 Sep 2008 @ 11:00 AM

  389. Rod B #351:

    Et al, since the supercomputer-run aerodynamic models have been thoroughly vetted and tested over and over again with millions of hours of physical lab type processes, which climate models haven’t and can’t, how can aerodynamic models be used as evidence per se of the validity of GCMs? Ala the argument that tells us if we don’t accept GCMs hook, line, and sinker, we’d better not ever board a plane.

    Rod, they’re called GCMs (General Circulation Models) for a reason: they are about modelling climate and weather. To satisfy your curiosity do a google on “GCM weather testing”. For me it turned up straightaway three interesting articles on using weather situations to test and improve the “engine room” of these models, which is pretty much common to both weather and climate.

    Dunno about “millions of hours”, but methinks quite a lot of “weather hours” have gone and are going into the development and fine tuning of these “climate” models.

    So yes, the analogy between aerodynamic modelling and GCMs seems valid to me. Do remember though that the aerodynamic models are not perfect either — this is taken care of in engineering by this thing called “safety margins”. Something to consider.

    Comment by Martin Vermeer — 17 Sep 2008 @ 11:10 AM

  390. http://meteora.ucsd.edu/~jnorris/sio117/ipcc_figs/IPCC_9.20small.bmp

    IPCC estimates of the probability distribution for climate sensitivity.

    From this collection:
    http://meteora.ucsd.edu/~jnorris/sio117/ipcc_figs.html

    Linear? Published where?

    Comment by Hank Roberts — 17 Sep 2008 @ 11:12 AM

  391. RE Rod B, #385, that’s why they have EV clubs, to help out with that, and why I wouldn’t think of making a conversion without club support. If I recall the Fox Valley EV club met at a community college, and perhaps they use the auto-shop facilities there….

    I guess what we need more than anything else is for the “CAN’T-DO” attitude to be converted to a “CAN-DO” attitude, but something tells me that it will take more than hoists and special wrenches for that herculean task. However, I’m certain it can be done.

    Comment by Lynn Vincentnathan — 17 Sep 2008 @ 11:17 AM

  392. “The problem, mugwump, is that all that growth in wealth (and technology) ends up using a greater resource base each time to maintain it.”

    I don’t see that. For centuries now, growth and wealth have increasingly been achieved through technology rather than other resources. I’ll admit that it doesn’t feel good to depend on faith in future developments, but we must make sure that our efforts to avoid future catastrophe do not end up endangering our current society and thereby keep us from making those crucial developments. It’s a very narrow corridor our politicians have to manoeuvre in, and they should focus on the most effective things rather than those yielding little result but requiring much effort. I always get the feeling that especially the “oil and car” talk is misleading. We do have the technology to do without coal today – and in the long run, coal is a far greater problem than oil (just my 2 cents, and sorry for my bad English).

    Comment by dagobert — 17 Sep 2008 @ 11:40 AM

  393. Re: 373.

    So what’s been melting all that ice then? Rudolf’s red nose?

    Comment by Mark — 17 Sep 2008 @ 11:56 AM

  394. ‘mugwump’ in comment 381 says,

    tiny nano-sequesterers released into the atmosphere that fall as inert soot when they are done.

    Soot — carbon particles — would not be inert; they would eventually reoxidize and their carbon would be back in the air.

    Small sequestering particles have already been demonstrated. They can work in the air but also on the ground or a short distance under it, and when they are done they are carbonate particles, and they really are inert.

    (Actually, as carbonate particles, they may be only half done. They may depart from inertness to the extent of dissolving in the sea. This definitely wouldn’t result in their letting any of their captured CO2 go back into the air. There has been dispute in these pages over whether it will cause them to take more out, but this additional effectiveness doesn’t seem essential to me. If it exists, it’s a bonus.)

    Comment by G.R.L. Cowan, H2 energy fan 'til ~1996 — 17 Sep 2008 @ 1:07 PM

  395. Mugwumps: The 2010 Prius is estimated to get 94mpg. That’s using a Japanese-style programming (they tune the electronics differently for different markets), and stuff is never as good as predicted, so let’s drop it to 67mpg for a large midsize vehicle. (the 2010 Prius is far larger, faster, and better than the current version.) The US’s fleet gets 22mpg. Direct challenge: how does tripling mileage for automobiles by using cheap current technology (Atkinson cycle engine, some electronics, and way-proven NiMH batteries) inhibit the economy? Dropping gas cost in half (remember supply and demand) would allow for the transfer of taxes from income and property to fuel without any price increase at the pump – that’s dropping net per mile fuel costs by 5/6ths while reducing transportation carbon emissions by almost 2/3rds! Please explain your stance that such policy would decrease the growth of the economy. The whole board awaits.

    Lynn, EV conversions aren’t nearly as good an idea as just buying a fully-designed vehicle. The ZENN for non-interstate driving is a good example and available now. The 2010 Prius is the gold standard for a modern vehicle (if you can wait a year). One-off constructions almost always cost more in carbon and money than mass-produced ones, so while you’d be making a statement, you’d be doing little or negative good unless you can religiously avoid the interstates. High speed driving destroys EV batteries. 25mph is about the practical limit for current EVs. That’s why the Chevy Volt is slated to cost $40k – and that’s PR-speak. Actual cost without corporate or federal subsidies is probably far higher. (and the Volt will only get 50mpg – it’s far behind the curve in hybridville)

    On feedbacks – many posts give 1/(1-f). That seems wrong to me. Many (or most?) big feedbacks are based on ice, and ice exists at -1C and doesn’t at 1C, so we’re talking one-off pulses, especially since the regions of ice left are pretty discrete. Is this right?

    Comment by RichardC — 17 Sep 2008 @ 1:47 PM
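
    The per-mile arithmetic in that challenge works out as sketched below (the 22 mpg fleet average and the tripling of fuel economy are the figures from the comment above; the $4/gallon starting price is an assumed round number):

        # Effect of tripling fuel economy while the pump price of gas halves.
        base_mpg, new_mpg = 22.0, 66.0        # tripled fuel economy
        base_price        = 4.0               # $/gallon, assumed round number
        new_price         = base_price / 2.0  # "dropping gas cost in half"

        fuel_reduction = 1.0 - base_mpg / new_mpg   # fuel (and CO2) per mile, down ~2/3
        cost_reduction = 1.0 - (new_price / new_mpg) / (base_price / base_mpg)  # down ~5/6

        print(round(fuel_reduction, 2), round(cost_reduction, 2))   # 0.67 0.83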

  396. Lloyd: “If you want to wait for more certainty, well certainty will be expensive. This is a cumulative problem and generally the sooner we act the smaller the total cost will be.”

    Like many here, you are disagreeing with the consensus of peer reviewed professionals (economists) on this point. Other than a small carbon tax (less than current US gas taxes), most economists think drastic regulations on carbon emissions at this point would be very bad policy.

    Moderator, a question on comment moderation: why do some comments take so much longer than others to be approved? My comment 355 to Ray seemed to take 24 hours longer than other comments at a similar time.

    Comment by Steve Reynolds — 17 Sep 2008 @ 2:11 PM

  397. [BTW, I took advantage of the temporary glut in inventory to pick myself up a very nice, brand new SUV at a bargain basement price, and now gas prices are falling again :-)]

    Comment by mugwump — 16 September 2008 @ 8:02 PM

    Why only one?! You could have bought two and saved twice as much! Modern technology will most surely come to your rescue. Perhaps quantum mechanics will become applicable in the macro world and you’ll be able to go from point A to point B in your SUV by means of quantum tunneling. The price of oil be damned. :-?

    Comment by Lawrence Brown — 17 Sep 2008 @ 4:10 PM

  398. RE #382 Jess:

    Remember, new technologies don’t remove the need for resources a lot of the time–they just change which one you use.

    And a lot of the time they do remove the need for resources. 15 years ago the computer on my desk took up an entire building.

    Also, markets are powerful things, but they only develop stuff that’s profitable.

    Exactly my point: when it becomes unprofitable to make plastic from scarce petroleum, the market response will be to find replacements. The example I gave of SUV demand destruction in the face of $4 gas is very cogent. Consumer demand is like a tidal wave. When the economics fit, that demand can force change overnight. Trying to force change counter to consumer demand is like pissing into the tidal wave to hold it back.

    C’mon, mugwump, do you use lead paint in your house and give your kids chips to chew on on the premise that the technology will save them someday, maybe?

    Two things make your analogies fail:

    1) we know the consequences of munching on lead whereas climate sensitivity is still very uncertain;

    2) bad consequences of climate change are probably a very long way off, whereas my kids will suffer from lead poisoning immediately.

    A very uncertain threat a long way off has all kinds of possible solutions that munching on lead or smoking cigars do not.

    Comment by mugwump — 17 Sep 2008 @ 4:13 PM

  399. RE #388:

    Direct challenge: how does tripling mileage for automobiles by using cheap current technology (Atkinson cycle engine, some electronics, and way-proven NiMH batteries) inhibit the economy?

    It depends. Last time I checked, the Prius did not pay back its extra price in gas saved. It’s also pretty difficult to visit the grandparents with 5 kids in the Prius. But I have no problem if people want to buy them: it’s still a free country.

    Comment by mugwump — 17 Sep 2008 @ 5:10 PM

  400. Since we’re talking solutions, here’s two:

    Cars: rear wheels connected via CVT to a flywheel. The computer adjusts the CVT and clutch to keep enough energy in the flywheel to take the car to 80 (random number) mph in about 4 seconds (random number), so at 80 mph no energy would be stored and at a stop, a lot of energy stored. Front wheels driven by two gasoline or diesel motors – one about 20 HP and the other perhaps 40 HP. The three power systems combine, without batteries, for perhaps 150 mpg in a full-size vehicle that can go from 0-60 mph in 4 seconds.
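    (A rough energy check on the flywheel idea, assuming a 2000 kg vehicle; back-of-envelope only:)

        # Back-of-envelope: energy a flywheel must hold to take an assumed 2000 kg
        # vehicle from rest to 80 mph, and the average power over a 4 second sprint.
        mass_kg = 2000.0                    # assumed full-size vehicle mass
        v = 80 * 0.44704                    # 80 mph in m/s (~35.8 m/s)
        energy_j = 0.5 * mass_kg * v ** 2   # kinetic energy, joules (~1.3 MJ)
        energy_kwh = energy_j / 3.6e6       # ~0.36 kWh
        power_kw = energy_j / 4.0 / 1000.0  # ~320 kW averaged over 4 s
        print(f"~{energy_kwh:.2f} kWh stored, ~{power_kw:.0f} kW during the sprint")

    So the energy to be stored is modest (a fraction of a kWh); the hard part is the burst power, which is the regime where flywheels are usually argued to have an edge over batteries.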

    Power plants: Build a cheap box out of depleted uranium, fill it with thorium, and add U-235 or plutonium seeds. Design it so that with zero output (a hot summer day), heat dissipation is just *barely* enough to keep it from melting down. Add piping to route input air for a coal or natural gas power plant (existing is fine, new construction is grand). So when lower power is required (almost all the time), little or no fossil fuel is burned, and when peaks are needed, only a bit of fossil fuel is needed. NO control rods, NO safety systems, NO other expensive devices are required, and it is completely retrofittable to existing plants. The net result is a power plant that runs at a super-high temperature/efficiency, burns almost no fossil fuel and has none of the traditional issues with nukes either.

    There ya go, the solution to the world energy problem in a couple paragraphs…. Just engineering to make it so. Anyone here see a flaw?

    And appropriately, captcha is “other nursery”.

    Comment by RichardC — 17 Sep 2008 @ 9:05 PM

  401. Steve Reynolds wrote on 17th September 2008 at 2:11 PM

    “You are disagreeing with the consensus of peer reviewed professionals (economists)…”

    these would be who?
    not the same ones who are currently overseeing the meltdown?
    or…

    Comment by sidd — 18 Sep 2008 @ 12:26 AM

  402. Steve Reynolds, #396

    You are in error. It is mostly anti-AGW economists who alarm people like you with that claim.

    The Stern report shows differently.

    And in case you decide that report is partisan, no economists have taken it up to show where it gets it wrong.

    Comment by Mark — 18 Sep 2008 @ 2:42 AM

  403. #395 Richard C,

    On feedbacks and ice sheets – yes, the ice sheets that played a role in the climate response during the warming out of the last glacial maximum are not a factor now.

    However, sensitivity is stated in degrees C per Watt per square metre. So even in the absence of ice sheets, another forcing such as CO2 still has a temperature change impact related to its atmospheric concentration.

    The issue of risks like ocean CO2 outgassing or clathrate/permafrost CH4 outgassing does not imply a higher climate sensitivity. In leading to more CO2/CH4 in the atmosphere, those factors would not imply a higher temperature rise for a specified forcing increase due to CO2/CH4. They would make impacts more serious because of an unforeseen increase in the atmospheric concentration of CO2/CH4 (hence higher temperatures and more impacts), but the sensitivity would remain the same.
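    (To make the units concrete, here is a minimal sketch using the standard simplified CO2 forcing expression of Myhre et al. 1998; the sensitivity value is illustrative, not a measured constant:)

        # Forcing from a CO2 change and the implied equilibrium warming.
        # 5.35 W/m^2 is the standard simplified coefficient (Myhre et al. 1998);
        # the sensitivity below is an illustrative value, roughly ~3 C per doubling.
        import math

        def co2_forcing(c_new, c_old):
            """Radiative forcing in W/m^2 for a change in CO2 concentration (ppm)."""
            return 5.35 * math.log(c_new / c_old)

        sensitivity = 0.8                    # deg C per (W/m^2), illustrative
        delta_f = co2_forcing(560.0, 280.0)  # a doubling gives ~3.7 W/m^2
        print(f"forcing ~ {delta_f:.1f} W/m^2, equilibrium warming ~ {sensitivity * delta_f:.1f} C")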

    Mugwump,

    If we follow your counsel then by the time we find out the situation is serious, it will very probably be too late to avoid dangerous impacts. Even if at such a realisation we stopped all emissions we’d still be a way off equilibrium. You may be happy with the equivalent of driving a car with an increasing time-lag between the controls and their effect on the car. I for one am not.

    Also you claim “bad consequences of climate change are probably a very long way off” whilst at the same time stressing the uncertainty that means we need do nothing. It seems to me your measures of probability are so subjective that they, along with your counsel, aren’t worth bothering with.

    Comment by CobblyWorlds — 18 Sep 2008 @ 3:15 AM

  404. “…the planet is facing a growing environmental crisis, caused largely by climate change, Fingar said. By 2025, droughts, food shortages and scarcity of fresh water will plague large swaths of the globe, from northern China to the Horn of Africa.

    For poorer countries, climate change “could be the straw that breaks the camel’s back,” Fingar said, while the United States will face “Dust Bowl” conditions in the parched Southwest. He said U.S. intelligence agencies accepted the consensual scientific view of global warming, including the conclusion that it is too late to avert significant disruption over the next two decades. The conclusions are in line with an intelligence assessment produced this summer that characterized global warming as a serious security threat for the coming decades.”

    http://www.washingtonpost.com/wp-dyn/content/article/2008/09/09/AR2008090903302.html

    Comment by CL — 18 Sep 2008 @ 4:00 AM

  405. mugwump writes:

    bad consequences of climate change are probably a very long way off,

    The Australians don’t think so.

    Comment by Barton Paul Levenson — 18 Sep 2008 @ 4:38 AM

  406. “The three power systems combine, without batteries, for perhaps 150mpg in a full-size vehicle that can go from 0-60mph in 4 seconds. … Anyone here see a flaw?”

    Yes. 150 mpg in a full-size vehicle carrying around 2 diesel engines and a flywheel and providing enough traction for 0-60 in 4 seconds is not only very expensive to build but physically impossible. In a perfect world, 1 liter of diesel converts to less than 10 kWh. The full-size vehicle would have to move in a near vacuum.
    I still agree that many things can be done, but I see a lot more potential in large-scale installations being influenced by equally large-scale political decisions (like switching from coal to nuclear) rather than hoping for the average Chinese consumer to be able to afford (and want) a fancy green car.
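    (A back-of-envelope version of that energy check, assuming ~10 kWh per litre of diesel, ~30% tank-to-wheel efficiency and a steady 60 mph cruise:)

        # Rough check of the 150 mpg claim at highway speed; all inputs are assumptions.
        mpg = 150.0
        speed_mph = 60.0
        litres_per_gallon = 3.785
        kwh_per_litre = 10.0   # chemical energy in diesel, roughly
        efficiency = 0.30      # assumed tank-to-wheel efficiency

        gallons_per_hour = speed_mph / mpg                                    # 0.4 gal/h
        fuel_power_kw = gallons_per_hour * litres_per_gallon * kwh_per_litre  # ~15 kW of fuel
        wheel_power_kw = fuel_power_kw * efficiency                           # ~4.5 kW at the wheels
        print(f"~{wheel_power_kw:.1f} kW available at the wheels at 60 mph")
        # A typical full-size vehicle needs very roughly 15-20 kW to hold 60 mph,
        # which is why the claim implies implausibly low drag.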

    Comment by dagobert — 18 Sep 2008 @ 4:50 AM

  407. Re #399, Don’t have 5 kids then maybe.

    Comment by pete best — 18 Sep 2008 @ 5:04 AM

  408. mugwump:

    But let’s be realistic. China and India are looking at catapulting their children from abject poverty to middle-class in a single generation. It’s very unlikely they’d be willing to jeopardize that for even a 4.5C rise, since the benefit to them far outweighs the cost.

    Burn away folks. It’s only being realistic.

    Comment by Chris O\'Neill — 18 Sep 2008 @ 6:58 AM

  409. mugwump:

    I have followed your postings and (as a layman) I find them interesting, although I can only partially follow the science in them. But still, I would like to comment on them and hope you have time to respond.

    #335:

    We’re probably fine if it is 2

    Is this based on something? It seems you are trusting your ‘gut feeling’, but shouldn’t you distrust that? Isn’t that what science is all about? After all, the theory of relativity goes completely against any common-sense-every-day-logic, and so my gut feeling says it’s a load of bollocks.

    GCMs are made exactly to determine climate sensitivity without ‘gut feeling’. Yet you distrust the scientists that build them, saying they somehow tweak the parameters to get the desired results. In other words: you think the emotions of the climate scientists influence their results. But here you are completely basing your conviction on some feeling of ‘we are probably alright if climate sensitivity is below 2’. This baffles me, especially from someone who (I’ll have to take your word for it) is a scientist himself.

    Your quest for the GCM-less determination of climate sensitivity looks to me like the FDA declaring clinical trials unreliable and demanding that a pharmaceutical company provide mathematical proof of the harmlessness of a new drug.

    The climate is influenced by so many factors that if you start to include them all in your ‘simple’ calculation, what you end up with is something that you would call a ‘model’. Actually, this is what has happened in the climate science community over the past 150 years or so. It started with a simple calculation, to which more and more factors were added, until this day in which we have a few dozen GCMs.

    Comment by Anne van der Bom — 18 Sep 2008 @ 8:30 AM

    Not a particularly important point, but the “programming” involved is in regard to the mileage *test* used, not the car itself. Different countries test different driving profiles, and the profile used to determine official mileage in Japan is more hybrid-friendly than the current test used here (which is less hybrid-friendly than our own test of a few years ago).

    Comment by dhogaza — 18 Sep 2008 @ 9:14 AM

    Jess, there is uranium on every terrestrial body in the solar system and it’s scattered throughout the universe. There is plenty of energy to go around with current technology.

    Also, if we get fusion we can work with the substantial portion of the entire #$%#$%$ universe that is hydrogen.

    The only energy crisis is artificial, caused by creepy environmentalists who are so afraid to die alone they want to create a mass death scenario where everyone has the comfort of a communal end.

    Comment by Mick — 18 Sep 2008 @ 11:46 AM

  412. Mick, there’s thousands (millions?) of tons of gold in the ocean.

    Go help yourself to a few kilos.

    Comment by Mark — 18 Sep 2008 @ 12:34 PM

    1) For some reason, this thread is now being taken over by fantasies about energy & growth.

    2) For a good discussion of climate & energy by someone who actually has a grip on reality, I suggest (Nobel physicist) Burton Richter’s Gambling with the Future. He comments on fusion, for example…

    3) As for indefinite growth at our current rates, fueled by ever-better technology, that’s often assumed by people who’ve had little to do with creating advanced technology, for whom such is indistinguishable from magic.

    More likely, the following biophysical economists have a much better handle on the impact of energy on economics, not just technology:

    Charles Hall, EROI at TheOilDrum. There isn’t any EROI=100 oil left… Study the balloon graph and think about it.

    Charles Hall, home page, read talks, worry whether neoclassical econ is right or not

    Robert Ayres, Economic Growth (and Cheap Oil). and look for book “The Economic Growth Engine”, by Ayres & Benjamin Warr, ~Feb 2009.

    They claim a substantial part of economic growth comes from energy, or more precisely work = energy*efficiency. For a century, we’ve had more energy, but world oil production looks to be on the bumpy plateau of its peak already, and it’s a few decades until natural gas does the same. On Hall’s graph, we need to replace the oil+gas, and cut the coal way down. That’s ~75-80% of our current energy supply. Good luck to descendants…

    4) All this *can* be dealt with, but it needs some political will to enter the 22nd Century with a 22nd Century economy&energy infrastructure, rather than a badly-broken 20th Century economy littered with stranded assets that don’t work any more. It would have helped had we started a few decades ago.

    Comment by John Mashey — 18 Sep 2008 @ 12:41 PM

  414. Speaking of “new” technologies…

    12-year-old Revolutionizes the Solar Cell

    “William Yuan, a seventh-grader from Portland, OR, developed a three-dimensional solar cell that absorbs UV as well as visible light. The combination of the two might greatly improve cell efficiency. William’s project earned him a $25,000 scholarship and a trip to the Library of Congress to accept the award, which is usually given out for research at the graduate level…”

    http://blog.wired.com/geekdad/2008/09/12-year-old-rev.html?npu=1&mbid=yhp

    IMHO, arguments surrounding putting off addressing GHGs and their effect upon climate based on vague arguments surrounding the need to “develop” new technologies is something of a non-sequitur. Historically, most technologies tend to be developed in response to perceived need. It follows then that if we begin to address a need, said technologies will invariably be forthcoming, as young Mr. Yuan has demonstrated.

    Comment by J.S. McIntyre — 18 Sep 2008 @ 12:58 PM

    This is one in a series of superb posts, but I learned from engineers to simplicate and add lightness wherever possible.

    SIMPLICATED PRESENTATION.

    For us non-scientists the two damned uncomfortable facts are:

    - We have reliable measurements of the concentration of carbon dioxide in the air and they are creeping upward year by year.
    - For all concentrations worth thinking about, more carbon dioxide in the air means more energy trapped in and near the surface of the earth.

    The surface of the earth is a rotating, rough ball with a coating of air and water. The air and water flow around in very complex patterns driven by the energy retained. We can expect that:

    - More energy retained will mean that the air and water will be warmer, on average.
    - More energy retained will make the air and water flows more variable, more turbulent, and more likely to produce local extreme conditions.

    That is, more carbon dioxide in the atmosphere means that we must expect our climate to get warmer, more variable and more extreme.

    That’s it. It is about as sure-fire as the idea that if you press on the accelerator of a car you have never driven before, it will go faster.

    OK, you want to know which things are going to get how much warmer when and where. You want to get an idea of where and when the weather may wreck the crops or the storms remove your house. That is what the wonderfully complicated, but still very approximate, climate models are about. For now think of them as more reliable than today’s models of the economy, but no more reliable than the weather forecasts were 50 years ago. Let someone else argue about how to improve the models; but remember that what you want out of them are better and better estimates of the odds on how badly (or well) things may turn out.

    Since you are human, you ask what we can best do about it. The guys are still working on that. Probably what you and I can best do about it is put pressure on, and money into, coming up with good, effective, workable answers.

    Since you are curious, you want to know how that extra carbon dioxide got in the air. The scientific detectives say that most all the evidence points to us and our ancestors having put it there by burning carbon we dug out of the ground. But wherever it came from, the question is: what do we do about it now?

    All the rest is interesting, but not essential.

    Comment by Diversity — 18 Sep 2008 @ 1:37 PM

    #388 Mugwumps talks of a 5-year-old design instead of the new one, but I’ll go for it. With 10,000 miles a year, the savings for increasing mileage from 23 to 46 mpg at $4 a gallon is $870 a year. Say you own the car for 3 years. That’s $2,600 in savings. A 2005 Prius in good condition is $22,000 vs a new 2008 price of $22,220. A 2005 Camry in good condition is $13,430 vs a new price of $20,340. Add up the numbers, and the consumer saves $9,290 over three years by buying a Prius. Subtract about $290 for interest and $2k for whatever, and it nets out at a $7,000 savings, not including the savings of time by filling up less frequently. (Cost numbers from Kelley Blue Book.) Mugwumps, how is a $7,000 savings over 3 years unwise economically? Did you forget resale value and that gas prices aren’t $1 a gallon anymore? My condolences on your blunder with the SUV. The market forces which keep a used Prius at new-car value also destroy large SUV resale value. The 2005 H2 lost 53% of its value. A 2008 monster SUV should lose 80% or so in 3 years. Tis scrap metal you’ll own, once Prius-style monster SUVs hit the market.
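    (For what it’s worth, a quick sketch of that arithmetic; all figures are the ones quoted in the comment, with the 2005 used prices standing in for three-year resale values, and none of them independently checked:)

        # Reproducing the savings arithmetic above; inputs are the figures quoted in the comment.
        miles_per_year = 10_000
        price_per_gallon = 4.0
        years = 3

        def fuel_cost(mpg):
            return miles_per_year / mpg * price_per_gallon * years

        fuel_saving = fuel_cost(23) - fuel_cost(46)  # ~$2,600 over three years
        prius_depreciation = 22_220 - 22_000         # new price minus assumed 3-yr value
        camry_depreciation = 20_340 - 13_430
        depreciation_saving = camry_depreciation - prius_depreciation
        total = fuel_saving + depreciation_saving    # ~$9,300 before interest etc.
        print(f"fuel ~ ${fuel_saving:,.0f}, depreciation ~ ${depreciation_saving:,.0f}, total ~ ${total:,.0f}")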

    Besides, you ignored the increase in oil prices caused specifically by the choice of three corporations to build crap cars, ignored the military cost to “secure” Mideast oil for the crap cars, ignored the absolute tanking of value in Detroit stock (rolling and capital) and evaded the question entirely. Obviously it’s not a free country when it comes to vehicles. People are screaming for efficient cars, and Detroit ignores them. Lynn, a woman without mechanical experience is considering trying to build her own car!

    Quit evading and answer the direct question. How does tripling the efficiency of the nation’s fleet harm the economy? Include all effects such as the obvious halving of gasoline prices and the elimination of the military cost of oil. Your Marie Antoinette style response doesn’t cut the cake. That larger cars are needed by some isn’t relevant. The larger the car, the bigger the effect is, so quadrupling a large SUV’s efficiency is what’s on the table.

    #396 Steve, it depends on how the question is phrased. I can’t fathom any economist saying that tripling efficiency at negligible capital cost would harm the economy. My guess is that the folks you are [not] quoting are pundits, not economists. We’re talking specific actions with specific results. Hybrid and tribrid technology for vehicles, and nuclear pre-heaters for fossil fuel plants. Both save tons of money and reduce emissions by 60% or more as compared to the current machines. Please ask any economist if such paradigm shifts would harm the economy. Be sure to ask if dropping oil prices back to $25 a barrel would harm the economy.

    #398 Mugwumps, you’re right, plastics manufacture does not require fossil fuel. It can be made quite easily from non-fossil hydrocarbons. I also agree with you on tech trajectory. Nano and bio will result in huge paradigm shifts. However, your economic analysis fails because it excludes externalities, including the inherent insurance cost of taking a path. If CO2 emissions have a 10% chance of costing x, then x/10 must be charged by society to compensate. The numbers are so huge that x/10 is the largest component of the price. By excluding the largest component, capitalistic forces can’t function properly. Besides, why not save gobs of money by dropping CO2 emissions by 3/4ths? There is only one answer: vested interests in obsolete systems would lose profit. Profit is NOT the goal of capitalism; reduced end-user cost is. Since individuals can’t build or influence the building of complex modern systems like cars, the whole economic theory shreds, and we end up with push-sold vehicles that are only desired because of TV ads, discounts, and fear. Since there are Hummers out there, folks feel they need equivalent weapons to survive. You’re stuck one equation behind.

    And you’re right, we don’t know whether the results of our little CO2 experiment will be merely very bad, or fully cataclysmic. It’s stupid to continue down a path that is guaranteed to a 95% confidence level to do very bad things simply because we don’t know how horrid it will be. This year’s Arctic ice data is conclusive. All forcings are negative. Solar output is at dead minimum. Zero sunspots, with an average near zero for two years now. Weather conditions optimal for ice buildup, yet the ice volume probably still declined past 2007′s level. Once the sun comes back online and the weather shifts, the Arctic will melt away. Once that happens, Greenland will work on sea level, and Siberia will get busy with CH4 production. Since we’ll save gobs of money by beginning to get off the carbon habit (supply and demand all by itself dictates that), I can’t fathom your position.

    Comment by RichardC — 18 Sep 2008 @ 2:17 PM

  417. RE #407:

    Re #399, Don’t have 5 kids then maybe.

    Too late pete. Which ones would you like me to sacrifice?

    RE #409:

    My gut feeling about 2C sensitivity being ok derives from the fact that we’ve already seen about a 1C rise over the past century or so and the sky has not fallen in. Also, in terms of humanity’s global impact, a 2C increase in temperature from GHGs is the least of our sins.

    RE #413:

    As for indefinite growth at our current rates, fueled by ever-better technology, that’s often assumed by people who’ve had little to do with creating advanced technology

    Let me guess John, you once had something to do with creating advanced technology? Some of us on the other side do too. I suspect whether one thinks technology can drive growth indefinitely has a lot more to do with one’s political outlook than anything else (rightward leaning optimists versus leftward leaning pessimists).

    I doubt people are claiming indefinite growth at current rates (I certainly am not). When 3/5 of the world is making the transition from 3rd world to 1st world, global growth rates are obviously huge (there’s nowhere to go but up). Once we’re all first world, we’ll likely witness long periods of “punctuated equilibria” where the punctures are caused by significant technological breakthroughs.

    4) All this *can* be dealt with, but it needs some political will to enter the 22nd Century with a 22nd Century economy&energy infrastructure, rather than a badly-broken 20th Century economy littered with stranded assets that don’t work any more.

    The economy is doing fine as far as I can tell. Living standards are rising rapidly across the globe. The main places where things are still relatively moribund are those states afflicted with central planners who think they know what is best for the rest of us…

    It would have helped had we started a few decades ago.

    Thank goodness we didn’t.

    Comment by mugwump — 18 Sep 2008 @ 2:18 PM

  418. It follows then that if we begin to address a need, said technologies will invariably be forthcoming, as young Mr. Yuan has demonstrated.

    The magical pixie dust of “need”.

    “Invariably” is overly optimistic and doesn’t match your own “most” and “tend”. And a single invention doesn’t “demonstrate” any kind of invariability. Advance tends to beget advance, but remember that the blight of the Middle Ages followed hundreds of years of Roman know-how. Things do go backward from time to time.

    Comment by Jeffrey Davis — 18 Sep 2008 @ 2:26 PM

  419. If we suppose the point of getting gold is to have gold, as bankers’ habit of reburying much of what is dug up would suggest, then seawater is not an economic gold ore: one would have to sell more gold than would be acquired. If element-92 acquisition were similarly pursued for its own sake, acquiring it from seawater would, again, require selling more terrestrially mined element-92 than the seawater extraction would give. However, if the purpose of the marine extraction, as recently demonstrated, had been to get energy, then it would probably have paid off. Those hydrocarbon ropes net many times their own energy of combustion with each soaking, and if their submergence is deep, so that sunlight doesn’t degrade them, they can fetch many loads.

    Seawater [Au] is 1/300 of seawater [element-92].

    Comment by G.R.L. Cowan, H2 energy fan 'til ~1996 — 18 Sep 2008 @ 3:08 PM

  420. OT: Looking for community response to Douglass, D.H., and J.R. Christy, 2008: Limits on CO2 Climate Forcing from Recent Temperature Data of Earth.

    http://arxiv.org/ftp/arxiv/papers/0809/0809.0581.pdf

    [Response: We should have a competition for the largest number of hidden (and invalid) assumptions that can be found in ten minutes browsing. Another embarrassing own goal. - gavin]

    Comment by Walt Bennett — 18 Sep 2008 @ 3:54 PM

  421. Mark: “The Stern report shows differently.”

    The Stern report was not peer reviewed.

    Mark: “And in case you decide that report is partisan, there have been no economists taking up where it gets it wrong.”

    I suggest you look into that. A starting point:

    http://nordhaus.econ.yale.edu/stern_050307.pdf

    Wikipedia has plenty more similar references.

    Comment by Steve Reynolds — 18 Sep 2008 @ 4:04 PM

  422. Barton Paul Levenson (405) — Nor do the

    Bolivians and Peruvians;
    Andaman Islanders and Bangladeshis;
    various and assorted Pacific Islanders.

    Nor should those living around the Mediterranean and Black Seas.

    Comment by David B. Benson — 18 Sep 2008 @ 5:12 PM

  423. John Mashey: “All this *can* be dealt with, but it needs some political will to enter the 22nd Century with a 22nd Century economy&energy infrastructure…”

    I will agree that far, but the political will needs to be to stop making political decisions about winning and losing technology (despite how attractive and lucrative having that power is to politicians).

    I looked at your reference to Charles Hall. I got to the end of the long PowerPoint presentation; I’ll agree that recycling paper is bad.

    Comment by Steve Reynolds — 18 Sep 2008 @ 6:16 PM

  424. RE #420: My first reaction is they seem to have radically underestimated the uncertainty in the trend.

    Comment by mugwump — 18 Sep 2008 @ 7:07 PM

  425. Even If Greenhouse Gas Emissions Hold Steady, Warmer World Faces Loss Of Biodiversity, Glaciers:

    http://www.sciencedaily.com/releases/2008/09/080917145509.htm

    ————-
    Steve Reynolds (423) — Recycling paper is bad?

    Comment by David B. Benson — 18 Sep 2008 @ 7:23 PM

  426. Martin (389), true (the safety margin thing) but don’t forget the major physical validation that goes on with aerodynamic models, which can’t be done with GCMs (and others, stellar and cosmology, e.g.). Also, some truth in the validation of circulation models; but that ignores the great lengths that climatologists go to disconnect weather from climate models; actually not great lengths per se but the reasons for the great lengths — simply that they are different.

    Comment by Rod B — 18 Sep 2008 @ 7:50 PM

  427. No plant CO2 relief in warm world:

    http://news.bbc.co.uk/2/hi/science/nature/7620921.stm

    so the future climate sensitivity may be higher than anticipated.

    Comment by David B. Benson — 18 Sep 2008 @ 8:00 PM

  428. mugwump, 417

    “The economy is doing fine as far as I can tell.”

    Erm, I thought we were in an unprecedented economic crisis.

    Or is it that I am just poorly informed and out of touch ?

    http://organizationsandmarkets.com/2008/09/17/the-financial-crisis/

    http://www.leap2020.eu/GEAB-N-22-is-available!-Global-systemic-crisis-September-2008-Phase-of-collapse-of-US-real-economy_a1298.html

    http://www.countercurrents.org/grey180908.htm

    http://www.countercurrents.org/lendman180908.htm

    ‘shortstop stableboy’. Sounds like a horse racing tip.

    Comment by CL — 18 Sep 2008 @ 8:10 PM

  429. Mugwump–re 413: Google John Mashey.

    Technology on demand has not been my experience in life. Simply assuming it will be there when we need it sounds a wee bit too much like the “success driven schedules” proposed by some project managers I’ve known. I agree that increases in technology represent the only real (read “nonextractive”) growth. I agree that technology is the only way we will get out of the mess. I am simply not so sanguine that the technology will come along quickly. It will take a massive development effort.

    And while you may think we’re fine with only 2 degrees per doubling, I would suggest you consider what happens when the permafrost melts and belches out a bunch of CH4 or when the clathrates start to thaw or when the North pole is actually ice free. There are lots of uncertainties. They do not favor complacency.

    Comment by Ray Ladbury — 18 Sep 2008 @ 9:17 PM

  430. Re mugwump @417: “The economy is doing fine as far as I can tell.’

    Don’t listen to the news much, eh?

    OK, I’ve read enough of what you have to say. It’s now quite clear that as with almost all delayers/deniers/skeptics/contrarians (pick your own label) it’s not really about the science at all for you. What it’s really about is how the notion of taking action to deal with anthropogenic climate change is at odds with your own political ideology. Which means you really have nothing more to contribute to the discussion but more political rhetoric, and I can get plenty of that elsewhere, if I were at all interested. I’m not.

    Comment by Jim Eager — 18 Sep 2008 @ 10:03 PM

  431. re 418. Jeffery, thank you for your comments.

    An apology, and not to quibble, but it appears you are missing the point. Mr. Yuan’s achievement is illustrative of what happens when people apply themselves to a problem (re a “need”). While you could argue that his solar cell is more an example of innovation as opposed to original invention, this does nothing to change the real point that he was responding to a perceived problem and providing a solution. In short, I was holding him up as an example of what we’re beginning to see in terms of responses to the problem. (And what he has done is not insignificant, else one has to wonder why a seventh grader would receive this award in recognition of what he has done, an award normally handed out for research at the graduate level.)

    You write: ““Invariably” is overly optimistic and doesn’t match your own “most” and “tend”.”

    I will repeat – we invariably tend to address needs, whether on a personal, individual level or on a societal level. There is nothing Pollyannaish about this observation; I certainly am not making the specious argument, often heard from the anti-AGW community of skeptics, that we will “innovate” our way out of this problem. Nor am I suggesting the innovations/inventions/solutions will come in the nick of time, though arguing against that idea seems to underscore the tone of your response to my comments.

    Quite to the contrary: from where I sit, we may have run out of time to formulate a response that will “solve” the problem. Or, considering the foot-dragging that AGW skepticism helps to facilitate, and given the prevalent lack of urgency that characterizes the response to AGW in many quarters, by the time we come to our senses, these “responses” will be too late, because the will and desire to implement them was lacking when they could have made a difference. Instead, more and more it seems, at least from my perspective, the conversation these days is morphing from ”this is how we can “stop” or “reverse” AGW”, and becoming, instead, a conversation focused on how well we’re going to be able to adapt to a changing environment.

    In summation, you are right; solutions may come too late. But that was not my point, and it goes off-track from the real focus of my posting, which was to suggest that the argument that we should wait for technological innovation before we do anything is contrary to what we see as something of a norm in human history. I certainly wasn’t discussing absolutes, as you seem to infer; I thought that was rather clear.

    Comment by J.S. McIntyre — 18 Sep 2008 @ 10:17 PM

  432. mugwump

    My gut feeling about 2C sensitivity being ok derives from the fact that we’ve already seen about a 1C rise over the past century or so and the sky has not fallen in.

    The fact that the first 1ºC did not make the sky fall in doesn’t prove that the second 1ºC will be equally mild on us, and the third, fourth, etc.

    You confirm that on this scientific matter your opinion is based on your gut feeling. How much of a scientist are you?

    Comment by Anne van der Bom — 19 Sep 2008 @ 2:07 AM

  433. Mick posts:

    The only energy crises is artificial, caused by creepy environmentalists who are so afraid to die alone they want to create a mass death scenario where everyone has the comfort of a communal end.

    You forgot to mention the UN black helicopters.

    Comment by Barton Paul Levenson — 19 Sep 2008 @ 3:04 AM

  434. RE #432:

    The fact that the first 1ºC did not make the sky fall in doesn’t prove that the second 1ºC will be equally mild on us, and the third, fourth, etc.

    Right, the second 1ºC almost certainly won’t be equally mild. But it probably won’t be a big deal either. As for the third – that’s in the grey area. The fourth? I already said I believed 4 degrees quickly would likely require a lot more action than at present, in which case we had better get China and India on board which I doubt we can do.

    You confirm that on this scientific matter your opinion is based on your gut feeling. How much of a scientist are you?

    One of the best, thank you for asking. There are few reports out there suggesting that another 1ºC of warming will be a disaster.

    Comment by mugwump — 19 Sep 2008 @ 4:49 AM

  435. RE #430 Jim Eager:

    It’s now quite clear that as with almost all delayers/deniers/skeptics/contrarians (pick your own label) it’s not really about the science at all for you. What it’s really about is how the notion of taking action to deal with anthropogenic climate change is at odds with your own political ideology.

    Rubbish. Tackling AGW on the basis of wildly uncertain information is at odds with common sense.

    [Response: So tackling economic crises on the basis of wildly uncertain information is at odds with common sense too? You have it completely wrong. Decisions are almost always made with uncertain information (we could dispute the 'wildly'). - gavin]

    Comment by mugwump — 19 Sep 2008 @ 4:56 AM

  436. RE #429 Ray:

    Technology on demand has not been my experience in life.

    I am not arguing for “technology on demand” as such. I am arguing that the technological progress of the last 150 years teaches us that technology of 150 or 200 years hence will be so vastly superior to today’s that it is crazy not to take that into account.

    In particular, we are on the cusp of a genetic engineering revolution that will rival the computer revolution in its breadth and growth rate. We’re barely at the vacuum-tube stage today.

    Comment by mugwump — 19 Sep 2008 @ 5:05 AM

    RE #420

    Gavin,

    Thanks for the response. Of course, being a layman, I have no way of knowing which invalid assumptions you refer to. I rely on experts to critique other experts in language which is accessible to me.

    I have little doubt that the Christy paper has flaws; else, why go to E&E with it, a la Beck? I was highly informed by critical response to that paper, in this very space as well as elsewhere.

    Since at least Lindzen also downplays the forcing of increased CO2, I would say that there is some weight to the theory. It may, in light of day, have no merit. In that case, shine the light.

    Comment by Walt Bennett — 19 Sep 2008 @ 7:00 AM

  438. David B. Benson, 427

    That study was total dross. One species, one factor changed, in an already harsh environment. The only thing you can learn from that is to ignore all BBC reporting. It completely failed to mention what might happen where water isn’t in short supply.

    Jim Eager, 430
    “It’s now quite clear that as with almost all delayers/deniers/skeptics/contrarians (pick your own label) it’s not really about the science at all for you”.
    As someone who would fall into your rather narrow categorisation, I’d like to say it is most definitely about the science. I have spent the last year or so trawling through the internet looking at various research and summaries to try to get a handle on the current state of knowledge, and I’ve concluded that a) we don’t know a great deal about the climate yet, and b) CO2 is almost certainly not the primary driver of climate change.

    I never generally come on sites like this as I dislike getting covered in mud, but I’m having a slow day and some of what I read here is breathtakingly either poorly informed or very selectively informed. Congratulations to mugwump for sticking it out so long.

    Comment by Barelysane — 19 Sep 2008 @ 8:34 AM

    “In particular, we are on the cusp of a genetic engineering revolution that will rival the computer revolution in its breadth and growth rate.” – mugwump@435

    Right. And fifty years ago, we were going to have abundant fusion power, bases on the moon and Mars, superhuman artificial intelligences, household robots and personal helicars by now. Even your own example, the “genetic engineering revolution”, fails to make your point. Where are the radical advances in medicine that “gene therapy” was going to deliver?

    Comment by Nick Gotts — 19 Sep 2008 @ 9:40 AM

  440. Re Barelysane @438: “I’ve concluded that a) we don’t know a great deal about the climate yet, and b) CO2 is almost certainly not the primary driver of climate change.”

    I’d say the name you choose to post under is inaccurate.

    Comment by Jim Eager — 19 Sep 2008 @ 9:52 AM

  441. RE #435:

    Response: So tackling economic crises on the basis of wildly uncertain information is at odds with common sense too?

    I really don’t think you want to go there, gavin.
    Or if you do, fine by me.

    When climate sensitivity and its consequences, and the consequences of acting to drastically cut CO2 emissions, are as well understood as the current economic crisis and its consequences, then I will agree to act.

    [Response: Glad to hear that. Perhaps you'd like to point me to the economists' consensus reports over the last 15 years that predicted the sub-prime mortgage debacle and provided projections of what would happen under 'business-as-usual'? Or a paper from 1988 predicting current economic trends to within 10%? I find it amazing that you think we have a predictive capability in economics that is in any way comparable to that of climate physics on any time scale. Asian financial crisis? S&L debacle? Even China's spectacular growth? How far ahead were any of those things predicted? Compare to the 1992 paper predicting the impact of Pinatubo.... - gavin]

    Comment by mugwump — 19 Sep 2008 @ 9:55 AM

  442. Barelysane@ 438: “Having spent the last year or so trawling through the internet looking at various research and summaries to try to get a handle on the current state of knowledge. I’ve concluded that a) we don’t know a great deal about the climate yet, and b) CO2 is almost certainly not the primary driver of climate change.”

    Care to share your reasons for dismissing the scientific consensus that CO2 is the primary driver of (current) climate change? If not, why should anyone believe you when you assert that it’s “about the science” for you? I, for one, don’t.

    Comment by Nick Gotts — 19 Sep 2008 @ 10:10 AM

  443. David B. Benson: “Recycling paper is bad?”

    I’m not sure why Charles Hall thinks so, since I did not see it explained in his presentation, but I think production of the energy required to recycle (including transportation and process) generally releases a lot of CO2. Putting used paper in a landfill sequesters the carbon.

    Comment by Steve Reynolds — 19 Sep 2008 @ 10:29 AM

  444. RE: #420: Where to start?

    First: they ignore the effect of the oceans and heat uptake. Globally, the effect of oceans is to add a delay to the temperature response to forcing. Regionally, where there is more land cover you expect a higher temperature response, and where there is less land cover (for example, the tropics) you expect a smaller temperature response.

    Second: High latitude response: I find it stunning that Christy doesn’t even acknowledge the well known (and predicted) property of the climate system that includes greater relative warming at high latitudes due to snow + ice albedo feedback as well as higher relative land surface area. It is surprising that no reviewers objected to this, either.

    Third: It is well known that tropical satellite data has the largest uncertainty, and yet this is what Christy et al. concentrate on?

    Fourth: “However, recent studies suggest that the value of g is much smaller”: then they pick out 4 citations, compared to about 10000 that suggest that in fact the value of g is not much smaller. And of those 4 citations, I saw Schwartz’ talk at AGU last year, and it was pretty much junk (see realclimate discussion on how his method of determining sensitivity can’t even get model sensitivity right, so how can it get real world sensitivity right?)

    I’m sure there are more, but that’s what I pick up in a quick read.

    Comment by Marcus — 19 Sep 2008 @ 10:40 AM

  445. Barelysane, #438.

    Why is it when it’s complicated, you deny it because it’s “too complex, you must be making some of that up” and when it’s simplified, it’s “the only thing you can learn is to ignore all BBC reporting”?

    How about you *don’t* come on these sites? You’ve nothing to add but noise.

    Comment by Mark — 19 Sep 2008 @ 11:36 AM

  446. Barelysane wrote: “I’ve concluded that a) we don’t know a great deal about the climate yet …”

    You are wrong.

    Barelysane wrote: “… and b) CO2 is almost certainly not the primary driver of climate change.”

    Wrongo again.

    Barelysane wrote: “some of what i read here is breathtakingly either poorly informed”

    Your own “conclusions” are breathtakingly wrong, indicating that you are a poor judge of whether others are “poorly informed” or not.

    Comment by SecularAnimist — 19 Sep 2008 @ 11:48 AM

    “a) we don’t know a great deal about the climate yet,”

    Radiative processes (incoming short wave / outgoing longwave), latent and sensible heat, Clausius Clapeyron, aerosol dimming, Hadley Cell Expansion, Pinatubo/Krakatoa/Tambora, greater northern hemisphere warming/meso & stratospheric cooling/decreasing diurnal range – as predicted by theory. Passive Microwave temperature determination, CERES, MODIS, Royer Compilation, CLIMAP, the Madden-Julian Oscillation, Hadley Cells, Ferrel Cells, Brewer Dobson Circulation, ENSO/QBO/NAO-AO, Polar Vortices, Antarctic Circumpolar current, Beaufort Gyre/Transpolar Drift, Pinatubo, Mars vs Venus vs The Earth, CFC-Ozone link…. That’s just off the top of my head. Oh yeah – Hansen 1988.

    We don’t know everything, but we know enough to know what we’re doing to the Earth is likely to be a problem.

    “b) CO2 is almost certainly not the primary driver of climate change.”

    Depends when you mean e.g.
    In the glacials: No, it was an amplifier.
    Now: Yes, it’s the primary driver of warming.*

    *Given the weight of evidence on this point I consider any probabilistic qualification to be superfluous.

    Comment by CobblyWorlds — 19 Sep 2008 @ 1:32 PM

  448. Barelysane,
    If you are so sure everyone is wrong and poorly informed on this site, why not use your real name? It doesn’t take a lot to argue against everything using no science and an alias. Most people who really understand and contribute to the science on this site have no trouble using their real names and credentials. They are willing to put who and what they are on the line to be ridiculed because they understand the science and believe their opinions have merit.
    You say “Having spent the last year or so trawling through the internet looking at various research and summaries to try to get a handle on the current state of knowledge.” Instead of trawling the internet, how about reading some peer-reviewed scientific literature and coming back with an informed opinion? Then you can have the courage to use your real name and enter into a meaningful and productive discussion about the science.

    Comment by Figen Mekik — 19 Sep 2008 @ 1:46 PM

  449. mugwump writes:

    There are few reports out there that another 1ºC of warming will be a disaster.

    I take it you don’t live in Australia.

    Global warming is already a disaster. And it’s going to get worse.

    Comment by Barton Paul Levenson — 19 Sep 2008 @ 3:31 PM

  450. Barelysane writes:

    CO2 is almost certainly not the primary driver of climate change.

    It is almost certainly the primary driver for the present global warming, however.

    Comment by Barton Paul Levenson — 19 Sep 2008 @ 3:33 PM

  451. Dear Barelysane,
    If, as you’ve concluded, CO2 is almost certainly not the primary driver of the (recent) climate change, what do you think is?

    Comment by Lawrence Brown — 19 Sep 2008 @ 6:09 PM

    Mugwump, It is rather amazing that you think the technological progress of the last 150 years “just happened”. It happened because nations made a commitment to invest in basic scientific research and scientific human capital that could be brought to bear on problems. Such commitment is lacking now, especially in the US. The industrial hubs of science like Bell labs and Hughes Aircraft have been gutted, and new scientific projects like the Superconducting Supercollider are deemed frivolous. Where do you think the science will come from to underlie all your technological optimism? Unfortunately, the era of Vannevar Bush has been replaced by that of George Bush.

    Oracle of ReCAPTCHA: other decide

    Comment by Ray Ladbury — 20 Sep 2008 @ 2:59 PM

  453. The industrial hubs of science like Bell labs and Hughes Aircraft have been gutted

    I don’t know much about Hughes, but from first-hand experience I can tell you Bell Labs was well past its prime.

    As for the SSC – what new physics would it have uncovered? Probably very little.

    There’s tons of R&D going on today. Way more than ever before.

    Comment by Jonathan Baxter — 20 Sep 2008 @ 4:59 PM

  454. > what new physics would it have uncovered? Probably ….

    You can look this stuff up, rather than deny it, e.g.
    http://www.cbo.gov/ftpdocs/55xx/doc5546/doc11b-Part_2.pdf

    Comment by Hank Roberts — 20 Sep 2008 @ 6:16 PM

  455. Ray: “The industrial hubs of science like Bell labs and Hughes Aircraft have been gutted, and new scientific projects like the Superconducting Supercollider are deemed frivolous.”

    As a former long time Hughes employee (who coincidentally briefly worked on the Hughes proposal for some of the SSC instrumentation), I’d like to comment on that.

    I do think we had some special development capabilities at Hughes up through the 1980s, although they were mostly focused on military applications. Even there, the amount of money we could use for research (including generous DOD funding) could easily be overwhelmed by commercial interests. For example, we probably had some of the most advanced IC fabrication capability in the world in the 1970s. By the mid 1980s, commercial capability had advanced so far beyond what we could do, we had no choice but to outsource almost all IC fabrication to commercial foundries.

    My point is to disagree with your statement:
    “It happened because nations made a commitment to invest in basic scientific research and scientific human capital that could be brought to bear on problems.”

    In most cases, ‘scientific human capital can be brought to bear on problems’ better by markets. Of course, the problem that is addressed may not be what the government (military applications) or scientific elite (SSC) wanted. I do not see that as a disadvantage, though, especially in developing alternatives to fossil fuels. If a push in that direction is needed, a small to medium revenue neutral carbon tax would be very effective in providing market solutions.

    Comment by Steve Reynolds — 20 Sep 2008 @ 7:55 PM

  456. Jonathan Baxter@453 “There’s tons of R&D going on today. Way more than ever before.”

    Do you have a source for this claim?

    I have a memory of reading a piece in either Science or Nature, from a few years ago, claiming that spending on R&D had for a long time grown roughly with the square of GDP, but had (in the ’80s or ’90s?) stopped doing so, and was at the time of the article growing roughly in proportion to GDP. (If anyone can identify this reference, I’d be grateful!) That would still mean there’s more going on than ever before, but that does not imply useful results will arrive faster, or even at the same rate. Consider what’s happened with antibiotics – very few new ones are being discovered, despite great efforts on the part of pharmaceutical companies. More impressionistically, compare science and technology now with that in 1958, and 1908. I’d say there was considerably more change 1908-1958, than 1958-2008.

    Comment by Nick Gotts — 20 Sep 2008 @ 9:22 PM

  457. Mugwump,

    I think you are using a rather strained definition of fitting to describe GCMs as being fitted to the data. When we fit a statistical model we have a data set and a model with some parameters which have unknown values. Our output is a set of fitted values and a loss function. We select the parameter estimates which minimize this loss function.

    This is not exactly a good description of what happens in climate modeling. As I understand it, the output of each model is a function of its inputs and parameters rather than of data used to fit it. There are various parameters used within the model which are determined before the model is fitted. The structure of the model is chosen for theoretical reasons and the parameter values are not estimated from the data used in the model, at least not as part of the model estimation process. The parameter values are inputs to the model, not output. What we evaluate is the ensemble of results. Now if we get results that are consistent with the historical record and the structure of the model reflects physical reality then we can use the models to make predictions. But we look at the ensemble of predictions.

    Now if the predictions are inconsistent with past data then we say that the model is inadequate. Sometimes the inadequacies are too serious for the model to be of any use, sometimes it is usable but with some caveats. The response to model inadequacies is more a reworking than a tweaking.

    Adding parameters is not as easy as it is in a statistical model. To add a parameter you have to add or modify the processes in the model. You have to justify it in terms of the physics rather than in terms of the fits. These models are not black boxes in the way that a purely statistical model can be.

    If parameter values other than those used in the model give results that are more consistent with past data then one might check and re-estimate the parameters from whatever source they were obtained. One might also check the data. You check the data and the parameters as a result of failed predictions rather than substitute values estimated from the models. (Gavin correct me if I have misunderstood what goes on.)

    In statistical models overfitting works its mischief by incorporating noise from the data set into the model. To fit the training data set more accurately unnecessary terms are introduced which have a chance association with the values that the noise component of a more appropriate model would take for the training data set. That is we include in the model a component which is not repeated on the test data set and which hence leads to less accurate predictions on future data. It is harder to incorporate a process which is simply a reflection of noise into a model when all its components are there for specified physical reasons. There are no model components that are there purely in order to fit the data. To make claims of over fitting you have to say what unnecessary component was in the model and why its inclusion reflects circumstances which will not be repeated.

    All this being said, am I completely happy with the statistics being used by climate scientists? No. I have seen some errors now and then and I have seen cases where more advanced statistical techniques could have been used to advantage. I also think that it might be a good idea if some of the parameter values in the GCMs should be randomly determined from certain distributions at run time rather than have their point estimates used. Using only the point estimates could lead to some interactions not having their actual effects. Still if they are making predictions the models cannot be black boxes. The models have to be based on the physical processes.
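    (A toy numerical illustration of the overfitting point, using a polynomial fit in numpy; this is a purely statistical example and has nothing to do with an actual GCM:)

        # Toy overfitting demo: more free parameters match the noisy training data
        # better but typically predict the underlying function worse.
        import numpy as np

        rng = np.random.default_rng(0)
        truth = lambda x: np.sin(2 * np.pi * x)
        x_train = np.linspace(0.0, 1.0, 20)
        y_train = truth(x_train) + rng.normal(0.0, 0.3, x_train.size)  # noisy observations
        x_test = np.linspace(0.0, 1.0, 200)

        for degree in (3, 12):
            coeffs = np.polyfit(x_train, y_train, degree)  # least-squares fit
            train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
            test_mse = np.mean((np.polyval(coeffs, x_test) - truth(x_test)) ** 2)
            print(f"degree {degree:2d}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")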

    Comment by Lloyd Flack — 21 Sep 2008 @ 1:34 AM

  458. 1) There’s a lot of belief in the magic of R&D, but unfortunately, real professional R&D managers don’t believe in magic.

    One also has to be very careful with any numbers:

    when Microsoft employees fix bugs in Vista, that may well get counted as “R&D”. In particular, huge amounts of “R&D” spending is in software development [and nothing is wrong with that, it's very important], but people shouldn’t kid themselves about the kinds of invention produced.

    I’d guess that much more money is spent on fixing Vista bugs than was spent on the original discoveries of transistor, solar cell, and laser, put together.

    2) I worked 1973-1983 at Bell Laboratories, at its height 25,000 people and arguably the strongest industrial R&D lab … ever. Its record for innovation is rather good, in part because for a long time it knew what business it would be in, and would support brilliant scientists working on things that might or might not pay off in 20 years. Some did (transistor), some didn’t (magnetic bubble memories).

    Monopoly money was *nice*, yielding much of the bedrock for computing & communications; a lot of physics, math, statistics; transistors, solar cells, lasers, UNIX, and many other things people take for granted. Many of the key patents were required to be licensed at reasonable rates.

    But, we had a standard mantra:

    “Never schedule breakthroughs.”

    I got that from various (highly-regarded) bosses. I used to do “special projects” and trouble-shooting for one who was later Bell Labs President, who’d generated a lot of breakthroughs himself, and *he* said it.

    3) Rather, in R&D we did “progressive commitment”, with R+AR being 5-10% of the total staffing. Different people use different categories, but this is typical: it flows from numerous small efforts to a few very large ones:

    Research (R) ->
    Applied research (AR) ->
    Exploratory or Advanced Development (ED) ->
    Development (D) ->
    Deployment & scaleup

    One funds numerous small R efforts spread across many years. One selects promising results to turn into AR efforts. The most promising move on to ED and then D. You only do D with technologies you know work. The big money is in scaleup and deployment, and since the Bell System was 1M+ people, we actually cared about huge deployment more than almost anyone except the Federal government.

    You could never count on R/AR for anything in particular, or on any particular schedule, but you always kept an eye on them to grab anything interesting. [I've done or managed all of these except pure R.]

    People may have heard the Bell Labs people won some Nobel prizes, and they did, but that was a tiny fraction of the staff – most people were doing D. One of the Nobels was for something they were trying to get rid of, not something they were actually looking for :-) That’s how it works.

    This also means that if you want to have large-scale effects soon, you do it with technologies you have in hand *right now*, while managing the R&D flow properly.

    CA managed to keep the electricity/capita flat over the last 30 years, while the US as a whole rose 40-50%. This was just paying attention and making myriads of incremental improvements, not needing any magic technology.

    More R&D on energy efficiency of all sorts is quite welcome. The real concern is that the poor funding of R&AR over the last few decades has damaged the pipeline.

    Comment by John Mashey — 21 Sep 2008 @ 2:09 AM

  459. Of course, the problem that is addressed may not be what the government (military applications) or scientific elite (SSC) wanted. I do not see that as a disadvantage, though, especially in developing alternatives to fossil fuels. – Steve Reynolds

    You may not see it as a disadvantage, but it is one. Markets are inherently short-term, and ignore externalities unless forced to consider them. We need to be thinking decades ahead, and to have a coherent plan for the kind of infrastructure needed. Moreover, technological innovations need to be shared on an international level.

    Comment by Nick Gotts — 21 Sep 2008 @ 5:20 AM

  460. #453, Jonathan Baxter:

    The starting point for the deterioration of Bell Labs should be taken from the divestiture of AT&T in 1984, when the break-up of the telephone companies separated long-distance from local service, and equipment from services. The result was a push towards justifying Bell Labs research in terms of the bottom line, whereas there had been a very fruitful allowance for basic research previously. The research projects that led to the transistor, to the discovery of the 3-degree cosmic microwave background, to “optical molasses”, to the Shannon noise theorems (to name a few) were not motivated by bottom-line / product-development oriented funding.

    Although trained in physics, I was also not that positive about the funding of the SSC: too much buck for the bang. But the last time I checked into it a decade ago, there was remarkably little funding for solid-state/condensed-matter research compared with high-energy physics research – despite the fact that this is an area that has paid off again and again and again. In industry, R&D tends to be focused on the “D”.

    Comment by Neal J. King — 21 Sep 2008 @ 8:00 AM

    Steve Reynolds, The problem with market-driven research is that since it is application-focused, it rarely advances basic science. That is the sort of research that is lagging today–especially in the US and Asia. The technology of today is still mostly advancing on the basic science pre-1980. In the US, we used to understand that basic research would pay dividends down the road. That was the model of Vannevar Bush. The model allowed leaders in the scientific community, rather than politicians or “the market”, to determine allocation. R&D at industrial labs also made provision for more basic research. The model worked amazingly well. Now in the US, we can’t even get students interested enough in science and engineering to pursue a PhD and have to fill our grad student ranks with foreign students. Tried to hire a US citizen PhD lately? Markets work great to meet short-term needs. When it comes to long-term R&D and the advancement of basic science, they are myopic if not totally blind. BTW, I was at Hughes when GM drove it into the ground–and made lots of money doing it.

    Jonathan Baxter says:
    “As for the SSC – what new physics would it have uncovered? Probably very little.

    There’s tons of R&D going on today. Way more than ever before.”

    I think the first statement may be the dumbest I’ve read in a long time, but the second rivals it. I was referring to cutting edge basic research–you know, the stuff that actually increases the total sum of human knowledge–not how to fit more songs on an iPod.

    Comment by Ray Ladbury — 21 Sep 2008 @ 10:14 AM

  462. Nick: “Markets are inherently short-term…”

    I disagree; companies that plant trees for harvest 25 or more years out have a good market for their stock. Toyota’s investment in the Prius took long (or at least medium) term thinking. That investment probably had to start in a fairly major way nearly 20 years ago (first Prius went on sale in 1997), and is probably just now starting to pay off.

    Nick: “[Markets] ignore externalities unless forced to consider them. We need to be thinking decades ahead, and to have a coherent plan…”

    My comment addressed externalities (revenue neutral carbon tax). I agree about thinking decades ahead; I just don’t want a _government plan_. Politicians are doing very well if they can make honest decisions; don’t ask them to make good technical decisions.

    Comment by Steve Reynolds — 21 Sep 2008 @ 10:54 AM

    The general public might be forgiven for believing all the hype from techno-fix enthusiasts that genetically engineered crops are going to feed the world. The record so far?

    http://www.signonsandiego.com/uniontrib/20080618/news_lz1e18gurian.html

    Comment by CL — 21 Sep 2008 @ 11:26 AM

  464. Steve,
    Your point about trees is a valid one, but not relevant to the development and deployment of new technologies on a massive scale. A revenue neutral carbon tax would be better than nothing, but not much better. We need a coherent (but flexible) plan because, depending on the approach taken, quite different investment priorities will be needed, and these will need to be applied across company, industry and national boundaries. Governments did not leave winning WW2 to the market for precisely these reasons – they took technical decisions based on the best available expertise, and directed companies in key sectors as to what they were to make and how much they could charge for it. If they hadn’t, the Axis would probably have won. Our survival as a civilisation is just as much at stake now as it was then. I’ve noticed many Americans have a knee-jerk hostility to government planning, but in fact governments do it all the time – how do you think road networks get built?

    Comment by Nick Gotts — 21 Sep 2008 @ 12:03 PM

    Steve, faith in free markets untrammeled by regulation is akin to faith in the open range unlimited by fences. Stampedes happen. Arguing this is being done on plenty of blogs; this isn’t a useful place to do it.
    Try http://calculatedrisk.blogspot.com/ for that kind of discussion, eh?

    Just speaking as an interested reader — there’s a reason for keeping focus here.

    Comment by Hank Roberts — 21 Sep 2008 @ 12:52 PM

  466. Steve #462.

    You can disagree with that statement from Nick.

    This doesn’t mean you are right.

    Electric cars have been on the cards since the sixties for the average road user. For specific needs, since the fifties or even earlier. The research goes back to the turn of the last century. The aim of that research at the turn of the 19th to the 20th century wasn’t “to make electric cars”, however.

    As to the statement about short-termism, what about the current outsourcing mantra? I forget who said it, but I think it was at Ford. When you’ve moved your work abroad or replaced people with robots, who is going to buy your cars?

    But that doesn’t stop outsourcing or RIFs happening. Why? Because “markets” don’t care if the ten-year forecast is bad. This year, this quarter are what matter. Look at Yahoo and the demand from one investor to sell Yahoo to Microsoft. Never mind that merging two competitors in an industry is rarely to the benefit of the company being bought. Never mind that MS would want only the names and accounts Yahoo had and go hang the rest of the business. It only mattered that MS would pay a premium on shares that would in a year be worthless. But when you can sell those shares on the uptick...?

    Comment by Mark — 21 Sep 2008 @ 4:22 PM

    Nick (459), marketing is inherently short term, but research and development as carried out commercially by some is usually long term, though measured in years up to a decade or so, probably not “decades” (plural) as you say.

    Comment by Rod B — 21 Sep 2008 @ 5:21 PM

  468. Ray: “The problem with market-driven research is that since it is application-focused, it rarely advances basic science. That is the sort of research that is lagging today–especially in the US and Asia.”

    It would be interesting to try to quantify that. Are Europeans getting most of the Nobel prizes now?

    I’m curious what specific basic research you think needs more funding (not just more for all)?

    Comment by Steve Reynolds — 21 Sep 2008 @ 8:09 PM

  469. Nick: “We need a coherent (but flexible) plan because, depending on the approach taken, quite different investment priorities will be needed, and these will need to be applied across company, industry and national boundaries. Governments did not leave winning WW2 to the market for precisely these reasons – they took technical decisions based on the best available expertise, and directed companies in key sectors…”

    I agree that can work for a limited time (4 years for the US in WW2) when everyone is dedicated to the cause, and cost is no object. Efficiency may not be high, but dedication helps to correct the worst problems and abuses. NASA’s Apollo program is another example.

    But when the time frame lengthens, bureaucracy grows (NASA post Apollo), and politicians develop ways to profit from power, then the dedication of a few individuals is not enough to make much progress toward long term goals.

    Hank, if we are going to talk about how to implement solutions at all here, then I think it is not OT to consider how to make them successful, which includes preventing the whole thing from turning into another boondoggle.

    Comment by Steve Reynolds — 21 Sep 2008 @ 8:38 PM

  470. Lloyd Flack,

    Thanks for a clear and interesting discussion on modelling. (#457.)

    Comment by Kevin McKinney — 21 Sep 2008 @ 9:50 PM

  471. True, Rod, #467, however, the first thing to get cut for almost all companies when times are hard is R&D.

    Last being C*O remuneration.

    Sigh.

    Comment by Mark — 22 Sep 2008 @ 3:05 AM

  472. To all those who commented on my post.

    1. Your comments are the reason I don’t usually frequent boards like this (see my comment on mud).
    2. I would hardly say there is consensus out there.
    3. I would say the current likely candidate for primary driver is solar activity in concert with orbital variations. Not to say CO2 doesn’t have an effect, but it’s been significantly hyped.
    4. To whoever it was that mentioned reading peer reviewed journals. You may have not noticed, but the internet is awash with information (inc, peer reviewed data), not just opinions.
    5. What’s the advantage of using my real name? I could use Steve Michell and you wouldn’t have a clue if it was real or not.

    bye now

    Comment by Barelysane — 22 Sep 2008 @ 4:39 AM

  473. Barelysane,

    1. Your comments are the reason I don’t usually frequent boards like this (see my comment on mud).
    My, you are a fragile flower, aren’t you?

    2. I would hardly say there is consensus out there.
    There is among relevant experts.

    3. I would say the current likely candidate for primary driver is solar activity in concert with orbital variations. Not to say CO2 doesn’t have an effect, but it’s been significantly hyped.
    There has not been significant change in either solar activity or the Earth’s orbit over the past 50 years. Hence they cannot be responsible for the rapid warming over that period.

    Comment by Nick Gotts — 22 Sep 2008 @ 7:47 AM

  474. Nick

    1. Fragile no, easily annoyed yes.
    2. At the risk of sounding like a pantomime, oh no there isn’t. Though admittedly this depends on your definition of ‘relevant expert’. I’m sure you’re aware of the 32,000 plus scientists who would disagree with there being a consensus on AGW.
    3. That’s just plain wrong. I’m not going to quote anything, there’s a wealth of info at your fingertips if you want to go looking.

    [Response: Oh please. 32,000 chiropractors and dentists define the scientific consensus on AGW? Hardly. You can sensibly argue about what that means, what is covered and what is not. But declaring it doesn't exist is denial 101. - gavin]

    Comment by Barelysane — 22 Sep 2008 @ 8:24 AM

  475. Barelysane,
    Smart people don’t evade; they discuss with facts and scientific hypotheses. Otherwise you are just calling us names. And I didn’t ask you to just give any name, I asked you to be truthful about your real name.

    Comment by Figen Mekik — 22 Sep 2008 @ 8:43 AM

  476. “To whoever it was that mentioned reading peer reviewed journals. You may have not noticed, but the internet is awash with information (inc, peer reviewed data), not just opinions.”

    It was me. Scientists like to be specific, even about trivial things like this. No arm-waving. But to answer you, yes we know, we read those peer reviewed papers; lots of them, both on and off the net. I am suggesting you do the same.

    Comment by Figen Mekik — 22 Sep 2008 @ 9:21 AM

    Gavin – Please follow the link below; you might find it enlightening.
    http://www.petitionproject.org/gwdatabase/GWPP/Qualifications_Of_Signers.html

    [Response: Ok then, find me ten people who have expertise and peer-reviewed publications in climate science who've signed it. Just ten. And then explain to me why I should value their signature on this refried petition over anything that is in the literature. Because if you think that science is a democracy, you are woefully confused. The consensus is just an observation, not a reason for something to be true. - gavin]

    Comment by Barelysane — 22 Sep 2008 @ 9:44 AM

  478. You ask where I think research is going begging. First, if you look at the model promoted by Vannevar Bush, the whole point was to fund research across the board, precisely because you can’t pick the big winners in advance. If a field is progressing rapidly, it will attract talent, even from folks nominally outside that field–viz. the efforts of Szilard in biology post WWII.

    Having said that, there are some areas that have really been hurt by recent cuts. One of them is Earth science. It is appalling that we know the topography of Mars and Venus better than we do that of Earth. NASA specifically has taken some very hard hits in Earth Science. In part, this is to pay for the new “exploration” effort, but the Administration has de-emphasized Earth science even in the Agency’s mission statement (btw, if you haven’t read “Home Planet” by Tom Bodett, you should–a stitch). As to your characterization of NASA’s dysfunctionality, it is not merely the bureaucratic culture, but the feuding between two cultures of science and exploration. That was there from the inception.

    Materials science is another area where progress has been stifled by lack of funding.

    More to the point, when I was a Grad Student, I had no trouble making ends meet on a graduate stipend, while I know of very few who could do so now. The cost of technical books has gone through the roof, as have tuition and living expenses. I came out of grad school $20,000 to the good. It is rare for a student to emerge now with a science or engineering degree without a significant amount of debt. This changes the calculus for bright students–if they’re smart enough to pass calculus, they’re smart enough to see that a PhD in science or engineering won’t pay as much as an MBA. Now you’re going to tell me that this is just efficient use of talent. However, the geniuses on Wall Street seem to be too busy wrecking the economy to invent the energy resources of the future or the technology we will need to deal with climate change. And in China and India they are minting new engineers at a scary rate, but they don’t seem to be too interested in those problems just yet either. All in all, in this respect, the command economies seem to be kicking the butts of market economies when it comes to infrastructure–be it physical or intellectual.

    You mention the metric of Nobel Prizes. Europe has made some significant inroads there, but the trend is even more evident in patents granted. Europe overtook the US for the first time a few years ago.

    Comment by Ray Ladbury — 22 Sep 2008 @ 10:00 AM

  479. # 474

    In addition to Gavin’s link on the consensus, you can see my sample of the 32,000 “chiropractors and dentists” (although I think gavin has given them too much credit). The very first person in the “A column” has research interests including intelligent design! Come on!

    Comment by Chris Colose — 22 Sep 2008 @ 11:11 AM

  480. But when the time frame lengthens, bureaucracy grows (NASA post Apollo), and politicians develop ways to profit from power, then the dedication of a few individuals is not enough to make much progress toward long term goals. – Steve Reynolds

    I’d agree that’s a serious problem; I think we need planning institutions designed with the faults of bureaucracy in mind – specifically, they need to be as open to critical scrutiny and broad participation as possible. In this regard, we have a great potential advantage over WW2 and even Apollo: no need to keep secrets from enemies or rivals.

    Comment by Nick Gotts — 22 Sep 2008 @ 12:29 PM

  481. Re Barelysane @474: “I’m sure you’re aware of the 32,000 plus scientists who would disagree with there being a consensus on AGW. …
    … I’m not going to quote anything, there’s a wealth of info at your fingertips if you want to go looking.”

    That’s it? That’s all you’ve got? A fraudulent petition signed by 32,000 people, almost none of whom have any qualifications whatsoever to assess the current science, and your opinion, unsubstantiated by even a single reference, that the current primary driver is solar activity in concert with orbital variations?

    And you wonder why no one here takes you seriously?

    Prescient Captia: to Unmasked

    Comment by Jim Eager — 22 Sep 2008 @ 2:49 PM

  482. Chuckle.

    http://moregrumbinescience.blogspot.com/2008/07/petitioning-on-climate-part-1.html

    “… As a general reader with no knowledge of the underlying science, this just looks very bad to me.
    * The project and people citing it want me to believe that there is serious, large scale, scientific opposition to the science on climate.
    But:
    * Their ‘big’ number is grossly padded by people who have not studied climate science nor worked in it.
    * It isn’t a ‘big’ number. The fields they are including are huge. …
    * …they don’t, on their list of signers, include what the field was for the PhD….
    … in part 2, I’ll take a look at how many AGU Fellows have signed. As an AGU member (ordinary member) I’ve received the mailing myself, so I’m sure that they have as well….
    _____________________________________

    ReCaptcha: Resale Good-bye

    Comment by Hank Roberts — 22 Sep 2008 @ 3:56 PM

  483. RE #441 gavin:

    Glad to hear that. Perhaps you’d like to point me to the economists’ consensus reports over the last 15 years that predicted the sub-prime mortage debacle and provided projections of what would happen under ‘business-as-usual’?

    Plenty of people predicted the sub-prime mortage mess before it happened. But as with most bubbles, everyone also thought the losses would be borne by someone else.

    Or a paper from 1988 predicting current economic trends to within 10%? I find it amazing that you think we have a predictive capability in economics that is in any way comparable to that of climate physics on any time scale.

    Show me a paper from 1988 predicting global temperatures plateauing for the past decade or so.

    There is economic “weather” and economic “climate”, just as there is ordinary weather and climate. Comparing apples to apples, it seems to me we have a better understanding of the economic climate than we do of the ordinary climate.

    [misspellings deliberate to get past the spam filter]

    Comment by mugwump — 22 Sep 2008 @ 4:14 PM

  484. RE #478:

    All in all, in this respect, the command economies seem to be kicking the butts of market economies when it comes to infrastructure–be it physical or intellectual.

    Do you really believe that Ray? How much of the world’s new technology and products are invented, designed or developed in those command economies?

    China is the world’s manufacturing base because of cheap labour and a strong entrepreneurial culture, but as yet they have had very little impact on the frontiers of science or technology.

    Comment by mugwump — 22 Sep 2008 @ 4:28 PM

  485. Barelysane,

    Please give your evidence of any trends in solar forcing. You see, I thought orbital perturbations acted on a timescale of millennia. I also thought that the components of solar output capable of affecting temperature had not shown any trend in the last fifty years that could explain the global warming in that period. There are plenty of articles and links on this site dealing with these.

    So be specific: what trends are you speaking of, and what is the source of your information? No hand waving please; information derived from direct instrumental readings only.

    The fact is you are letting your politics influence your judgment on a scientific question. You believe that we are not affecting the climate because you don’t want to believe it and you are looking for reasons to believe what you want to believe. And in case you ask what about me, my politics are probably closer to Mugwump’s than they are to those of most of the posters and commenters here. The Universe sometimes requires that I support actions that I would rather I didn’t have to support.

    Comment by Lloyd Flack — 22 Sep 2008 @ 4:41 PM

  486. Is the moderation slipping a bit?

    How much longer do we have to be afflicted with Mugs’ recycled economic theory? Do we really understand economics better than physics? Give me a break.

    [edit]

    Enough is enough. Gavin, your patience is that of Job. These two are sucking up bandwidth, as am I in responding to them.

    And for Mugs, who is so sensitive to “deliberately dismissive diminutives”: I am deliberately dismissive of thought and expression diminutive in its content. BTW I’m still on the edge of my chair waiting for Mugs’ analysis of linear/nonlinear climate sensitivity or whatever.

    Pauly Pops and proud of it

    Comment by Paul Middents — 22 Sep 2008 @ 8:27 PM

    Re #477 response: “Climate science” is a broad enough category and many of the petition signatures are old enough that I bet I can find ten. Unless, that is, you’re going to unreasonably insist that they be presently among the living and that the signatures be otherwise verified. There are even a few we can know are legit without checking, e.g. Dick “Contrarian Spice” Lindzen.

    Comment by Steve Bloom — 23 Sep 2008 @ 1:37 AM

  488. Rod B #426:

    Also, some truth in the validation of circulation models; but that ignores the great lengths that climatologists go to disconnect weather from climate models; actually not great lengths per se but the reasons for the great lengths — simply that they are different.

    Eh… you’re trying to say that they are just fine and dandy until someone has the temerity to compute an ensemble average over them… right?

    Sure. Computing averages is risky business ;-)

    Comment by Martin Vermeer — 23 Sep 2008 @ 4:25 AM

  489. Nick writes, correctly:

    There has not been significant change in either solar activity or the Earth’s orbit over the past 50 years. Hence they cannot be responsible for the rapid warming over that period.

    To which barelysane replies:

    That’s just plain wrong. I’m not going to quote anything, there’s a wealth of info at your fingertips if you want to go looking.

    Nick is right and you are wrong. Take a look:

    http://members.aol.com/bpl1960/LeanTSI.html

    Comment by Barton Paul Levenson — 23 Sep 2008 @ 5:01 AM

  490. mugwump writes:

    Show me a paper from 1988 predicting global temperatures plateauing for the past decade or so.

    They haven’t:

    http://members.aol.com/bpl1960/Ball.html

    http://members.aol.com/bpl1960/Reber.html

    Comment by Barton Paul Levenson — 23 Sep 2008 @ 5:26 AM

  491. mugwump:

    You seem to be assuming that the sensitivity will shake out at 2 degrees per doubling rather than 3 or 4.5.

    That’s my guess, but not an assumption. All I am saying is we should get better bounds on the sensitivity before making drastic changes to our economy.

    A bit like saying we’ll toss a coin six times and bet the planet that we’ll only get 0, 1, or 2 heads.

    Comment by Chris O'Neill — 23 Sep 2008 @ 6:47 AM
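
    The arithmetic behind that coin analogy, for anyone who wants to check it (a fair coin is assumed, so this illustrates only the odds in the analogy, not the actual sensitivity distribution):

        from math import comb

        # Probability of at most 2 heads in 6 tosses of a fair coin
        p_low = sum(comb(6, k) for k in range(3)) / 2 ** 6
        print(round(p_low, 3))  # 0.344

    In other words, the bet wins only about one time in three; it is weighted roughly two to one against you.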

  492. Marcus,

    Re:Christy & Douglass I think you pretty much nailed it. They seem to have done their calculations based on just the tropics, having rejected the Global, Northern and Southern extratropic anomalies because the Northern extratropics show more rapid warming than the tropics or the globe

    “However, it is noted that NoExtropics is 2 times that of the global and 4 times that of the Tropics. Thus one concludes that the climate forcing in the NoExtropics includes more than CO2 forcing. …”

    whereas everywhere else it’s pure CO2 and nothing else?

    “The global values, however, are not suitable to analyze for that signal because they contains effects from the NoExtropic latitude band which were not consistent with the assumption of how Earth’s temperature will respond to CO2 forcing. ”

    They then conclude that as the tropical warming trend is approx the same as the theoretical ‘no-feedback’ warming from CO2 in the tropics then the feedback term g must be near unity and that this conclusion is contrary to the IPCC [2007] statement: “Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations.”

    Now I have precisely zip credentials in climate science but surely the glaring error is to assume that a globally uniform forcing from well-mixed CO2 should produce a uniform temperature change? Is the more rapid warming in the North not an expected consequence of the greater proportion of land, with its lower heat capacity, than the mainly oceanic South, rather than evidence that other forcings are at work?

    As with Marcus, I could go on, but this flaw seems to me sufficient to dismiss the paper and its conclusions, or have I missed something?

    [Response: Nope. You have it exactly right. - gavin]

    Comment by John Philips — 23 Sep 2008 @ 7:52 AM

  493. # 491 Chris O’Neill,
    I suspect it’s more likely to be within 0.5 degrees either way of 3.0. Different ways of estimating the sensitivity are coming up with about the same range. Combining these results in a meta-analysis should reduce the range somewhat. This is one of the areas where climate scientists are not taking advantage of up-to-date statistical methods.

    We will not be able to prevent all the coming harm from global warming no matter what we do. And of course, efforts to mitigate it will eventually run into diminishing returns. The question is at what point the benefits of reducing greenhouse gases become less than the cost of further reductions. We are a long way from that point. I think we can and will need to take some pretty drastic actions to slow down global warming. But much of it will be a lot of little things.

    Complicating this is the fact that most of the costs will be borne by future generations. What time discount do we apply to future damage? We have to apply some discount, but how much? And since this is a cumulative problem, the more we delay action the greater the cost. To delay action is an attempt to maximize the chance of the best possible outcome. But if the best possible outcome is unlikely then this is usually not a wise choice. It is a better idea to minimize our losses in the more likely outcomes.

    Comment by Lloyd Flack — 23 Sep 2008 @ 8:17 AM
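
    A minimal sketch of why the discount-rate question matters so much. The damage figure, the 90-year horizon and the rates below are all arbitrary, chosen only to show how strongly the present value of a far-future cost depends on the rate chosen:

        # Present value of a hypothetical future damage under different discount rates
        def present_value(damage, rate, years):
            return damage / (1.0 + rate) ** years

        damage = 1.0e12  # hypothetical damage in dollars, incurred 90 years from now
        for rate in (0.00, 0.01, 0.03, 0.05):
            pv = present_value(damage, rate, 90)
            print(f"discount rate {rate:4.0%}: present value ${pv / 1e9:7.1f} billion")

    At a zero rate the full damage counts today; at 5% the same damage shrinks to roughly one percent of its face value, which is why the choice of rate can dominate any cost-benefit comparison of early versus delayed action.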

  494. Mugwump, re: scientific innovation and command economies vs. America.

    When was the last semiconductor fab built on the N. American continent? Have you tried to find a US-born PhD to fill a technical position lately? I have, and it is not an easy thing to do. China and India dwarf our output of scientists and engineers–hell, even from our own grad schools. If you look at the US scientific community, it is getting increasingly long in the tooth (myself included) and there isn’t anybody coming along to take our place. I suspect that in the future we’ll have nothing but MBAs and nobody for them to manage. The future ain’t bright.

    Comment by Ray Ladbury — 23 Sep 2008 @ 9:04 AM

  495. Lloyd Flack:

    I suspect it’s more likely to be within 0.5 degrees either way of 3.0. Different ways of estimating the sensitivity are coming up with about the same range.

    Yeah, but there are people like mugwump who think that if there is some chance that sensitivity is less than or equal to 2.0 then we should act on the assumption that sensitivity will turn out to be 2.0. Somewhat like if there is some chance we will win a spin of roulette then we should assume that we will actually win the spin. Maybe mugwump likes to play “let’s bet the planet”.

    Comment by Chris O'Neill — 23 Sep 2008 @ 12:04 PM

  496. Ray: “…the metric of Nobel Prizes. Europe has made some significant inroads there, but the trend is even more evident in patents granted. Europe overtook the US for the first time a few years ago.”

    But you are concerned about long term basic research not being funded by market driven corporations. Patents do not seem a good metric for that. I think patents are more indicative of the size of the organization doing the research – not much small business in Europe.

    Ray: “When was the last semiconductor fab built on the N. American continent? Have you tried to find a US-born PhD to fill a technical position lately?”

    I think Intel built one in New Mexico in the last few years, but that has little to do with scientific innovation. It is almost all taxes/incentives and labor cost.

    I have tried to find US-born workers to fill technical positions recently. You are correct that PhDs are rare, but there are plenty of BS and MS available. I think they have recognized that they can learn just as fast on the job and be well paid for it.

    Comment by Steve Reynolds — 23 Sep 2008 @ 1:45 PM

  497. You are correct that PhDs are rare, but there are plenty of BS and MS available.

    I agree. And MS is more desirable than PhD for most positions outside a pure research lab.

    PhD is by-and-large an academic qualification, pursued by those interested in basic research. Although we do a fair bit of R&D here, I prefer to hire good quality Masters graduates for that since they are generally more willing to do the drudge work than their higher-qualified brethren.

    Comment by mugwump — 23 Sep 2008 @ 1:55 PM

  498. Chris, how about we assume the sensitivity will be between 2 and 4.5?

    Y’know, like the actual models do?

    And these models show that the expected losses from “Business as usual” are orders of magnitude greater than the cost of a “stop here. No more” change in the economy.

    Expected cost = integral (f(s)*p(s)) ds (s=0, s=inf)

    where f(s) is the cost of damages from a sensitivity of s
    and p(s) is the probability density of sensitivity s

    And NOT (as mugwump wants it to be)

    integral (p(s)) ds (s=0, s=inf)

    as evidenced by his “astounding” discovery that the probability of sensitivity being 1 is greater than it being 4.5. Or whatever he was wittering on about.

    (dang, edit that last one, but I can’t. Equation should change)

    Comment by Mark — 23 Sep 2008 @ 2:09 PM
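
    A minimal numerical sketch of that expected-cost integral. Both the sensitivity distribution and the damage function below are invented purely to show the mechanics of weighting damages by their probability; neither comes from this thread nor from any published study:

        import numpy as np

        s = np.linspace(0.5, 8.0, 2000)  # climate sensitivity, K per doubling of CO2

        # Assumed lognormal-shaped probability density peaked near 3 K (illustrative only)
        sigma = 0.3
        p = np.exp(-0.5 * ((np.log(s) - np.log(3.0)) / sigma) ** 2) / (s * sigma * np.sqrt(2 * np.pi))

        # Assumed convex damage function: costs rise faster than linearly with sensitivity
        f = 0.01 * s ** 2  # fraction of GDP, illustrative only

        expected_cost = np.trapz(f * p, s)  # integral of f(s) * p(s) ds
        print(f"expected cost ~ {expected_cost:.3f} of GDP")

    The point of the weighting is that the low-probability, high-sensitivity tail still contributes to the expectation, because f(s) is large exactly where p(s) is small; assuming the most probable sensitivity and ignoring the rest throws that contribution away.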

    mugwump, since the discussion has strayed into economics and politics, suppose you WERE convinced that AGW was unequivocally the cause of current warming and that the projected effects did indeed pose a serious risk to the economy and even the well-being of descendants. What would be your political and economic strategy? Most proposed solutions are regarded as left-wing. I would like to see what the right-wing approach is.

    Comment by Phil Scadden — 23 Sep 2008 @ 8:21 PM

  500. Mark: “And these models show that the expected losses from “Business as usual” are orders of magnitude greater than the cost of “stop here.”

    Do you have a peer reviewed reference for that? There might be some, but they are just as far out of the economic mainstream as the papers viewed negatively here are outside the climate science mainstream.

    Comment by Steve Reynolds — 23 Sep 2008 @ 8:22 PM

  501. Mugwump and Steve Reynolds:

    See:

    http://www.aip.org/statistics/trends/highlite/ed/figure14.htm

    http://www.aip.org/statistics/trends/highlite/ed/figure8.htm

    http://www.aip.org/statistics/trends/highlite/ed/figure9.htm

    Actually, patents are an excellent metric. They show the level of applied R&D, and give a good idea of where future economic growth is headed.

    Comment by Ray Ladbury — 23 Sep 2008 @ 9:10 PM

  502. Ray, talking about European scientific innovation, this seems pretty inefficient:

    [Repair time] would go past the shutdown already scheduled for CERN’s facility to begin its winter break. It usually shuts in mid-November and resumes at the end of March or early April, to avoid its heavy use of electricity during the winter months when Europe has high demand for power.

    Comment by Steve Reynolds — 23 Sep 2008 @ 9:53 PM

  503. Steve, Actually, CERN lost more time than this due to failures of US magnets. They are opening up the biggest machine ever made–there are bound to be some glitches. And you will note that they have a supercollider, while ours is still a hole in the ground in TX, despite over a billion dollars of expenditures. Now THAT is inefficient.

    A quick look at how the respective currencies are doing (and have done since introduction of the Euro) provides another indication.

    Comment by Ray Ladbury — 24 Sep 2008 @ 7:40 AM

  504. Further on innovation and patents, though not directly related to Ray’s point, an interesting item on a forthcoming report claiming current IP systems are stifling rather than promoting innovation (the lack of practical results from recent advances in genetics is used as an example).

    http://news.bbc.co.uk/2/hi/science/nature/7632318.stm

    Whatever you think of the particular claim, it reinforces Ray’s point that innovation doesn’t just happen – you have to get the supporting institutional systems right.

    Comment by Nick Gotts — 24 Sep 2008 @ 7:47 AM

  505. Ray, talking about European scientific innovation, this seems pretty inefficient…

    Saving money is suddenly inefficient? They have a choice as to when to run the thing – when electricity is cheap, or when electricity is expensive. Choosing to only run it when electricity is cheap says nothing about the efficiency of European science and engineering.

    Comment by dhogaza — 24 Sep 2008 @ 9:47 AM

  506. > inefficient

    Tasty bait, nicely jiggled

    Comment by Hank Roberts — 24 Sep 2008 @ 1:01 PM

  507. And further to #505.

    The laws of physics will not change because there’s a new year.

    Comment by Mark — 24 Sep 2008 @ 1:38 PM

  508. Ray: “They are opening up the biggest machine ever made–there are bound to be some glitches.”

    I agree, and did not mean to criticize that. But I expect the capital cost of ‘the biggest machine ever made’ is pretty high, so the efficiency of only running it 1/2 the year because of electricity costs seems pretty questionable. Either it should have been built to be more energy efficient, or located in a place with less expensive energy.

    Comment by Steve Reynolds — 24 Sep 2008 @ 2:04 PM

  509. You can look it up.
    http://www.spectrum.ieee.org/sep08/6690

    —–excerpt follows——-

    … CERN’s estimated annual electricity consumption could approach 1000 gigawatt-hours …. The massive LHC will account for about 60 percent; less than 15 percent of the total will go to mundane functions like keeping the lights on; and the other accelerators in the complex will account for the rest. A big part of the consumption is the hundreds of enormous superconducting magnets, … cryogenically cooled to temperatures between 1.8 and 4.5 kelvins (colder than outer space). If the temperature creeps even a fraction of a kelvin above that, the magnets stop working and lose control of the beam. An uncontrolled beam can melt 500 kilograms of copper in an instant, causing serious damage and halting the experiment for months. So it is crucial to keep power flowing into CERN at all times.

    But CERN does not generate any of its own power, so how does it ensure an unbroken supply of electricity?

    The LHC’s location enables a unique power procurement system: power comes in from both France and Switzerland. CERN has an agreement with French supplier Électricité de France (EDF) that guarantees a source of reliable, affordable electricity, with one caveat: for 22 days a year during the winter, power costs become prohibitive. (During that time, all the experiments at CERN are shut down.) The contract stipulates that the accelerators will operate mainly from spring to fall, when the public strain on the electrical grid is low. The agreement also means that CERN must reduce its electricity consumption on demand or pay a whopping fine.

    But what if EDF’s system fails? Because the results of a power outage would be so disastrous, CERN also has a number of backup plans….

    Comment by Hank Roberts — 24 Sep 2008 @ 3:03 PM

  510. Either it should have been built to be more energy efficient…

    Dude, if you can figure out how to generate the magnetic fields required more efficiently than by using superconducting magnets, I’m sure they want to hear from you, just before you hear from the Nobel Committee.

    Comment by dhogaza — 24 Sep 2008 @ 4:00 PM

  511. Steve,
    Actually it’s a pretty efficient way of running things–you take data half the year and analyze it the other half at your home institution. Remember this is a global institution–the US, Japan et al. are paying to play. Even running half the year, they will have many, many terabytes of data to analyze. Dhogoza is right–the superconducting magnets are about the acme for a cyclotron, and you have to consider that energy is not the only consideration. Each experiment is like a small city, and you have to supply the folks there with the wherewithal to live and work.

    Comment by Ray Ladbury — 24 Sep 2008 @ 5:25 PM

  512. Please consider the following question………….

    CAN WE EXAMINE THE NEED TO SAVE EARTH’S ECOLOGY AS WELL AS THE MANMADE ECONOMY?

    In light of the increasing number of emergent and convergent, human-driven challenges that appear before the family of humanity on the far horizon, I believe it is vital for the blogging community to come together and, if only for a few moments, “get real” about what our species is doing, here and now, in these early years of Century XXI, to extirpate biodiversity, degrade the environment, dissipate Earth’s resources and threaten the very existence of life as we know it.

    Once the economy has been bailed out, I would like the self-proclaimed Masters of the Universe among us, the ones with hundreds of millions of dollars in their private bank accounts, who are so adamant and urgent in their appeals to save the economy, to turn their attention, energy and vast wealth to the task of saving Earth and its environs from ruination.

    After all, what is the point of choosing to save the economy now if that choice means we could inadvertently ravage the Earth, upon which any manmade construction, even the colossal global economy, depends for its existence?

    What kind of economy can function without adequate resources and ecosystem services only the Earth provides?

    Steven Earl Salmony
    AWAREness Campaign on The Human Population, established 2001
    http://sustainabilitysoutheast.org/index.php

    Comment by Steve Salmony — 24 Sep 2008 @ 6:11 PM

  513. Ray: “Actually it’s a pretty efficient way of running things–you take data half the year and analyze it the other half at your home institution. Remember this is a global institution–the US, Japan et al. are paying to play. Even running half the year, they will have many, many terabytes of data to analyze.”

    Yes, but if more than one group is doing experiments, one can run while the other analyzes.

    Ray: “Dhogoza is right–the superconducting magnets are about the acme for a cyclotron…”

    Sure, they use no energy directly to maintain the field, so I assume cooling requires most of the power. If they are spending $100 million a year on electricity, did they choose sufficiently efficient insulation and cooling systems?

    Mark: “The laws of physics will not change because there’s a new year.”

    That argument could be used to say CERN should wait until the AGW problem is solved before they use all that energy.

    Comment by Steve Reynolds — 24 Sep 2008 @ 8:13 PM

  514. Steve Reynolds suggests a new policy for LHC: “Yes, but if more than one group is doing experiments, one can run while the other analyzes.”

    And I’ll let you tell the group that gets to go second–OK? Actually that’s not practical for a whole bunch of reasons. The detectors are pretty much permanent–they’re the size of a large building and take about a decade to install. Only the builders of the detectors know how to use them, in general.

    As to efficiency, there are a whole lot of reasons why you don’t want to be inefficient. Should you lose cooling, you go from superconducting (zero resistance, very high current) to conducting (finite resistance, same current). This happened one time at Michigan State and deafened just about everyone in the building. Very impressive. Actually, they probably spend just about as much on Helium as on electricity. Believe it or not, Steve, they do think about things like this. It’s a pretty impressive undertaking–and you get a whole helluva lot more science out of it for the buck than you do with the space station (aka the orbiting pork barrel).

    Comment by Ray Ladbury — 24 Sep 2008 @ 9:17 PM

  515. > If they are spending $100 million a year on electricity,
    > did they choose sufficiently efficient insulation and cooling systems?

    Smells tasty, yes. Twitches as dragged by. Almost lifelike. Tempting ….
    Nah.

    Comment by Hank Roberts — 24 Sep 2008 @ 11:34 PM

  516. Ha!
    http://www.userfriendly.org/cartoons/archives/08sep/xuf011924.gif

    Comment by Hank Roberts — 24 Sep 2008 @ 11:36 PM

  517. Steve, considering where the LHC is located, I’d say their electricity likely comes from hydro and nuclear

    Comment by Philippe Chantreau — 25 Sep 2008 @ 1:03 AM

  518. I have it on good information that when the LHC was fired up the other day, a mini black hole was formed and the Earth was destroyed.

    Comment by Barton Paul Levenson — 25 Sep 2008 @ 6:45 AM

  519. Believe it or not, Steve, they do think about things like this.

    I’d even venture to suggest that they might even know more about the subject than Mr. Reynolds, [edit]

    Comment by dhogaza — 25 Sep 2008 @ 9:28 AM

    The UK Met Office recently released a report responding to climate change sceptics who think that global warming stopped in 1998. They state “Global warming does not mean that each year will be warmer than the last.” But as I understand things, it DOES mean that.

    Due to rising concentrations of greenhouse gases, every year the planet will retain more heat than it loses. But in recent years the excess heat has been stored in the oceans, not the air. Am I right? Can anyone confirm or deny?

    If I am right I think organisations like the Met Office should be more explicit that the world as a whole IS heating up every year, but the world’s air temperature follows a more haphazard upward trend.

    [Response: The met office is correct. We do not expect the global temperature to increase every year, year after year. There is plenty of variability (including, but not limited to, ENSO, NAO and PDO etc.). We expect the overall heat imbalance to be a little more steady, but there too, we do not anticipate an increase every single year - at least at the moment. - gavin]

    Comment by m lordy — 25 Sep 2008 @ 10:23 AM
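
    A toy illustration of the point, assuming nothing more than a made-up linear warming trend plus Gaussian noise standing in for internal variability (ENSO and the like); it is not a climate model:

        import numpy as np

        rng = np.random.default_rng(0)
        years = np.arange(1980, 2011)

        # 0.02 K/yr underlying trend plus 0.15 K of year-to-year noise (both made up)
        temps = 0.02 * (years - years[0]) + rng.normal(0.0, 0.15, years.size)

        cooler = int(np.sum(np.diff(temps) < 0))
        print(f"{cooler} of {years.size - 1} years came in cooler than the year before,")
        print("even though the underlying trend warms throughout.")

    Even with a relentless upward trend, a fair fraction of years come in below the previous one; that is all the Met Office statement is saying.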

  521. Barton,
    As Hank says, you can look this stuff up:
    http://www.hasthelhcdestroyedtheearth.com/

    ;-)

    Comment by Ray Ladbury — 25 Sep 2008 @ 11:28 AM

    There is an interesting discussion on this thread about the relative merits of basic and applied research. I used to think that “basic” research was important too, and served as the basis for applied research. But I’ve changed my mind after a few years in the scientific community, followed by a few years as an entrepreneur, and through a lot of reading on the history and sociology of science. The Vannevar Bush model may appear to have been hugely successful, but in fact it was only successful for those “basic” scientists who were flooded with funding to do things that had no relevance whatsoever to industry or the public good. This has reinforced a certain caste of scientists, who have come to believe that they deserved a privileged status. This lasted for the entire duration of the cold war, and afterwards things started to get ugly. I started my scientific career in the early ’80s at the time when big industrial labs, like Bell and IBM, started shrinking and shrinking. And so did research budgets. And so, too, did the status of scientists in our society.

    Now we are slowly reverting to the model that made the success of the Western world: a closer interaction between “basic” and “applied” science, a link that the Bush model had more or less severed, while pretending to strengthen it. This is the model of science as “useful knowledge”, in the words of Joel Mokyr.

    But one of the reactions of the scientific caste to this has been to retreat to their ivory tower, and try to re-establish their privileged status by claiming that their useless science is indispensable. My own pet theory (which I grant myself the permission to have) is that the climate change issue is an illustration of this phenomenon. Here we have scientists running a blog to tell us, poor peasants, what we ought to think, and what science is all about, and how asking for a simple explanation is not something we should do. Let us just trust them, and everything will be fine.

    When I left the shielded world of academia to become an entrepreneur, and try to make the science I had contributed to building bear fruit, I was faced with a real world where all those nice principles did not apply. Though I knew it, being confronted with it, and having to deal with it because it’s your livelihood that is at stake, makes it suddenly very real. In industry, things HAVE to work, no matter how and why. And what is a beautiful lab setup is NOT a good production setup. In a lab, you can repeat your experiment 100 times to get a good result. In industry, a yield below 90% will kill you.

    This post (written by a physicist like me) did and did not surprise me. A lot of scientists think like that. Computer models make you feel smart. With the click of a mouse, you get as many results as you want. The bigger the program, the smarter you feel. But when you get to the point where you can’t explain the results and just say “the model says it”, then you’re on a very dangerous track.

    The renowned physicist and former colleague of mine John Love once told me that if you can’t get an answer with a “back of the envelope” calculation, then the problem is not worth attacking. It’s something one should remember.

    My first encounter with a scientific problem was like that. I was faced with a problem that was intractable analytically, so I had to resort to a computer calculation. It was a coupled set of nonlinear differential equations. Something quite simple if compared to climate models, I admit. I wanted to see what would happen to spectral measurements under conditions of high laser intensity, when you saturate the absorption. At some point, I “discovered” a very interesting effect: that the Doppler broadening of the line would disappear at high intensity. But it was tricky to program and prone to instabilities, so I could never really be sure of my results: was it real or a numerical artefact?

    So I went back to the equations. Most published solutions would take them to the third order of perturbation. Nobody thought that the fifth order would make any significant change. But I went ahead and looked at the fifth order. There was a plethora of terms, but I found that most of them would not contribute, and with a very simple criterion I could select the few that did. And then I could relate this to a diagrammatic approach (à la Feynman) that made it even easier to see the important terms. Suddenly, what was an intractable problem that could only be solved numerically had a very simple graphical solution!

    But then came my first encounter with this great feature of the scientific institution: the publishing system. Since I was not from Bell labs or MIT, my paper was rejected in Physical Review, and I had to publish it in the Canadian Journal of Physics, needless to say a much less prestigious journal with much smaller readership. Nevertheless, five years later, while at the poster session of a conference, I noticed a poster with experimental results that demonstrated the effect that I had predicted. It was very satisfying for me, because I thought that nobody would ever pay attention to my results! But the real satisfaction was to see that the effect was for real!

    But in the end, the numerical calculation alone would not have been enough. I could have been wrong. At the very least, had I not simplified the problem I would not have understood the result. “Explanation” is important. “Simple explanation” even better.

    So to tell people that “explanation” is not important, and that we can trust complex computer models, is, in my opinion, not very good advice. To claim that this is how “scientists” work is pure hubris. I would be tempted to say that this is how less talented scientists work, but then I don’t believe that they are less talented. They just have not faced the real world.

    Comment by Francois Ouellette — 27 Sep 2008 @ 10:32 AM

    Francois, to say that “and that we can trust complex computer models, is, in my opinion, not very good advice. To claim that this is how “scientists” work is pure hubris.” is bad advice and evidence of hubris.

    Some integrals cannot be solved symbolically and MUST be solved numerically. Some systems cannot be predicted symbolically (the Game of Life, for example) and can only be explained AFTER simulation has shown what happens.

    To say that scientists don’t work that way indicates that YOU speak for all scientists and know how they ALL work.

    Now, rather than diss the current methods, will you come up with some CONSTRUCTIVE criticism?

    Comment by Mark — 27 Sep 2008 @ 1:50 PM

  524. The renowned physicist and former colleague of mine John Love once told me that if you can’t get an answer with a “back of the envelope” calculation, then the problem is not worth attacking. It’s something one should remember.

    The Four Color problem, and its solution, dismissed with the wave of a hand …

    Comment by dhogaza — 27 Sep 2008 @ 2:19 PM

    Mark, I thought this WAS constructive criticism. The bloggers here often presume to speak for the “scientific community”. They don’t speak for me, that’s for sure, but then maybe I’m not a member of the “community”, and maybe I don’t want to be. Maybe I’m just someone who practiced science for a number of years and tried to make some valuable contributions (useful knowledge…). I don’t think ANYONE should speak in the name of all scientists, or should give lessons on what is or what isn’t science.

    I sure know that some problems cannot be solved analytically. I certainly don’t think that climate models are useless. But there is great danger in blindly trusting complex computer models, and there is great value in “back of the envelope” calculations. In any case, a back of the envelope calculation will give you an answer that lies well within the range of estimates of complex models who do not seem to have improved over the past 15 years, so that fact alone should tell you something. I simply oppose the view that we cannot or should not even ask for a simple and clear estimate. Like in the example I gave, the model should rather help you find what is important and what isn’t. The model helps you simplify the problem. I don’t see how this would be such a bad approach.

    Comment by Francois Ouellette — 27 Sep 2008 @ 5:55 PM

  526. This reminded me of many of the posts that I read here…

    Economic and planetary collapse: Is it a therapeutic issue?

    http://www.energybulletin.net/node/37091

    ‘April fireboats’

    Comment by CL — 27 Sep 2008 @ 6:19 PM

  527. #525

    Well maybe this:

    “and that we can trust complex computer models, is, in my opinion, not very good advice. To claim that this is how “scientists” work is pure hubris.”

    wasn’t the right wording, else this:

    “I certainly don’t think that climate models are useless. But there is great danger in blindly trusting complex computer models,”

    is untrue.

    Who says anyway that the trust in complex computer models is blind?

    Comment by Mark — 27 Sep 2008 @ 6:53 PM

    Francois,
    Back-of-the-envelope calculations of greenhouse gas effects do have the same sign as what you get from complicated models and come within half an order of magnitude in size. Since such calculations leave out the feedbacks, they underestimate the effects by enough for them to be a poor guide to action. What they can do is provide a check that the complicated models are doing something plausible.

    Comment by Lloyd Flack — 27 Sep 2008 @ 7:13 PM
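
    For anyone curious what such a back-of-the-envelope check looks like, here is the standard no-feedback estimate: linearize the Stefan-Boltzmann law at the Earth’s effective radiating temperature and divide the canonical doubled-CO2 forcing by the result (3.7 W/m2 and 255 K are the usual textbook values):

        # No-feedback climate sensitivity from a linearized Stefan-Boltzmann response
        sigma = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
        T_eff = 255.0    # Earth's effective radiating temperature, K
        dF = 3.7         # radiative forcing from doubled CO2, W m^-2

        lambda_0 = 4 * sigma * T_eff ** 3  # ~3.8 W m^-2 per K
        dT = dF / lambda_0                 # ~1.0 K per doubling, before feedbacks
        print(f"no-feedback response ~ {dT:.1f} K per doubling of CO2")

    That is the “same sign, within half an order of magnitude” answer: about 1 K per doubling before feedbacks, versus roughly 3 K once water vapour, ice-albedo and cloud feedbacks are included in the models.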

  529. Francois Ouellette, Well, since ALL of the advances currently driving economic growth were developed under the Vannevar Bush model, I would contend that support for your argument is pretty weak. You speak of a “closer interaction” between basic and applied science–Good Lord, man, have you ever heard of the Manhattan Project? The Apollo Program? Fermilab or now the LHC? My own discipline is very applied and very practical. You are taking your own (quite limited) experience of science and generalizing from that to some very broad and incorrect conclusions. You know nothing of how science is done even in the majority of subfields of physics. You know even less of climate science. Where, for instance, do you get the idea that we accept the output of climate models “blindly”? Where do you get the idea that climate models have not improved? Do you even know any climate scientists? Then, how, pray tell do you conclude that they’ve retreated to the “Ivory Tower?” When you are so woefully misinformed on climate science, why should we trust your opinions about other matters (e.g. scientific funding).
    For your criticism to make it to “constructive,” you have a lot of ignorance to rectify. You are at the right place. Start reading where it says “START HERE”.

    Comment by Ray Ladbury — 27 Sep 2008 @ 8:10 PM

  530. #520 – Gavin’s response: “We expect the overall heat imbalance to be a little more steady, but there too, we do not anticipate an increase every single year – at least at the moment.”

    Could you please elaborate on the “at least at the moment” bit? Why is that? Would the imbalance be more steady with further rising temperatures or don’t we know enough about it yet?

    Comment by dagobert — 28 Sep 2008 @ 5:04 AM

  531. Francois Oulette posts:

    John Love once told me that if you can’t get an answer with a “back of the envelope” calculation, then the problem is not worth attacking.

    I agree. We shouldn’t have gone to the moon, or developed aviation, or heart pacemakers. Too complicated! Just stick with what you can do on the back of an envelope!

    Comment by Barton Paul Levenson — 28 Sep 2008 @ 5:40 AM

  532. In any case, a back of the envelope calculation will give you an answer that lies well within the range of estimates of complex models who do not seem to have improved over the past 15 years, so that fact alone should tell you something.

    Yes, outright falsehoods like this tell us that you don’t know what you’re talking about.

    Comment by dhogaza — 28 Sep 2008 @ 7:13 AM

    I think people were missing Steve Reynolds’ point: the LHC is located in decadent socialist Europe, so it’s by definition inefficient.

    Comment by Nick Gotts — 28 Sep 2008 @ 9:18 AM

  534. Barton Paul Levenson,

    Why such an overreaction?! I’m pretty sure that you could estimate what was needed to go to the moon with a back of the envelope calculation: how much mass you have to lift out of the Earth’s gravity well, how much fuel will be needed, etc. It’s not such a complicated problem. The engineers working on it mostly used slide rules as their primary tool. They certainly knew how to simplify a problem and get a quick answer. I never said that you could solve all the details of a problem with such a calculation, only that you should be able to quickly estimate the solution. The same with aviation. As a matter of fact, the first airplanes were developed much more on a “trial and error” basis than on actual theory. It took a long time for theory to catch up with practice. I think your choice of examples is rather poor.

    Having mentioned John Love, I should have added that he nevertheless went on to write a 400-page classic book on optical waveguide theory! It’s not like he doesn’t like to get into the details! The design of optical waveguides does rely on complex software, as does most design today. But from what I know, the use of such software is more of an art than a science. Put in the hands of non-experts, it’s likely to give results that don’t make sense. I had a friend who ran a successful consulting business because he knew how to use complex finite difference thermo-mechanical design software, and that’s because he had been a pioneer in developing such software for over 25 years.

    [edit]

    Comment by Francois Ouellette — 28 Sep 2008 @ 9:31 AM

  535. Ray Ladbury,

    Again, why such wrath? Why am I being told that I have “limited” experience with science and that I “know nothing” of how science is done? Are you the Ray Ladbury who writes for Physics Today? Or the one who studies damage to electronics? In any case, I don’t see that you would have much more extensive experience than I.

    Why would I have such limited experience of science? I’ve been on grant committees that covered the entire Information Technology field in Canada, I’ve reviewed hundreds of proposals. I’ve done due diligence for many technology-based startups, I’ve worked in biotechnology, and I’ve published a few dozen papers and reviewed many hundreds. I’ve spent the last two years studying the history and sociology of science, and have probably read many more books on the subject than you have. Have you read Joel Mokyr? John Ziman? David Hull? Margaret Jacob? John Zammito? Eliot Freidson? Diane Paul? And that’s apart from the “classics” of Thomas Kuhn or Karl Popper, and even Bruno Latour.

    Since I got interested in the issue of climate change, I have probably read more than 300 papers on the subject, and that’s in the primary literature.

    So please, before you accuse me of knowing nothing and having limited experience, try to get to know more. I won’t accuse you of anything, because I don’t know you.

    It does seem like it is IMPOSSIBLE to make any sort of criticism of climate science without suffering these personal attacks. Have you ever wondered why? Don’t you think that this phenomenon is quite revealing in itself?

    Now if you want to have an intelligent discussion of the effects of the Vannevar Bush doctrine and how it evolved in the post-cold war era, I’m very open. But if you just want to attack my credentials because that’s the only response you can think of, well let us end that debate right now.

    Comment by Francois Ouellette — 28 Sep 2008 @ 9:58 AM

  536. Francois Ouellette, Why the hostility? Hmm. How about the fact that you come on here with accusations that the entire climate science community is composed of effete snobs who have “retreated to the Ivory Tower”? How about your claiming that climate scientists accept the output of their theories and models uncritically because it makes them “feel smart”? How about your claim that any knowledge that cannot be applied to turning a profit RIGHT NOW is “useless”? To me, these sentiments represent not just red flags waved to provoke controversy, but profound ignorance of how science works and how it eventually leads to technology. By your reckoning, I suppose Benjamin Franklin should have put away his Leyden Jars and kites and stuck to printing presses and stoves. By your criterion, Rutherford was wasting his time firing alpha particles at foils of gold. We need to invest broadly in basic research precisely because we cannot pick the winners for tomorrow’s technologies. And you sure as hell are not going to populate the science and engineering schools of tomorrow if you preclude avenues for bright young people to satisfy their curiosity. Why would they go into science and engineering when they can tack another zero onto their paycheck by getting an MBA and still enjoy that Friday Afternoon Club party?
    Are you aware of the story Richard Rhodes tells in his book “Dark Sun” about the chemistry problem that confronted the Soviet nuclear weapon designers? They knew they couldn’t purify enough U-235 in time, so they knew (from their spies) that they had to go the Plutonium route. They also knew from their spies that the trans-uranics had a chemistry similar to the Rare Earths. Problem: The Politburo had forbidden study of the rare earths in the Workers’ Paradise as an extravagant luxury. The only reason they succeeded was because a chemist had risked the Gulag to satisfy his curiosity by doing experiments late at night. The Plutonium was produced using this chemist’s hurriedly copied notebooks.
    Yes, I’ve read lots of history and philosophy of science. However, I also think there’s something to be said for learning how science is done by actually doing it, don’t you?
    I’m afraid a detailed discussion of the relative merits of Vannevar Bush would take us too far afield from the topic of this board. This is a board where non-climate scientists (myself included) come to learn about climate science. If you want to learn about that subject, this is a good place to start. And certainly your comments betray an ignorance of how climate science is actually done. If you took offense at my tone, I apologize. I tend to respond to aggressive and provocative statements in kind.

    Comment by Ray Ladbury — 28 Sep 2008 @ 11:18 AM

  537. Ray, I don’t see how I was aggressive. Provocative, maybe. What’s the point of having 524 comments if it’s only to have cheerleaders? If the whole point of the blog is to learn about climate science, why have comments at all?

    What I’ve learned of climate science was through the primary literature, not blogs. Am I wrong to believe that it is a more reliable source? Now you accuse me of saying all sorts of things that I did not say. I never said, for example, that the entire climate science community was composed of “effete snobs”!!! That’s a little too provocative! I never said that knowledge that cannot be applied right now is useless!!! Where did you find that? Please quote what I said, not how you interpret it.

    For what it’s worth, two companies currently make a living off of one of my patented inventions, which was made in the context of academia, so I guess I should know something about technology transfer, and going from “basic” to “industrial”. I started off as a physicist in nonlinear laser spectroscopy, and ended up an entrepreneur running a business with 100 employees, sold to a major American corporation. Yet I have published many papers of “basic” research. So why should I not be entitled to have an opinion on the subject? Does the fact that I disagree with Spencer Weart’s opinion on simple explanations mean that I’m a dangerous skeptic paid by the oil industry? Would that not be simplistic?

    So what have you read apart from the plutonium story? Have you read Donald MacKenzie’s study of the history of intercontinental missiles? Have you read Loren Graham’s fine history of Soviet science? Or Naomi Oreskes’ history of Continental Drift, or Peter Bowler’s history of Darwinism, T.R. Reid’s history of the microchip, Robert Friedel’s history of western technology, or Joseph Needham’s history of Chinese Science and technology? How long a list do I need to supply to satisfy you that I’m not an ignorant bigot? How about your list?

    Now if you want to pursue the discussion, that’s fine. Let’s go. I’ll stay away from Vannevar Bush. Have you read Myanna Lahsen’s fine sociological study of climate modelers? All this is not to say that everything they do or say is wrong! But scientific communities have their own dynamics. We hear a lot about “science”, and there is a lot of confusion between the knowledge itself and the social institution that produces it. Many scientists brandish “science” as a rhetorical weapon, so that if someone attacks the institution, they reply as though the knowledge itself were under attack. But the scientific institution has many flaws, just as it has many virtues. I compare it with democracy: it’s a pretty flawed system, yet there is no better one. The important thing is that there is a lively public debate to uncover those flaws, and at least appear to correct or avoid them. No scientist I know will claim that peer review, for example, is a perfect process. Actually I know of no one who doesn’t think that it’s utterly biased and unfair. But if someone from outside the institution says the same thing, suddenly peer review is defended as the best thing since sliced bread!

    I have argued that climate science is very much representative of today’s pure academic science. I don’t know of any climate scientist who doesn’t work in an academic or quasi-academic setting, i.e. mostly publicly funded. Is that an ivory tower? Certainly, to a great extent. That means that they are not accountable for anything they do or say. I don’t see how a modeler would be fired if her predictions turned out to be wrong. The same can be said of particle physicists, or evolutionary biologists. But those two latter fields do not make predictions on which important public policies are likely to be based. In fact the last time evolutionary biologists did that, it was quite a fiasco! So one should be cautious about relying on an institution that does not have a track record of accountability, or where accountability is not part of the culture (as it is in industry, for example), if you have to make important public policies. Wall Street CEOs also do not have a culture of accountability. Whatever they do, they always have their golden parachute. I have seen plenty of that from up close.

    So feel free to comment on this humble opinion. I would appreciate if you would refrain from personal attacks this time.

    Comment by Francois Ouellette — 28 Sep 2008 @ 12:18 PM

  538. > Francois Ouellette Says:
    > 28 September 2008 at 9:31 AM
    > I’m pretty sure that you could estimate what was needed to go
    > to the moon with a back of the envelope calculation

    Jules Verne did the one-way calculation in about this much space:

    —————————-

    “… we shall have but 400,000 pounds of fulminating cotton; and since we can, without danger, compress 500 pounds of cotton into twenty-seven cubic feet, the whole quantity will not occupy a height of more than 180 feet within the bore of the Columbiad. In this way the shot will have more than 700 feet of bore to traverse under a force of 6,000,000,000 litres of gas before taking its flight toward the moon.”

    Barbicane and his bold colleagues, to whom nothing seemed impossible, had succeeded in solving the complex problems of projectile, cannon, and powder. Their plan was drawn up, and it only remained to put it into execution.

    “A mere matter of detail, a bagatelle,” said J. T. Maston.

    http://www.fourmilab.ch/etexts/www/etm/etm_chap9.html
    ————————————————-
    But you’ll notice nobody has tried his method.
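    And the envelope itself tells you why not. A rough sketch (Python, round numbers only, ignoring air resistance, which only makes things worse):

      v_escape = 11.2e3        # Earth escape speed, m/s (round figure)
      barrel = 700 * 0.3048    # Verne's 700 feet of bore, in metres

      # to reach escape speed inside the barrel: v^2 = 2*a*s, so a = v^2 / (2*s)
      a = v_escape ** 2 / (2 * barrel)
      print(round(a / 9.81))   # roughly 30,000 g; nobody climbs out of the projectile after that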

    I recall Heinlein said it took him and his wife days of work — using up many pencils, on rolls of brown butcher paper — to get the elements right for “Destination Moon.”

    Show your work?

    People are still trying to get this kind of thing right, and writing software to help:
    http://www.physicsforums.com/showpost.php?p=249113&postcount=36

    Comment by Hank Roberts — 28 Sep 2008 @ 12:23 PM

  539. Dear Mr Ouellette #537,
    You certainly expressed the difficulties that a lot of scientists and engineers, working as experimentalists in industry, have with the assumptions, proofs and papers about climate science that get published in popular journals and on the internet.
    Many of us are humbled by nature, because even if we understand the underlying basic physics and chemistry of our production processes, nature might surprise us with unexpected root causes and results. The certainties in climate science therefore come as a surprise.
    However, one also has to accept that the climate science community came to a consensus on how to evaluate the results of modeling. This evaluation in turn provides the evidence that leads to the conclusions of the IPCC. Moreover, one has to assume that the serious journals provide this insight.
    Unfortunately, the climate science community also seems to be very defensive if someone asks questions in order to understand the reasoning.
    In my experience, it is immediately concluded that one does not understand the basic physics, which is usually not the case. You have to get used to it. However, this blog on RealClimate is an excellent forum for very interesting discussions.

    Comment by Guenter Hess — 28 Sep 2008 @ 3:00 PM

  540. Francois, First, congratulations on your record of achievement. I mean that. I respect those who can turn their discoveries into useful technology. Now, maybe you can explain how these accomplishments give you special insight into climate science. You seem to be saying that your record of accomplishment should confer on you some special expertise when you refuse to grant respect to climate scientists in their own metier. You do know that these guys get PhDs in this stuff, right? And that they then work for 20-30 years to really reach the top of their game? Now why should I believe that you have reached such a pinnacle of expertise based on perusal of 300 papers–and I don’t even know which papers? Lahsen’s paper did not impress me much. Like so much of the crap published in STS (not all, mind you), it strikes me as rather shallow. She seems to have listened to the modellers only with an eye to quote mining rather than gaining understanding. Of course modellers have to take the output of their models with a grain of salt. Do you think that is news to them?
    Again, you seem to have complete contempt for any science that has no immediate application. You also have a misimpression that somehow the reality of the threat of climate change is contingent on climate models for its validity. This is not correct. You also misunderstand Spencer’s essay as supporting your position. We know the planet is warming. We know from basic physics that if you add CO2 to the atmosphere, the planet must warm. We know from paleoclimate that the sensitivity is likely somewhere between 1.5 and 5.5-6 degrees per doubling. Climate models place tighter limits on that range, and in so doing actually do more to limit risk than raise alarm. The models are also critical for looking at the implications of climate change, but again, here, their effect is more to limit risk than raise alarm. Without the models, we are flying blind in the Andes in a snowstorm.
    Francois, it’s great that you read about climate. That you have reached a conclusion that climate change is not a serious threat would suggest your sampling has not been representative.

    Comment by Ray Ladbury — 28 Sep 2008 @ 3:45 PM

  541. Francois, #537.

    Provocative. Meaning: to provoke.

    Again you use sentences and then say you never meant them.

    I hope your research is more clearly thought out.

    Comment by Mark — 28 Sep 2008 @ 4:29 PM

  542. Francois,
    We can create simplified models of the effect of the greenhouse gases themselves. These can be described on less than one sheet of paper. They will give us the sign of the change and an indication of the order of magnitude. But they do leave out some feedbacks and do not give us results that we can use for policy decisions. The results of these simplified models are indicative only. They provide us with a proof of principle and, to a lesser extent, with a reality check for the more complex models.
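    For instance, here is a minimal sketch of such a one-sheet model (Python purely for readability; the forcing formula and the Planck response are standard approximations, while the feedback factor is only illustrative):

      import math

      def forcing(c, c0=280.0):
          # approximate radiative forcing (W/m^2) for CO2 rising from c0 to c ppm
          return 5.35 * math.log(c / c0)

      planck = 0.3         # no-feedback (Planck) response, K per (W/m^2), approximate
      feedbacks = 2.5      # assumed net amplification from water vapour, ice and clouds; illustrative

      dF = forcing(560.0)                       # doubled CO2: about 3.7 W/m^2
      print(round(dF * planck, 1))              # ~1.1 K: the right sign and order of magnitude
      print(round(dF * planck * feedbacks, 1))  # ~2.8 K once an assumed net feedback is applied

    The point is that everything interesting is hidden in that single assumed feedback number, which is exactly what the complicated models exist to estimate.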

    But the climate is a complex system and some of the feedbacks such as circulation patterns are emergent properties that cannot be easily predicted from the initial model assumptions and data. These feedbacks can only be dealt with through complicated models.

    The sort of simplified models you are talking about will, if well chosen, give us the gist of what is happening. But they give biased results, underestimating temperature changes as it happens, though in principle the biases could have been in the other direction. Critics will and have queried the simplifications required to create very simple one-dimensional models. When you put in the factors whose effects people want to know, the models become complicated. It’s damned if you do and damned if you don’t. If you use the simplified models you want, critics will rightly say they are not sufficiently realistic for us to use as a basis for action. If you make the models more realistic then people will say they are too complicated to understand. In this case you cannot create a simple model that explains what is going on and gives useful results. Would you really try to predict the response of a living cell to, say, a short-chain peptide that you have introduced, from a simple mathematical model? Fortunately the climate system is easier to model than that, but the point is that sometimes back of the envelope calculations are of little use.

    As for the climate model predictions not improving, if the point estimate of CO2 sensitivity of around 3ºC is close to correct then we expect that the point estimates will not change much with model improvements. Some of the range of possible values comes from uncertainties about cloud cover which have not improved much. Some, I believe, comes from nonlinearities in the models which make it hard to rule out extreme values of the sensitivity purely on the basis of the models. But to the extent that different estimates of sensitivity are independent, the fact that their point estimates and ranges are similar gives us some confidence that the true value of the sensitivity is close to the point estimates.

    Comment by Lloyd Flack — 28 Sep 2008 @ 4:48 PM

  543. Ray, nowhere in any of my posts here did I say that climate change was not a serious threat. Where did you find that? I guess that you assumed that if I criticized climate scientists as being disconnected from the real world, that meant that I thought everything they did was crap. Far from it! Again, you assign intentions and opinions without quoting!

    I also never claimed that I had “special insight” into climate science. I do have an opinion, but I don’t claim that it is special. I just observe and comment based on my own experience and knowledge. Don’t all commenters here do the same?

    One other thing I did not say was that the reality of climate change is contingent on models. But for once you’ve read my mind correctly. And I don’t see how one could dispute that. All the predictions of climate change for the future are indeed the result of modeling. So all the debate on policy is indeed dependent on the models’ validity. I don’t see that as controversial or even provocative. Indeed, Spencer Weart tells us that only models can give us answers. That there has been warming in the past is one thing. But we are not worried about past warming, only future warming. In fact, probably the reason why there are so many skeptics is that past warming has been far less than catastrophic, and in fact has probably been more beneficial than detrimental. When we are not faced with a concrete catastrophe, it can be difficult to imagine that one is coming. So it is only natural, and I think also quite sane, to question those predictions.

    Now as a last comment, I will ask how it is that I cannot comment on climate science, yet you can dismiss the work of all STS’ers, who also have Ph.D.s and also spend a lifetime getting to the top of their game. Do you have a special insight yourself? If you feel you can challenge their conclusions, then the game is open to everyone.

    Comment by Francois Ouellette — 28 Sep 2008 @ 4:57 PM

  544. Francois Ouellette (543) wrote “But we are not worried about past warming, only future warming. In fact, probably the reason why there are so many skeptics is that past warming has been far less than catastrophic, and in fact has probably been more beneficial than detrimental.”

    Past warming has led to world-wide glacier retreat and now Greenland melt. I’m concerned and you should be too.

    Past warming has certainly already become harmful in Bolivia (glacier retreat) and in various Pacific islands (sea level rise). Other examples abound; I dispute that “more beneficial than detrimental” is in any sense factual.

    Comment by David B. Benson — 28 Sep 2008 @ 6:41 PM

  545. Note, Francois, that I explicitly said that not all STS studies were garbage. I even have friends in the field. I do have a problem with some of the crap that has been produced by sociologists and anthropologists who know bupkis about the subject matter the scientists study and who do not even share a common language with them. I do not confine this criticism to studies of climate science. There was a real stinker of a study of particle physicists about a decade ago.
    You claim that current warming is not a concern–and yet current warming is already threatening the polar sea ice, glaciers and permafrost (with all its methane and CO2).
    As to your opinion, you are entitled to it. Whether it is worthy of respect depends on whether it is informed. You’ve given no indication that it is. I take your failure to answer my query about whether you know any climate scientists as a negative. So you are apparently basing your opinion on sociological studies by authors who don’t really understand climate science or climate scientists either. Now given that climate science has been gone over with a fine-toothed comb by physicists, chemists, meteorologists–hell even petroleum geologists–and not one professional scientific society or National Academy is in dissent, I don’t think your criticism is all that well founded. And of course, you’ve had nothing substantive to say about the science at all.

    Comment by Ray Ladbury — 28 Sep 2008 @ 7:17 PM

  546. Francois Ouellette writes:

    Why would I have such limited experience of science? I’ve been on grant committees that covered the entire Information Technology field in Canada, I’ve reviewed hundreds of proposals. I’ve done due diligence for many technology-based startups, I’ve worked in biotechnology, and I’ve published a few dozen papers and reviewed many hundreds.

    Well, it’s kind of a clue that you don’t appear to understand the difference between science and technology.

    Comment by Barton Paul Levenson — 29 Sep 2008 @ 4:26 AM

  547. Francois Ouellette writes:

    But those two latter fields do not make predictions on which important public policies are likely to be based. In fact the last time evolutionary biologists did that, it was quite a fiasco!

    Great, he’s a creationist, too. Why am I not surprised?

    Comment by Barton Paul Levenson — 29 Sep 2008 @ 4:30 AM

  548. Francois Ouellette writes:

    All the predictions of climate change for the future are indeed the result of modeling.

    When Svante Arrhenius published the first estimate of global warming under doubled carbon dioxide in 1896, he did not use a computer model. Nor did G.S. Challenger in 1938.

    Comment by Barton Paul Levenson — 29 Sep 2008 @ 4:34 AM

  549. Francois, I’m curious: what situation are you referring to with “…last time evolutionary biologists did that, it was quite a fiasco…”?

    Comment by Rod B — 29 Sep 2008 @ 8:40 AM

  550. [edit]

    Ray Ladbury, there is an election here in Canada, as there is in the USA. I do not personally know any of the candidates. That doesn’t prevent me from making a choice between them. Does one really need to have climate scientists as friends to understand how they work? I don’t think they work very differently from most academic scientists, and I have plenty of experience with them. I mean, can we advance the debate or will it endlessly come back to my lack of qualifications? Is this a job interview or what? Why don’t you detail how qualified you are yourself? I’m still waiting.

    Your argument that current warming is a concern does not hold. The Arctic melting today is not killing anyone. That it may be a sign of future dangerous melting if the trend continues is a concern. If one made a list of natural or man-made disasters that are killing people TODAY, climate change would be near the bottom. Wars, disease, poverty, etc. are much more harmful today.

    But you infer from this that I do not think that future climate change is a concern, something I did not say. All I said is that it is difficult for many people to be concerned about something that does not affect them TODAY, and I’m not necessarily talking about myself. But the same can be said of many other disasters. How can one connect to the problem of malaria in Africa when one has never seen anyone sick with it?

    Actually, I think the hysterical proponents of AGW, who claim that harm is done today, do more harm than good. By constantly claiming there is immediate danger, and using bad examples like Katrina, or glaciers (who has ever seen a glacier anyway?), they inevitably make fools of themselves, and lose a lot of support from people who just think they are exaggerating. By the same token, the IPCC, by trying to claim exaggerated levels of certainty, just makes many suspicious of such claims. The acrimony with which any criticism is taken doesn’t help either. See how I’m treated here!

    I don’t see what’s wrong in saying that future climate change, the extent of which is suggested by models, is a cause for concern. That being said, the politicization of the debate is also a concern. And, in my view, the fact that most of the knowledge about it comes from a purely academic setting should induce caution. It is my opinion that the culture and dynamics of the academic research institution are not designed, and not ready, to bear the responsibility of giving the final advice about important public policies. I think we need to create other institutions (and the IPCC is a very bad example) to help us through this potential problem.

    Comment by Francois Ouellette — 29 Sep 2008 @ 8:50 AM

  551. Re Barton @548, should that be G.S. Callendar?

    Comment by Jim Eager — 29 Sep 2008 @ 9:33 AM

  552. When Svante Arrhenius published the first estimate of global warming under doubled carbon dioxide in 1896, he did not use a computer model.

    In fact, one could state that he used a “back of the envelope”-style calculation …

    Comment by dhogaza — 29 Sep 2008 @ 10:05 AM

  553. Francois, if you try to discuss climate science here you’ll have to learn to put up with the hounding of the gatekeepers: Ray, Hank, Barton, David, etc. Your credentials have not earned you the right to question the consensus. The time for discussing the science is past; it is time for action. Etc.

    Francois opened by suggesting that it is reasonable to ask that these models be explained IN DETAIL, that THAT should be part of the mandate of IPCC. And look at the controversy he generates from such a simple and reasonable suggestion. Why do you all feel it is not necessary to explain these models? What makes you believe that Arrhenius’s approximations are an adequate model? Did Arrhenius correctly predict that temperatures would fail to rise in the early 2000s, despite the huge increase of CO2? Did the GCMs? Explanations, please.

    Comment by Richard Sycamore — 29 Sep 2008 @ 10:31 AM

  554. Francois Ouellette writes in #543:

    In fact, probably the reason why there are so many skeptics is that past warming has been far less than catastrophic, and in fact has probably been more beneficial than detrimental.

    Surely you aren’t really as ill-informed as you pretend to be? See here and here.

    Dave

    Comment by Dave Rado — 29 Sep 2008 @ 10:44 AM

  555. Richard Sycamore, #553 writes:

    … if you try to discuss climate science here you’ll have to learn to put up with the hounding of the gatekeepers … The time for discussing the science is past; it is time for action.

    Your statement is a mite disingenuous given that you are posting on a blog devoted entirely to discussing the science. Almost none of the articles on this site discuss action; they almost all discuss science, and in considerable depth. Perhaps you should try reading the articles here before you post again?

    and Richard Sycamore also writes:

    Did Arrhenius correctly predict that temperatures would fail to rise in the early 2000s, despite the huge increase of CO2? Did the GCMs? Explanations, please.

    See here and here.

    For someone who claims to be interested in the science, you are astonishingly badly informed about it.

    Dave

    Comment by Dave Rado — 29 Sep 2008 @ 11:17 AM

  556. Francois, one needn’t have climate scientists as friends, enemies or even acquaintances. Rather, one needs to know how they work, what their motivations are and how they communicate. By lumping them all into the same bin, I would contend that you demonstrate a profound ignorance of the culture of not just climate science but of most sciences in general. The culture of particle physics is quite distinct from space science or condensed matter physics. And all of these are distinct from biology, ecology and so on. It is your contempt for “academic science” that renders your opinion suspect.
    As to your characterization of the scientific community as hysterics (or do you know of a respected scientific professional or honorific society that has come down on the opposite side of the consensus?), again, you do not seem to understand the process. The “academic science” has identified an effect and shown with reasonable confidence that it poses a threat. That is not alarmism, but rather the first step in risk mitigation: IDENTIFY THE THREAT. Only then can we evaluate and eventually remediate either the consequences or probability of occurrence.

    You say that the “academic” nature of climate science makes it suspect. OK, Francois, where else would we look for understanding? Who but those trained in climate science have actually studied and advanced understanding of the issue?

    And you have the temerity to come on here and accuse the entire “academic science” community of either fraud or incompetence and then complain: “See how I’m treated here!” Oh, PLEASE! Maybe if you started out with respect rather than contempt, you might have a friendlier reception.

    Now, here is a clue to you and your buddy Richard Sycamore. Anyone who wants to can challenge the consensus. Anyone who does so outside the forum of peer-reviewed scientific literature is wasting his or her time. That is where the consensus was forged and that is the only place where it will change. Don’t like the consensus? Great. Then publish something that advances the understanding of the subject and challenges that consensus. Otherwise you are wasting your time and that of the people who actually come on here to learn the science.

    Comment by Ray Ladbury — 29 Sep 2008 @ 11:38 AM

  557. Francois opened by suggesting that it is reasonable to ask that these models be explained IN DETAIL, that THAT should be part of the mandate of IPCC. And look at the controversy he generates from such a simple and reasonable suggestion. Why do you all feel it is not necessary to explain these models?

    So why doesn’t he just download the source to one and study it? No one’s hiding anything.

    Comment by dhogaza — 29 Sep 2008 @ 11:46 AM

  558. dhogaza,
    If the IPCC’s function as literature reviewer is legitimate and valuable – which I think it is – then why would a contribution synthesizing GCM science not be equally welcome? Sure, anyone can download the code and “study it”. I don’t see how that is likely to lead to a consensus on what the codes say. Do you? Finally, you say “no one is hiding anything” as though someone suggested that something is being hidden. Intent to hide is not required for errors to lie hidden undiscovered. I agree with Francois. I would like to see an attempt at the sort of summary that Spencer Weart suggests is impossible. Perhaps you disagree. However neither view merits the sort of treatment that Francois is getting.

    Comment by Richard Sycamore — 29 Sep 2008 @ 12:07 PM

  559. > gatekeeper

    Bogus. A lie, from a sock puppet at CA (yours?).
    A few fools believed it.

    The ‘Contributors’ who manage RC are listed. I’m not one.
    I’m a reader, with no ‘gatekeeper’ power on this or any weblog.

    I urge people to learn for themselves, check what people claim against original sources, check references.

    If that scares you away from learning, I’m doing it wrong. I’ll try to be gentler with you hereafter.

    Comment by Hank Roberts — 29 Sep 2008 @ 12:20 PM

  560. “public policies” … “the last time evolutionary biologists did”

    You don’t believe antibiotic resistance evolves? That caution is the most recent recommendation I find. Got some newer example?

    Comment by Hank Roberts — 29 Sep 2008 @ 12:44 PM

  561. http://www.climatescience.gov/Library/sap/sap3-1/final-report/default.htm
    Climate Models: An Assessment of Strengths and Limitations
    Final Report, Synthesis and Assessment Product 3.1

    See also press release (dtd 31 July 2008) and brochure from the Department of Energy.

    Climate Models: An Assessment of Strengths and Limitations. A Report by the U.S. Climate Change Science Program. [Bader, D.C., Covey, C., Gutowski, W.J., Held, I.M., Kunkel, K.E., Miller, R.L., Tokmakian, R.T., and Zhang, M.H. (Authors)]. U.S. Department of Energy, Washington, DC, USA.

    Comment by Hank Roberts — 29 Sep 2008 @ 1:45 PM

  562. Richard, #558.

    I think the problem is that this questionnaire will not change anything. Those who think AGW is right will see it as proving that; those who think AGW is wrong will see proof for their side in it.

    A better system would be to put up some of the statements pro and anti AGW and ask respondents to rate them for accuracy, irrelevancy and lunacy.

    But they never ask me. And they have the gall to say “our survey showed that nobody wanted cheap medium sliced bread, so we got rid of it!”.

    Comment by Mark — 29 Sep 2008 @ 1:49 PM

  563. Brilliant post.

    Thank you, Spencer!

    Comment by Peter Carbonetto — 29 Sep 2008 @ 4:40 PM

  564. #561
    Thank you for the reference, although I’ve read it before and it merely re-asserts what Weart asserts above. Specifically:

    “Climate sensitivity is not a model input. It emerges from explicitly resolved physics, subgrid-scale parameterizations, and numerical approximations used by the models—many of which differ from model to model—particularly those related to clouds and ocean mixing. The climate sensitivity of a model can be changed by modifying parameters that are poorly constrained by observations or theory.”

    Francois is not asking for an assessment of model strengths and limitations. He is asking for an exposition of the models’ core calculation of climate sensitivity. Spencer Weart asserts, not that this has already been done, but that this is not possible – due to the emergent nature of the calculation. So your reference very much sidesteps the question. But thank you all the same.

    Comment by Richard Sycamore — 29 Sep 2008 @ 4:47 PM

  565. #555 Those, err “sources” fail to make your argument, unfortunately.

    But I want to thank Hank #561 for the relevant reference, which clarifies the important difference between “equilibrium sensitivity” and “transient climate response”:

    “Equilibrium sensitivity is defined as the long-term near-surface temperature increase after atmospheric carbon dioxide has been doubled from preindustrial levels but thereafter held constant until the Earth reaches a new steady state, as described in the preceding paragraph. Transient climate response or TCR is defined by assuming that carbon dioxide increases by 1% per year and then recording the temperature increase at the time carbon dioxide doubles (about 70 years after the increase begins). TCR depends on how quickly the climate adjusts to forcing, as well as on equilibrium sensitivity. The climate’s adjustment time itself depends on equilibrium sensitivity and on the rate and depth to which heat is mixed into the ocean, because the depth of heat penetration tends to be greater in models with greater sensitivity (Hansen et al. 1985; Wigley and Schlesinger 1985). Accounting for ocean heat uptake complicates many attempts at estimating sensitivity from observations, as outlined below.”

    This helps answer my question of why Arrhenius’s calculation of equilibrium sensitivity might not be relevant to the transient climate response observed over a given short interval. More specifically:

    “Equilibrium sensitivity depends on the strengths of feedback processes involving water vapor, clouds, and snow or ice extents (see, e.g., Hansen et al. 1984; Roe and Baker 2007). Small changes in the strengths of feedback processes can create large changes in sensitivity, making it difficult to tightly constrain climate sensitivity by restricting the strength of each relevant feedback process. As a result, research aimed at constraining climate sensitivity—and evaluating the sensitivities generated by models—is not limited to studies of these individual feedback processes. Studies of observed climate responses on short time scales (e.g., the response to volcanic eruptions or the 11-year solar cycle) and on long time scales (e.g., the climate of last glacial maximum 20,000 years ago) also play central roles in the continuing effort to constrain sensitivity. The quantitative value of each of these observational constraints is limited by the quality and length of relevant observational records, as well as the necessity in several cases to simultaneously restrict ocean heat uptake and equilibrium sensitivity. Equilibrium warming is directly relevant when considering paleoclimates, where observations represent periods that are very long compared to the climate’s adjustment time. The transient climate response is more directly relevant to the attribution of recent warming and projections for the next century.”
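    To see the distinction concretely, here is a toy one-box version of it (Python; the sensitivity and heat-capacity numbers are illustrative guesses, not taken from any GCM):

      import math

      lam = 0.8    # equilibrium sensitivity parameter, K per (W/m^2), roughly 3 K per doubling; illustrative
      C = 50.0     # effective ocean heat capacity, W yr m^-2 K^-1; illustrative, sets the adjustment time
      T = 0.0

      # CO2 rising 1% per year; one-box energy balance: C dT/dt = F(t) - T/lam
      for year in range(1, 71):                # doubling takes about 70 years at 1%/yr
          F = 5.35 * math.log(1.01 ** year)    # forcing from the CO2 added so far
          T += (F - T / lam) / C               # one-year Euler step

      print(round(T, 1))                           # transient response at the time of doubling: ~1.6 K here
      print(round(5.35 * math.log(2.0) * lam, 1))  # equilibrium warming if CO2 is then held fixed: ~3.0 K

    Even in this cartoon, the temperature at the moment of doubling sits well below the eventual equilibrium value, which is the whole point of distinguishing the transient climate response from equilibrium sensitivity.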

    The GCMs are loaded with assumptions that are, as the quotes above indicate, worthy of discussion and study.

    As for the idea that “outsiders” do not have the necessary skill to overturn any of the assumptions amongst insiders, I think that conventional wisdom has been shown to be false in some exceptional cases. But I’ll spare you the examples that readily come to mind.

    Comment by Richard Sycamore — 29 Sep 2008 @ 5:08 PM

  566. Francois Ouellette — See my comment #544. Other harms include those to ski resorts, likely to Nepalese peasants, and probably to eastern Europe. The Tibetan glaciers are doomed, which will result in harm to the Chinese.

    While no individual hurricane can be directly attributed to current global warming, it seems that increased average intensity can.

    Hadley Centre has written that global warming will result in more extreme precipitation events. Chile had a drought last summer and two serious floods last winter, rather rare in the historical record.

    Comment by David B. Benson — 29 Sep 2008 @ 5:08 PM

  567. “Ray, i don’t see how I was agressive. Provocative, maybe.”

    Translation: “What? Little ol’ me behaving like a troll [flutters eyelashes]?”

    Comment by Jonick Radge — 29 Sep 2008 @ 5:58 PM

  568. All the climate models that we create are wrong, both the simple back-of-an-envelope ones and the complicated GCMs. The question is, which ones are useful and for what purposes?

    The GCMs are designed to help give a detailed understanding of the climate and to allow useful long term predictions to be made. They are not needed to show that AGW is happening. The pattern of the trends is sufficient for that. They are needed to help put numbers to the CO2 sensitivity. They are not the only source of such figures. For example, we can make independent estimates from the current climate temperature record and from the Last Glacial Maximum.
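    As a rough illustration of the LGM route (Python; the cooling and forcing figures are round numbers of the sort commonly quoted, not a careful reconstruction):

      import math

      dT_lgm = -5.0    # global cooling at the Last Glacial Maximum, K (round figure)
      dF_lgm = -7.0    # forcing from ice sheets, lower greenhouse gases and dust, W/m^2 (round figure)

      lam = dT_lgm / dF_lgm                        # implied sensitivity, K per (W/m^2)
      print(round(lam * 5.35 * math.log(2.0), 1))  # ~2.6 K per CO2 doubling

    The answer lands in the same neighbourhood as the model-based estimates, which is why the agreement between independent lines of evidence matters more than any single one of them.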

    Unfortunately these models are very complicated and it is hard for someone not highly familiar with the subject matter to check them and see whether they are doing something plausible. They look like black boxes even though they are not.

    Simpler models can be created which show the general outline of what is going on. These are much easier to understand and check. They can include reasonable approximations of the effects of some of the main drivers of trends. But unfortunately, not of all important drivers. There will be emergent phenomena that will just not show up in simple models, for example circulation effects. We can put in our estimates of the average values of such effects but critics will quite reasonably question these estimates. No one has created a simple model that cannot be criticized as leaving out something important or including a challengable estimate of some parameter. And they are biased.

    For reliable predictions of temperature sensitivity we have to use complicated models. We have a complicated system.

    And I agree with Francois that attributing harm happening now to AGW is a bit of a stretch. Maybe some of the droughts and floods are a result of AGW, but I have not seen evidence that is more than strongly indicative of this. It is not a good idea to attribute current weather disasters to AGW. Most of them won’t be a result of AGW, and this can lead to a boy-who-cried-wolf situation. We have serious problems heading towards us and have to act now, because we are dealing with systems that have a large inertia and that are not completely reversible. We have to act before the problems are serious, and we have to get the message across that waiting for obvious problems is a bad idea. We cannot afford to wait till it is obvious even to those looking for reasons to believe it is not happening.

    Comment by Lloyd Flack — 29 Sep 2008 @ 8:32 PM

  569. Dave (554), what do you make of the quote from one of your linked references, “…planktonic foraminifera diversified, and dinoflagellates bloomed. Success was also enjoyed by the mammals, who radiated profusely around this time” in the context of this discussion?

    Comment by Rod B — 29 Sep 2008 @ 9:18 PM

  570. F.O. writes:

    All I said is that it is difficult for many people to be concerned about something that does not affect them TODAY, and I’m not necessarily talking about myself.

    Tell it to the Australians.

    Agricultural lands worldwide used to be 20% in drought at any given time during the ’50s and ’60s. That figure is now 30%. People die from reduced food production. That’s happening NOW.

    Comment by Barton Paul Levenson — 30 Sep 2008 @ 4:07 AM

  571. Jim –

    Yes, I got the name wrong! It is G.S. Callendar. Sorry about that.

    Comment by Barton Paul Levenson — 30 Sep 2008 @ 4:08 AM

  572. Richard Sycamore writes:

    Francois opened by suggesting that it is reasonable to ask that these models be explained IN DETAIL

    You mean you want the math and the source code? Well, I’d start with some books on atmospheric radiation like J.T. Houghton’s “The Physics of Atmospheres” (3rd ed. 2002), Grant W. Petty’s “A First Course in Atmospheric Radiation” (2nd ed. 2006), and Goody and Yung’s “Atmospheric Radiation” (2nd ed. 1989). Then add Henderson and Robinson’s “A Climate Modeling Primer” (1986). And look up and read some classic papers on the subject, like Manabe and Strickler 1964, where they introduce the first modern radiative-convective model, and Manabe and Wetherald 1967, where they substantially improved the first model by adding changes such as assuming fixed relative humidity with altitude rather than fixed absolute humidity.

    Oh, and, of course, do all the problems in the books mentioned above.

    Some books on climatology would also be useful, as you need more than atmospheric radiation to write a GCM. Henderson-Sellers and Robinson’s “Contemporary Climatology” (1987) is a good place to start, and don’t neglect Sellers’s classic “Physical Climatology” (1965), though you’ll have to translate the English units into metric. Hartmann’s 1994 “Global Physical Climatology” is also a good one. And our own Raymond Pierrehumbert (“Raypierre,” il dit) has a pretty good climatology primer online:

    http://geosci.uchicago.edu/~rtp1/ClimateBook/ClimateVol1.pdf

    You’ll also need to know a procedural computer language such as Fortran, and C/C++ and Pascal/Delphi can also be used. Stay away from interpreted languages and GUI languages, as they concentrate mostly on providing a pretty interface, whereas for a simulation you want speed, speed, and more speed. Ray uses Python, which I personally wouldn’t touch with a ten-foot pole, but if you’re not doing an actual climate simulation it works well enough.

    I’m writing a book on how to write RCMs, but it probably won’t be finished for another year or so.
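    Just to show how small the zeroth step is, here is the classic bare-planet-plus-one-slab calculation (a sketch, not a simulation, so even Python will do; the slab emissivity is tuned by hand, and it leaves out everything a real RCM must handle: spectral bands, convection, water vapour):

      S = 1366.0       # solar constant, W/m^2 (round figure)
      albedo = 0.3     # planetary albedo (round figure)
      sigma = 5.67e-8  # Stefan-Boltzmann constant

      absorbed = S * (1.0 - albedo) / 4.0      # ~239 W/m^2 averaged over the sphere
      T_eff = (absorbed / sigma) ** 0.25       # ~255 K, the bare-planet emission temperature

      eps = 0.78                                    # longwave emissivity of a single slab atmosphere; tuned, illustrative
      T_surf = T_eff * (2.0 / (2.0 - eps)) ** 0.25  # ~288 K once the slab radiates half its emission back down

      print(round(T_eff), round(T_surf))

    Everything after that, resolving the spectrum, letting convection set the lapse rate, treating humidity properly, is what the books above are for.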

    Comment by Barton Paul Levenson — 30 Sep 2008 @ 4:19 AM

  573. Ray Ladbury,

    This is getting tiresome. All you can repeat is that I have “profound ignorance of the culture of not just climate science but of most sciences in general”, despite having demonstrated that I worked for a number of years in academia, and attained quite a high level there. Even now, 10 years after I left and stopped publishing, I am still asked about once a month to review a paper. Some people somewhere must still believe that I have some understanding of science. I’m still waiting for a list of your credentials. Does being an engineer working on damage to electronic components give you such a special insight into the scientific process, and the culture of biologists?

    And you make further interpretations of what I said. Where did I say that there was “fraud and incompetence”? I don’t see either of those words in any of the comments I made. I said that the academic system of peer-reviewed publications had flaws and virtues, something that is not very controversial, and not even provocative. Why would there be entire conferences that specifically deal with peer review if the process was perfect? So the question is: are the flaws in the process a potential problem when it comes to supplying scientific opinion on matters of public policy? We all know about the “consensus” mantra. But if the process that leads to a consensus is flawed, then the consensus is not worth anything. So we must pay attention to this too. There was probably a consensus amongst NASA engineers that the Space Shuttle was safe, and then it exploded, and Richard Feynman, who was totally independent and not an expert in aeronautics, discovered that there were many flaws in how that consensus was forged. The “culture” was wrong. So it is an issue. Jumping up and down and shouting “consensus! consensus!” does not help. There are many examples where the consensus was wrong (as there are many when it was right).

    Do I have a solution? I wish I had one! I have proposed elsewhere that our governments should have set up an independent lab (independent from the academic publication system), hired the best minds in the field, implemented a vast program of detailed observations on climate, and put them to work. Ideally, you would even want a second lab, independently of the first, and see if they come to the same conclusion. So a sort of Manhattan project for climate. Would that work? Hell, I have no idea. But seeing that our range of estimates for CO2 sensitivity does not seem to narrow year over year, maybe it’s time to try something new.

    You know, as part of my little experience that you despise so much, I was once part of a major corporation developing fiber optic networks. The founder had hired the “best minds in the field”, literally, and given them a big research budget and all the equipment they needed. Dare I say that I was among them? I only say it because you always question my credentials, so I feel the need to pile on. Anyway, after only a few months, we had results that far surpassed what you could find in “academic” papers. We could look at all those post-deadline papers and laugh at them. So that was the result of removing the constraints of academic research, whether it be budget constraints, the need to publish, the teaching requirement, the peer pressure. It was also achieved by focusing on a well-defined problem. And no one had time to run a blog… Now before you accuse me again of profound ignorance, let me just say that I do not pretend that it would work in this case. But why not think about it? Why not give it a try?

    Comment by Francois Ouellette — 30 Sep 2008 @ 8:25 AM

  574. > 554, 569, past warming good
    Rod, you’ve let slip that you think and read and aren’t dumb.
    If past warming meant the last 100 years, damage is in the pipeline.
    If past warming meant the PETM and other extremes, examples aplenty.
    Diverting the thread into confused blather? Priceless.

    Comment by Hank Roberts — 30 Sep 2008 @ 10:16 AM

  575. #572 And can you be a little more specific about the derivation of 450 ppm CO2 as the alleged tipping point toward runaway feedback? We all know the source and the date of this estimate. The question is the arithmetic behind the derivation. The answer is not contained in your 1980s textbooks.

    Comment by Richard Sycamore — 30 Sep 2008 @ 10:19 AM

  576. Lloyd Flack (568) — Sea level rise is due to AGW. It is already adversely affecting some Pacific islanders. Glacier melt is due to AGW. It is already adversely affecting peasants in Bolivia.

    Comment by David B. Benson — 30 Sep 2008 @ 1:39 PM

  577. Re #575: Could you be specific about what recent credible scientific source states that 450 ppm is a tipping point toward runaway feedback?

    450 ppm is certainly the value that many scientists have chosen to delineate “dangerous anthropogenic interference” (DAI). Which does involve some amount of judgment call. And also acknowledgment of uncertainty: after all, depending on feedback, 450 ppm can lead to 1 degree C to 4 degrees C warming or more (see Ramanathan and Feng, PNAS, 2008 for an analysis of committed warming after we’ve stripped away aerosol cooling). And there isn’t yet a consensus estimate of how much damage a given warming causes, though again, 2 degrees C has been chosen by many scientists and politicians as a value to delineate DAI. I think what is widely agreed on is that damage increases with change in temperature from preindustrial above some minimal value (say 1 degree), and that it is likely to be a non-linear relationship, with the damage curve getting steeper the more you change the system from preindustrial.
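    The arithmetic behind that spread is short (Python; the forcing formula is the usual approximation, the sensitivity range is the broad paleoclimate one quoted earlier in this thread, and this is equilibrium warming, ignoring aerosols and ocean lag):

      import math

      dF = 5.35 * math.log(450.0 / 280.0)     # ~2.5 W/m^2 of forcing relative to preindustrial
      for ecs in (1.5, 3.0, 6.0):             # assumed equilibrium sensitivity, K per doubling
          lam = ecs / (5.35 * math.log(2.0))  # converted to K per (W/m^2)
          print(ecs, round(lam * dF, 1))      # roughly 1.0, 2.1 and 4.1 K respectively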

    But that is very different from “runaway feedback”. Which has been out of vogue among climate scientists for longer than I’ve been in the field (about a decade).

    Comment by Marcus — 30 Sep 2008 @ 1:51 PM

  578. #572

    Whereas Spencer Weart suggests that an engineering-quality exposition of the climate sensitivity derivation is not possible, you seem to assert that it is not only possible, but that it has already been done; it’s just distributed amongst a number of textbooks. Which is it? Supposing you are correct, do you think the IPCC should be charged with the synthesis of this material? Or should each of us attempt his own synthesis?

    [Response: You are misreading the entire post, the issue in which is whether 'simple (quantitative) answers' can be given to simple questions about complex systems. They can't and so complexities are built into databases (HITRAN), radiative transfer models and GCMs. If that's what you want, but you are not prepared to get stuck in to the details, you are perpetually going to be dissatisfied. - gavin]

    Comment by Richard Sycamore — 30 Sep 2008 @ 1:55 PM

  579. Seems to me that Barton Paul Levenson and Francois O agree – that a synthesis of the climate sensitivity derivation ought to be possible … despite the significant challenges outlined by Dr. Weart.

    Comment by Richard Sycamore — 30 Sep 2008 @ 2:00 PM

  580. > alleged tipping point toward runaway feedback?

    Straw man.

    Comment by Hank Roberts — 30 Sep 2008 @ 2:04 PM

  581. http://scholar.google.com/scholar?sourceid=Mozilla-search&q=%2Bclimate+%2B%22tipping+point%22+%2B%22runaway+feedback%22

    Comment by Hank Roberts — 30 Sep 2008 @ 3:51 PM

  582. #576 David Benson,

    Could you give me a reference to the Bolivian glacial melt? Mountain glacial retreat and its effects on freshwater supplies are among the first unequivocal adverse effects of AGW that I would expect.

    Sea level rise is going to be blamed often when the true problem is local subsidence. We have to disentangle these effects. Which Pacific islands are affected, how much is the claimed local sea level rise, and what is the estimated rise in that part of the Pacific as a result of AGW? What are the IPCC estimates of sea level rise in that region?

    #573 Francois Ouellette,

    I wonder whether the reason why the range of sensitivity estimates has not been shrinking is not so much a problem with the models as a problem with the analysis of the output of the models.

    The point estimate is near 3ºC for doubling of CO2. If the true value is near the initial point estimate then I would not expect the point estimate to change much. I think this is what has been happening.

    As I understand it, the range of the estimates for sensitivity comes from the ensemble of estimates from models with different initial assumptions. Unless you narrow the range of model assumptions you are likely not to narrow the range of the predictions. For reasons of computational tractability the models are built on a coarser spatial scale than what would be required to give good estimates of the feedback from clouds. Nonlinearities in the models can lead to large changes in the output for small changes in the assumptions. In other words, as I see it, there are two ways to narrow the range of the model output. One is to have a lot more computing power than we have available now so we can model phenomena on a finer scale. The other is to narrow the range of the initial assumptions for the models.

    However there is another way to narrow the range of the sensitivity estimates. This is to combine the results from different analyses in a meta-analysis. Annan and Hargreaves have done this for three sensitivity estimates and come up with a narrower confidence interval than the initial estimates. Further extension of this should allow more precise estimates to be made of CO2 sensitivity. To do this we need people who are both familiar with the models and their properties and with the required statistical methods, not a common combination. To combine different analyses we need to know the degree of dependency among them, else we could get an over optimistic narrowing of the confidence interval by combining the data sources.
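    As a cartoon of why combining helps (Python; it treats each estimate as an independent Gaussian, which real sensitivity distributions are not, since they are skewed with fat upper tails, and the numbers here are made up rather than Annan and Hargreaves’ actual figures):

      # inverse-variance weighting of independent Gaussian estimates
      estimates = [(3.0, 1.5), (2.6, 1.2), (3.2, 1.0)]  # (mean K, standard deviation K), made up

      weights = [1.0 / sd ** 2 for _, sd in estimates]
      mean = sum(w * m for (m, _), w in zip(estimates, weights)) / sum(weights)
      sd = (1.0 / sum(weights)) ** 0.5

      print(round(mean, 2), round(sd, 2))  # ~2.96 +/- 0.68, tighter than any single estimate

    The catch is exactly the dependency problem just mentioned: if the estimates are not really independent, the combined interval comes out too narrow.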

    In summary, I think, to narrow the range of estimates of CO2 sensitivity, the best ways are to either improve the information fed into the models or to improve the statistical analysis of the output. Improving the models themselves will probably have less benefit. We probably could get better estimates from the available model output by doing improved analyses.

    Comment by Lloyd Flack — 30 Sep 2008 @ 5:14 PM

  583. Richard, #579.

    Question: it is possible. Is it worth it?

    Comment by Mark — 30 Sep 2008 @ 5:48 PM

  584. Straw man?!

    http://www.greenpeace.org.uk/files/pdfs/climate/hansen.pdf

    “The single most pertinent number emerging from Cenozoic climate studies is the level of atmospheric CO2 at which ice sheets began to form as the planet cooled during the past 50 million years. Our research suggests that this tipping point was at about 450 ppm of CO2 (http://arxiv.org/abs/0804.1126 and http://arxiv.org/abs/0804.1135).”

    “If humanity is so foolish as to burn all fossil fuels, thus more than doubling atmospheric CO2 from its pre-industrial level of 280 ppm, we will have set the planet on an inexorable course to an ice-free state, with all the disasters that such a course must hold for man and beast.”

    According to my inexpert read of Hansen, the only thing preventing a GHG-driven runaway at the PETM was the smashing of India into Asia:

    “The imbalance of carbon sources and sinks (thus the change of atmospheric CO2) depends upon plate tectonics (continental drift), because it is the rate of subduction of carbonate-rich ocean crust beneath moving continental plates that determines the rate of volcanic emission of CO2. Also the rate of weathering (the primary long-term sink of surface carbon) is a function of the rate at which fresh rock is exposed by mountain building associated with plate tectonics.

    Specifically, during the period 60 My BP (60 million years before present) to 50 My BP India was plowing north rapidly (20 cm per year) through the Tethys Ocean and in the process subducting carbonate-rich ocean crust, causing atmospheric CO2 to increase. Global temperature peaked 50 My ago when India crashed into Asia. Available proxy measures of CO2 indicate that atmospheric CO2 reached 1000-2000 ppm at that time. The Earth was at least 12°C warmer than today, there were no ice sheets on the planet, and sea level was about 75 meters higher.

    With the collision of India and Asia the subduction source for CO2 emissions declined, but the weathering sink increased as the Himalayas and Tibetan Plateau were pushed up. Thus the past 50 My have generally been a period of declining atmospheric CO2 and a cooling planet.”

    Re #578 A complex answer is ok. I agree it would be silly to expect something one-page-simple. I’m willing to read a 300-page appendix if the answer is in it.

    Comment by Richard Sycamore — 30 Sep 2008 @ 6:57 PM

  585. Rod B, #569, are you really advocating a repetition of an event that killed off 90% of animal and plant species on the basis that as a result the other 10% were able to prosper? Or are you just being argumentative for the sake of it?

    Comment by Dave Rado — 30 Sep 2008 @ 8:25 PM

  586. Dave Rado,
    Aren’t you confusing the PETM with the End-Permian mass extinction? That had an over 90% extinction rate (at least for marine life forms). The PETM, while it did increase the extinction rate, was nowhere near that severe.

    Comment by Lloyd Flack — 30 Sep 2008 @ 9:26 PM

  587. That’s true, Hank (574); it was unclear what “past warming” they were referring to.

    Comment by Rod B — 30 Sep 2008 @ 9:39 PM

    Dave (583), neither. I was just puzzled that a link referenced as solid support for the damage from warming contained phrases like “…mammals flourished…”

    Comment by Rod B — 30 Sep 2008 @ 9:47 PM

  589. #583
    51% of the population probably think so. But that’s just a guess.
    I suppose you don’t like the idea of Bray & von Storch’s survey either?

    Comment by Richard Sycamore — 30 Sep 2008 @ 10:51 PM

  590. #584 Richard Sycamore,
    I think you are actually referring to the Eocene Optimum rather than the Paleocene-Eocene Thermal Maximum. The PETM was an earlier brief (

    [Response: don't use a raw < symbol, use & l t ; instead (with no spaces). - gavin]

    Comment by Lloyd Flack — 1 Oct 2008 @ 4:36 AM

  591. #584 Richard Sycamore:

    Straw man?!

    The “straw man” accusation refers to your use of the expression “runaway feedback”. Either you don’t know what you’re talking about, or you’re using the wrong terminology. A “runaway feedback”, i.e., one over 100% leading to a Venus-like loss of the oceans, is not going to happen on Earth for another billion years or so.

    But Marcus in #577 explained that already.

    Comment by Martin Vermeer — 1 Oct 2008 @ 8:37 AM

  592. Lloyd Flack,

    I understand your point. I’ve read a few papers on models and all, but I will not claim here to have the most up-to-date expertise. I don’t want to engage in too technical a debate. So what follows is really just the opinion of someone who is scientifically literate, and is more about methodology than theory.

    You say: “to narrow the range of estimates of CO2 sensitivity, the best ways are to either improve the information fed into the models or to improve the statistical analysis of the output.” That makes a lot of sense, but I guess as an experimental physicist I would be inclined to go for the first choice. I would be very wary of leaning too heavily on statistics applied only to the output of models. At some point, all this has to relate to the real world. An external observer cannot but be surprised by such techniques as using ensemble averages of models as if they represented some kind of “better” estimate of the real world. Despite all the justifications, to me it remains a dangerous course.

    On the other hand, what I retain from my readings is how difficult it is to really compare the outputs of models to the actual world, in large part because of a lack of comprehensive real-world data, and also because the numbers the models give you do not have a simple correspondence with what is actually measured. But in the end, it is essential to improve the correspondence between the models and the real world; that is the essence of modelling. That’s why I believe there should be a much greater effort at collecting data, and at using those data to improve the models. There should also be more effort at figuring out better metrics to assess the performance of models. What I’ve seen is quite rudimentary. It’s not a simple problem, and maybe outside expertise is needed here.
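    To illustrate what I mean about ensemble averages, here is a toy numerical sketch (entirely synthetic numbers; the “models”, their error level and the “observations” are all invented for the example):

        import numpy as np

        rng = np.random.default_rng(0)
        truth = np.sin(np.linspace(0.0, 6.0, 100))              # stand-in for an observed field
        models = truth + rng.normal(0.0, 0.5, size=(10, 100))   # ten "models" with independent errors

        def rmse(x):
            return float(np.sqrt(np.mean((x - truth) ** 2)))

        print([round(rmse(m), 2) for m in models])   # individual errors, around 0.5
        print(round(rmse(models.mean(axis=0)), 2))   # ensemble-mean error, around 0.16
        # The average beats every member here only because the errors were drawn
        # independently; real models share structural biases, which is exactly why
        # I find the "ensemble mean as best estimate" argument dangerous.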

    Also, when I talked in my earlier post about a “focused problem”, it does seem to me that the real sticking point is getting a more precise estimate of the feedback parameter, in particular the water vapor and cloud feedbacks. But we can’t just rely on models; we have to measure them! It seems to me that a “focused effort” here, to try to measure those numbers empirically, would be appropriate. Imagination is required to devise new techniques.

    I think what is impeding this mostly empirical and experimental program is purely a budgetary constraint. It costs much more to send new satellites or set up some global monitoring system than to pay modellers and buy more powerful computers. And also models can churn out results much more rapidly, and appear to be more efficient. But that is pervasive in all sciences. As an experimental physicist, I know that my colleagues who were fiddling with purely theoretical problems, and solving them numerically, could publish many more papers. Setting up and debugging an experiment takes a lot more time than debugging software, and costs a lot more. But in the end nothing beats an experimental demonstration, if only because experiments always surprise you and more often than not make you revise your initial assumptions.

    Comment by Francois Ouellette — 1 Oct 2008 @ 8:51 AM

  593. #584: Richard Sycamore, the 450 ppm threshold that Hansen is referring to should be considered a “tipping point” or a “threshold” more than a “runaway feedback”. It is a tipping point which also contains a positive feedback. So, hypothetically, there may exist an anthropogenic GHG forcing at which the Greenland ice sheet (GIS) is mostly stable, but where an additional 0.1 W/m2 of GHG forcing would lead eventually to the total disintegration of the GIS, and this disintegration of the GIS would contribute additional albedo forcing. So, hypothetically, there might exist a CO2 concentration (say 450 ppm) where temperature increase is one value (say 2 degrees). But at 455 ppm, due to crossing the GIS tipping point the melting would lead to albedo changes leading to a significantly larger global temperature increase (say 4 degrees). This isn’t actually a “runaway” feedback, despite occasional terminology confusion among one or two posters on this site (Lynn comes to mind).
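    To make the distinction concrete, here is a toy sketch (every number below is invented for the illustration; none of this comes from a model):

        def equilibrium_warming(co2_ppm, threshold_ppm=450.0):
            """Hypothetical warming (deg C): a smooth response plus a fixed
            jump once an assumed ice-sheet threshold is crossed."""
            smooth = 2.0 * (co2_ppm / 450.0)
            albedo_jump = 2.0 if co2_ppm > threshold_ppm else 0.0
            return smooth + albedo_jump

        print(equilibrium_warming(450.0))   # 2.0
        print(equilibrium_warming(455.0))   # about 4.0: a finite step up, not a runaway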

    That there should exist such a threshold for the GIS is, I believe, fairly well accepted: the GIS is only stable because the height of the ice combined with the lapse rate of the atmosphere ensures that the top of the sheet is cool enough not to melt, whereas if the GIS did not exist, the temperature at ground level would _not_ be low enough for an ice sheet to re-form (see also: hysteresis). I don’t personally know what the change in albedo feedback would be, and therefore what temperature change would result; however we do know that the disappearance of the GIS would directly lead to 7m of sea level rise, which is I think what most people worry about.
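    A back-of-envelope version of that lapse-rate argument, using assumed round numbers (the exact summit height and lapse rate don’t matter for the point):

        lapse_rate_c_per_km = 6.5   # typical tropospheric lapse rate
        summit_height_km = 3.0      # roughly the elevation of the top of the GIS
        print(lapse_rate_c_per_km * summit_height_km)
        # about 20 deg C: the existing ice surface sits far colder than bare ground
        # at sea level would, which is where the hysteresis comes from.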

    The exact value of the GIS threshold is not as well accepted. Hansen proposes 450 ppm based on historical evidence. Given that conditions today are not exactly equal to historical conditions (solar influence, oceanic currents, other GHGs, atmospheric patterns, etc) historical analogues can only be inexact proxies. The threshold could be more than 450 ppm (perhaps much more) though it could also be less. I think that our understanding of ice sheet dynamics is, unfortunately, not yet at the point where we can make a good “bottom-up” estimate of the critical temperature threshold for ice sheet stability.

    Also, the timing of this event is important: it is likely to be quite slow (see the Pfeffer work), and therefore we might have a window, after seeing evidence that we have crossed the threshold, in which to reduce forcing quickly and cross back before it is too late (possibly using geoengineering).

    Of course, geoengineering is likely to produce its own drawbacks. As would radical restructuring of the economy in a short time period. Therefore, judicious reduction of GHG emissions now would seem to be a preferred option to waiting until we are forced to make precipitous reductions later.

    Comment by Marcus — 1 Oct 2008 @ 9:37 AM

  594. > what is impeding this mostly empirical and experimental
    > program is purely a budgetary constraint.

    The problem was Triana did get built but did not get launched. It’s sitting in a warehouse.

    Put an instrument in the right place and it will be able to measure, all the time, both sunlight reaching Earth and heat leaving Earth from the same point of view. No need for multiple satellite instruments in various orbits over various periods of time: one measurement of the whole sunlit face of the planet, constantly.

    The instrument would have provided the information needed to resolve open questions that we have instead been trying to answer by piecing together fragmentary instrument records.

    It wasn’t a budget problem that kept the satellite on the ground.
    Look it up.

    Comment by Hank Roberts — 1 Oct 2008 @ 10:25 AM

    Lloyd Flack (582) — The Bolivian glacier problem was in a DailyScience news story; I didn’t keep a link. The SLR in the Pacific is real, primarily due to warmer water.

    There are similar glacier problems already in Nepal, if I remember correctly.

    Comment by David B. Benson — 1 Oct 2008 @ 1:59 PM

  596. http://scholar.google.com/scholar?hl=en&lr=&safe=off&scoring=r&q=CHACALTAYA+glacier&as_ylo=2007

    Comment by Hank Roberts — 1 Oct 2008 @ 6:52 PM

  597. Could someone clarify for me the Hansen statement on the cooling effect of continental collision? Is he suggesting that runaway warming was prevented by this happy accident? And if so, does that not imply that that is what is necessarily in store for us beyond the 450ppm tipping point?

    The more substantive the reply, the better.

    Comment by Richard Sycamore — 2 Oct 2008 @ 9:05 AM

  598. The more new rock exposed, the more CO2 gets weathered out of the air and downstream to the sea.
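    Schematically, the textbook (Urey) net reaction doing the work, over geologic time:

        CaSiO3 + CO2 -> CaCO3 + SiO2

    One CO2 molecule ends up locked away as marine carbonate for each unit of calcium silicate exposed and weathered, so faster uplift means more fresh rock and a stronger sink.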

    Comment by Barton Paul Levenson — 3 Oct 2008 @ 3:46 AM

  599. Weathering those silicate rocks manually, or anyway with hand-made machinery, turns out to be a relatively low-energy-cost way of sequestering CO2, as previously discussed.

    Comment by G.R.L. Cowan, H2 energy fan ’til ~1996 — 3 Oct 2008 @ 9:15 AM

  600. Richard Sycamore #597,

    what Barton said. And this is your starting link:

    http://maureenraymo.com/uplift_overview.php

    …and no, no “runaway”. Just “tipping point”.

    Comment by Martin Vermeer — 3 Oct 2008 @ 9:30 AM

  601. > the cooling effect of continental collision? Is he suggesting
    > that runaway warming was prevented by this happy accident? And
    > if so, does that not imply that that is what is necessarily in store

    Too much confusion there to parse.
    That was _not_ a geoengineering proposal by Hansen.
    It was an example of one form of biogeochemical cycling, q.v.

    A few minutes’ searching turned up a few hundred sources.
    Here are two as examples.
    As always read the footnotes, and after time passes look for citation by subsequent authors.

    http://www.nature.com/nature/journal/v445/n7128/full/nature05516.html#B1

    http://www.pnas.org/content/early/2008/09/22/0805382105.abstract
    Published online before print September 22, 2008,
    doi: 10.1073/pnas.0805382105
    Abstract

    India’s northward flight and collision with Asia was a major driver of global tectonics in the Cenozoic and, we argue, of atmospheric CO2 concentration (pCO2) and thus global climate. Subduction of Tethyan oceanic crust with a carpet of carbonate-rich pelagic sediments deposited during transit beneath the high-productivity equatorial belt resulted in a component flux of CO2 delivery to the atmosphere capable to maintain high pCO2 levels and warm climate conditions until the decarbonation factory shut down with the collision of Greater India with Asia at the Early Eocene climatic optimum at ≈50 Ma. At about this time, the India continent and the highly weatherable Deccan Traps drifted into the equatorial humid belt where uptake of CO2 by efficient silicate weathering further perturbed the delicate equilibrium between CO2 input to and removal from the atmosphere toward progressively lower pCO2 levels, thus marking the onset of a cooling trend over the Middle and Late Eocene that some suggest triggered the rapid expansion of Antarctic ice sheets at around the Eocene-Oligocene boundary.

    Comment by Hank Roberts — 3 Oct 2008 @ 9:52 AM

  602. #600
    You’re begging the question. I was asking if temperature was running away at the time of that collision. I understand the putative “decarbonation” cooling mechanism. That is not the question. You assert that temperature was not running away. What’s the proof?

    #601
    As above. Hundreds of sources are not required to answer one simple question.

    Comment by Richard Sycamore — 4 Oct 2008 @ 7:20 AM

  603. > temperature was not running away. What’s the proof?

    You’re here, typing. That’s the proof. Strong anthropic climatology.

    We have seen excursions, within a livable range (water in all three states, vapor, ice, and liquid, has been present at all times).

    Comment by Hank Roberts — 4 Oct 2008 @ 3:12 PM

    Liquid water (the “Goldilocks Question”). Here is one of many sources:

    Ch. 13 (Kasting): Runaway greenhouses and runaway glaciations: how stable is Earth’s climate?

    http://books.google.com/books?hl=en&lr=&id=peOHcKtQxZAC&oi=fnd&pg=PA349&dq=did+runaway+climate+change+happen%3F&ots=AOSEU6uRBX&sig=TX7_x24sRKtABMgguwTDScRfLFk

    That’s a link to the picture of the first page of Ch. 13 from:

    Frontiers of Climate Modeling
    By Jeffrey T. Kiehl, Veerabhadran Ramanathan
    Cambridge University Press, 2006
    ISBN 0521791324, 9780521791328

    Comment by Hank Roberts — 4 Oct 2008 @ 3:28 PM

  605. Simpler way to answer this — look in Dr. Weart’s History.

    Click the link provided below to search there on “runaway”

    The first hit there explains what the word means; see the references.

    http://www.aip.org/servlet/SearchClimate?collection=CLIMATE&queryText=runaway&SEARCH-97.x=0&SEARCH-97.y=0

    Here’s a 1988 Sci. Am. reference as well:

    http://www.geosc.psu.edu/~kasting/PersonalPage/Pdf/Scientific_American_88.pdf

    Comment by Hank Roberts — 4 Oct 2008 @ 4:01 PM

  606. I’m not seeking a definition, Hank. No dictionaries required. I’m asking if Earth’s climate was running away at the time of the continental collision. A reply would begin with, say, “affirmative” or “negative”, and then supply a reason. Have you read all the references you cite? What do they say on this question?

    Comment by Richard Sycamore — 4 Oct 2008 @ 10:28 PM

  607. #602 Richard Sycamore,
    There was no runaway change under way at the time, just amplifications of any changes in forcings. There was plenty of time for the climate to reach new equilibria after any change in carbon dioxide levels. Most of the feedbacks are very quick on a geological time scale but some are not so on ours.

    Changes in radiation flux due to changes in water vapour and clouds occur in weeks. The consequent temperature changes can take decades.

    Carbon dioxide can take centuries to reach new equilibria and can take tens of thousands of years to return completely to previous levels once a source has stopped emitting. On a geological scale this is quick.

    Comment by Lloyd Flack — 5 Oct 2008 @ 1:13 AM

  608. #607
    Thank you for an honest attempt to answer the question. This more or less agrees with the fictional model I have in mind. But the question is: what is the basis for this model? How do you know that the equilibrium point was rising prior to the continental collision, and that it was not the result of a runaway event across a tipping point? And how do you know that the equilibrium point would not have continued to rise were it not for continental collision? Are these just guesses? Or are they strong inferences from a megayear-long GCM run?

    Comment by Richard Sycamore — 5 Oct 2008 @ 10:24 AM

  609. > what is the basis for this model?
    Wrong assumption. It’s not a model.

    > Are these just guesses?
    No.

    > are they strong inferences from a megayear-long GCM run?
    No.

    > How do you know that the equilibrium point
    Wrong assumption. Variation, within a livable (so far) range.

    Yes, I read each of the pages I suggested to you, yesterday before I posted them. Three pages. You can do it too. I have confidence.

    > What do they say on this question?

    You can look it up.

    Otherwise all you have is some guy on a blog as a source, and can go on arguing endlessly.

    You wouldn’t like doing that, would you?

    Comment by Hank Roberts — 5 Oct 2008 @ 10:48 AM

  610. #609
    The Sci. Am. paper dates to 1988. But Ruddiman’s hypothesis, cited by Hansen, is more recent than that. So rather than citing old science, why not just give me your modern-day synopsis? You can do it. I have confidence.

    Comment by Richard Sycamore — 5 Oct 2008 @ 1:18 PM

  611. http://www.realclimate.org/index.php?p=272
    5 July 2006 Runaway tipping points of no return
    Gavin’s several-paragraph introduction there sums it up well.
    Read it all, don’t rely on snippets posted by some guy on a blog:

    “People often conclude that the existence of positive feedbacks must imply ‘runaway’ effects i.e. the system spiralling out of control. However, while positive feedbacks are obviously necessary for such an effect, they do not by any means force that to happen. Even in simple systems, small positive feedbacks can lead to stable situations as long as the ‘gain’ factor is less than one (i.e. for every initial change in the quantity, the feedback change is less than the original one). A simple example leads to a geometric series … This series converges if |r|<1, and diverges (‘runs away’) otherwise. You can think of the Earth’s climate (unlike Venus’) as having an ‘r‘ less than one, i.e. no ‘runaway’ effects, but plenty of positive feedbacks.”
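    To put rough numbers on that gain factor (illustrative values only; the 1.2 is the commonly quoted no-feedback warming in deg C for doubled CO2, and r is just an assumed gain):

        def total_response(initial_change, r, n_terms=1000):
            # Partial sum of the geometric series initial_change * (1 + r + r**2 + ...)
            return sum(initial_change * r ** k for k in range(n_terms))

        print(total_response(1.2, 0.6))   # about 3.0, i.e. 1.2 / (1 - 0.6)
        print(1.2 / (1.0 - 0.6))          # closed-form limit, valid for |r| < 1
        # For r >= 1 the partial sums grow without bound; that is what a true
        # "runaway" would mean, and it is not the Earth's regime.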

    Comment by Hank Roberts — 5 Oct 2008 @ 1:54 PM

  612. #611
    Hank, your presumption that I need a primer is incorrect. My question is well-posed and quite specific; there is no need to cite glossaries or spurious textbook material. If you can not answer it, that is fine, please let someone else try. Thanks.

    Comment by Richard Sycamore — 5 Oct 2008 @ 2:42 PM

  613. > no need …

    I haven’t cited either a glossary or a textbook; either might help.

    Best of luck, I’m done.

    Comment by Hank Roberts — 5 Oct 2008 @ 7:40 PM

  614. Gavin,
    What is the story behind the non-launch of DSCOVR? Internal NASA politics, outside political pressure, or what?

    Also, isn’t a similar satellite at the L2 position required as well, to observe the nighttime changes? After all, much of the trend comes from the reduction in re-radiation of heat at night.

    Comment by Lloyd Flack — 6 Oct 2008 @ 1:32 AM

  615. Some source material:

    http://mitchellanderson.blogspot.com/2008/03/revealed-bush-killed-dscovr-mission.html
    (FOIA results)

    http://www.scribd.com/doc/2318466/Scientis-Letters-Only

    http://www.sciencemag.org/cgi/content/citation/311/5762/775c

    L2 is
    – about 4x as far away as the Moon, and
    – a vantage point from which you would be looking at Earth directly against the background of the Sun, which is not the best way to get consistent readings of Earth’s night side. (A quick check of the distance figure is sketched below.)
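    A quick back-of-envelope check of that “about 4x” figure, using the standard approximation for the Sun-Earth L1/L2 distance (all values rounded; this is just an order-of-magnitude sketch):

        # Collinear Lagrange-point (Hill-radius) approximation:
        # r is about a * (m_Earth / (3 * m_Sun)) ** (1/3)
        a_km = 1.496e8            # mean Earth-Sun distance
        mass_ratio = 3.0e-6       # Earth mass / Sun mass
        r_l2_km = a_km * (mass_ratio / 3.0) ** (1.0 / 3.0)
        print(r_l2_km)            # roughly 1.5 million km
        print(r_l2_km / 3.844e5)  # roughly 3.9 lunar distances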

    Comment by Hank Roberts — 6 Oct 2008 @ 4:02 PM

  616. Hey, good news! I’m wrong, smart people knew better, and plans exist:

    L-1 and L-2 observatories for Earth science in the post-2010 era
    Wiscombe, W.; Herman, J.; Valero, F.
    Geoscience and Remote Sensing Symposium, 2002 (IGARSS ’02), 2002 IEEE International
    Volume 1, 2002, pp. 365–367
    Digital Object Identifier 10.1109/IGARSS.2002.1025041
    Summary: Twin observatories 1.5 million km from Earth along the Earth-Sun line offer revolutionary possibilities for Earth observation and scientific progress.

    http://start1.jpl.nasa.gov/caseStudies/eao-l2.cfm

    The Sun-Earth L2 point is a location from which a spacecraft would see Earth permanently eclipse the Sun, leaving only a thin ring of sunlight (called the solar annulus) surrounding the planet. It’s a relatively stable position, thanks to the combined gravitational effects of the Sun and Earth. As Earth orbits the Sun, a spacecraft at L2 would remain about 1.5 million km (about one million miles) above Earth’s nightside.

    This configuration makes it a uniquely desirable place from which to study long-term changes in Earth’s atmosphere. Spectrometers stationed at L2 would enable scientists to analyze the atmosphere by observing its effect on the sunlight shining through it.

    Our task was to develop a concept for a space telescope mission to make a detailed study, from the L2 vantage point, of the atmosphere’s constituents and dynamics over a span of 10 years. The hypothetical launch date was set at 2025 to 2030, to allow time for needed technology improvements. …

    Comment by Hank Roberts — 6 Oct 2008 @ 7:53 PM

    Hank, et al: How much better is the spectral data from L1/L2 satellites than from regular orbiting satellites?

    Comment by Rod B — 7 Oct 2008 @ 12:41 PM

  618. Rod and Hank,
    On L1 and L2: first, the appeal is that the satellites have a mostly constant view of Earth rotating beneath them. I say mostly constant, as the satellites actually orbit the L1/L2 point. The distance is also sufficiently large that the satellites have a full view of the hemisphere they focus on. There is also the advantage that you don’t have variations due to eclipses of the Sun (i.e. pretty constant illumination).
    How good the spectral measurements are depends on the instruments, but the Lagrange points provide a good platform for instrument optimization. How good can they get? Well, the James Webb Space Telescope’s detectors will have 6 electron read noise (we hope)!

    FWIW, DSCOVR/Triana is now out of mothballs. Not sure exactly what will be done to its instrument suite, but it looks like it might fly in some form or another.

    Comment by Ray Ladbury — 7 Oct 2008 @ 1:19 PM

  619. Rod, recall I’m just some guy on a blog and know nothing (grin), I’m just pointing to sources I find.

    Short answer: a consistent location — a stable point of view — has been lacking, and will be invaluable in collecting satellite data on Earth’s climate. Many open questions could have been resolved by now. See various longer answers about the planned mission for Triana/DSCOVR, easy to find.

    A shotgun search of Google here:
    http://www.google.com/search?q=satellite+data+various+sets+pieced+together

    Turns up, just as an example:

    PMOD vs ACRIM (part 2) 7/27/07 …piecing the various satellite data series together to form a single continuous data set is a rather complex process.
    tamino.wordpress.com/2007/07/27/pmod-vs-acrim-part-2/

    Comment by Hank Roberts — 7 Oct 2008 @ 3:41 PM

  620. PS — an article worth reading in full, on topic:

    http://www.geolsoc.org.uk/gsl/geoscientist/features/page2617.html

    Excerpt below only the closing paragraphs; read the article to understand why they reach this conclusion.
    ——-

    Current atmospheric carbon dioxide concentrations, at over 380ppm, already exceed the peak level of carbon dioxide during past interglacials that we have measured (from air bubbles trapped in ice cores by the European Project for Ice Coring in Antarctica (EPICA)) going back over 400,000 years. Consequently, even though the full climate impact of this greenhouse ‘climate forcing’ has not yet become manifest, it has been delivered.

    When might we expect to see massive methane hydrate dissociation? This is a good question but (currently) a complete unknown. One thing we do know is that it is unlikely to be before the Earth warms in excess of the last glacial maximum; we may therefore have a few decades of complete safety left to us, at the very least. Second, given that oceanic mixing times are in the order of centuries, it may be we have a few centuries to go at the most. (Though remember: the first 300m or so of the ocean have already begun to warm – the fuse is lit.)

    The IPCC 2007 Assessment skates around the early Eocene-analogue issues. Its chapter on palaeoclimates does have a subsection on the early Eocene and the chapter on the atmosphere does cover the present CIE due to fossil fuel release and deforestation. However otherwise the IPCC do not connect the two. The closest it comes to making a definitive statement on the Eocene event as an analogue is:

    “Although there is still too much uncertainty in the data to derive a quantitative estimate of climate sensitivity from the PETM [Palaeocene Eocene Thermal Maximum], the event is a striking example of massive carbon release and related extreme climatic warming.”

    Clearly the implications for future research are considerable. Research into the biosphere processes of the early Eocene (or the Toarcian, another CIE event) simply has not had anything like the multi-million pound investment that other areas of climate science have been afforded. If it is not in the literature then it is not in the IPCC assessments. However now that those few scientists who have worked on the Eocene (and Toarcian) have done enough to demonstrate that past CIEs events very likely represent palaeoanalogues to the warming we are currently inducing, we now need to grapple with the detail.

    All this will come as no surprise to Geoscientist readers, because last year some members of the Stratigraphy Commission flagged Eocene and Toarcian climate analogues as a priority for policy makers. Perhaps it is time to consider the IETM as ‘a striking example’ that warrants more than a brief subsection within an IPCC assessment chapter?

    Comment by Hank Roberts — 18 Oct 2008 @ 10:52 PM

  621. Yeek!

    http://www.nature.com/nature/journal/v427/n6970/abs/nature02172.html

    Letters to Nature

    Nature 427, 142-144 (8 January 2004) | doi:10.1038/nature02172; Received 16 June 2003; Accepted 5 November 2003
    Critically pressured free-gas reservoirs below gas-hydrate provinces

    Matthew J. Hornbach, Demian M. Saffer and W. Steven Holbrook

    1. Department of Geology and Geophysics, University of Wyoming, Laramie, Wyoming 82071, USA

    Correspondence to: Matthew J. Hornbach, Email: mhornbac@uwyo.edu

    Palaeoceanographic data have been used to suggest that methane hydrates play a significant role in global climate change. The mechanism by which methane is released during periods of global warming is, however, poorly understood [1]. In particular, the size and role of the free-gas zone below gas-hydrate provinces remain relatively unconstrained, largely because the base of the free-gas zone is not a phase boundary and has thus defied systematic description. Here we evaluate the possibility that the maximum thickness of an interconnected free-gas zone is mechanically regulated by valving caused by fault slip in overlying sediments [2]. Our results suggest that a critical gas column exists below most hydrate provinces in basin settings, implying that these provinces are poised for mechanical failure and are therefore highly sensitive to changes in ambient conditions [3]. We estimate that the global free-gas reservoir may contain from one-sixth to two-thirds of the total methane trapped in hydrate [4]. If gas accumulations are critically thick along passive continental slopes, we calculate that a 5 °C temperature increase at the sea floor could result in a release of approx. 2,000 Gt of methane from the free-gas zone, offering a mechanism for rapid methane release during global warming events.
    ——-
    Cited by ten more recent articles:
    http://www.nature.com/cited/cited.html?doi=10.1038/nature02172


    Comment by Hank Roberts — 30 Oct 2008 @ 6:32 PM

  622. Some of the citing articles are from 2008. Is the concept of “valving” as new to the professionals as it seems to be here?
    I’d always had the notion that methane was expected to trickle out slowly with slow temperature change. But this seems … different.

    Comment by Hank Roberts — 4 Nov 2008 @ 4:44 PM

  623. Just a short question from an interested party (definitely a beginner) – did the temperature of the sun rise before the fairly recent switch of the poles? If the magnetic poles are meandering at a geologically rapid pace, would this have any effect on the temperature of the earth’s core?

    Comment by jay — 4 Nov 2008 @ 8:12 PM

Sorry, the comment form is closed at this time.
