

FAQ on climate models

Filed under: — group @ 3 November 2008 - (Svenska)

We discuss climate models a lot, and from the comments here and in other forums it’s clear that there remains a great deal of confusion about what climate models do and how their results should be interpreted. This post is designed to be a FAQ for climate model questions – of which a few are already given. If you have comments or other questions, ask them as concisely as possible in the comment section and if they are of enough interest, we’ll add them to the post so that we can have a resource for future discussions. (We would ask that you please focus on real questions that have real answers and, as always, avoid rhetorical excesses).

Part II is here.

Quick definitions:

  • GCM – General Circulation Model (sometimes Global Climate Model) which includes the physics of the atmosphere and often the ocean, sea ice and land surface as well.
  • Simulation – a single experiment with a GCM
  • Initial Condition Ensemble – a set of simulations using a single GCM but with slight perturbations in the initial conditions. This is an attempt to average over chaotic behaviour in the weather.
  • Multi-model Ensemble – a set of simulations from multiple models. Surprisingly, an average over these simulations gives a better match to climatological observations than any single model.
  • Model weather – the path that any individual simulation will take has very different individual storms and wave patterns than any other simulation. The model weather is the part of the solution (usually high frequency and small scale) that is uncorrelated with another simulation in the same ensemble.
  • Model climate – the part of the simulation that is robust and is the same in different ensemble members (usually these are long-term averages, statistics, and relationships between variables).
  • Forcings – anything that is imposed from the outside that causes a model’s climate to change.
  • Feedbacks – changes in the model that occur in response to the initial forcing that end up adding to (for positive feedbacks) or damping (negative feedbacks) the initial response. Classic examples are the amplifying ice-albedo feedback, or the damping long-wave radiative feedback.
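
    To make the "model weather" vs "model climate" distinction concrete, here is a toy sketch (emphatically not a GCM) using the Lorenz-63 system, a standard stand-in for chaotic dynamics; the parameter values, step size and ensemble size are illustrative choices only:

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system (a classic toy chaotic model)."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def run(state, n_steps):
    traj = np.empty((n_steps, 3))
    for i in range(n_steps):
        state = lorenz_step(state)
        traj[i] = state
    return traj

rng = np.random.default_rng(0)
base = np.array([1.0, 1.0, 20.0])

# Initial Condition Ensemble: ten runs differing only by tiny perturbations
ensemble = [run(base + 1e-6 * rng.standard_normal(3), 20000) for _ in range(10)]

# "Model weather": the endpoint of each trajectory is completely different
x_final = np.array([traj[-1, 0] for traj in ensemble])

# "Model climate": long-term statistics are nearly the same in every member
mean_z = np.array([traj[:, 2].mean() for traj in ensemble])

print("final x per member:", np.round(x_final, 1))
print("time-mean z per member:", np.round(mean_z, 1))
```

    The individual trajectories decorrelate completely (different "weather"), while the time-averaged statistics agree closely across members (the same "climate"), which is exactly why initial condition ensembles are averaged.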


  • What is the difference between a physics-based model and a statistical model?

    Models in statistics, or in many colloquial uses of the term, often imply a simple relationship that is fitted to some observations: a linear regression line through a change of temperature with time, or a sinusoidal fit to the seasonal cycle, for instance. More complicated fits are also possible (neural nets, for instance). These statistical models are very efficient at encapsulating existing information concisely, and as long as things don’t change much they can provide reasonable predictions of future behaviour. However, they aren’t much good for predictions if you know the underlying system is changing in ways that might affect how your original variables interact.

    Physics-based models, on the other hand, try to capture the real physical cause of any relationship, which is hopefully understood at a deeper level. Since those fundamentals are not likely to change in the future, the expectation of a successful prediction is higher. A classic example is Newton’s law of motion, F=ma, which can be used in multiple contexts to give highly accurate results completely independently of the data Newton himself had on hand.

    Climate models are fundamentally physics-based, but some of the small scale physics is only known empirically (for instance, the increase of evaporation as the wind increases). Thus statistical fits to the observed data are included in the climate model formulation, but these are only used for process-level parameterisations, not for trends in time.
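
    As an illustration of the contrast (with entirely made-up numbers), a purely statistical fit works while the system behaves as it did during the fitting period, but it has no way of anticipating a change in the underlying forcing:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(50, dtype=float)

# "Training" period (t = 0..29): no trend, just weather-like noise
y_train = rng.standard_normal(30) * 0.2

# Statistical model: a straight line fitted to the training data
slope, intercept = np.polyfit(t[:30], y_train, 1)
y_pred = slope * t[30:] + intercept

# After t = 30 the underlying forcing changes and a real trend kicks in,
# which the purely statistical extrapolation misses entirely
y_true = 0.1 * (t[30:] - 30.0) + rng.standard_normal(20) * 0.2

err = np.abs(y_pred - y_true).mean()
print("mean extrapolation error once the system changes:", round(err, 2))
```

    A physics-based model that knew why the trend appeared would not be fooled in the same way; that is the whole point of the distinction.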

  • Are climate models just a fit to the trend in the global temperature data?

    No. Much of the confusion concerning this point stems from a misunderstanding of the point above. Model development does not actually use the trend data in tuning (see below). Instead, modellers work to improve the climatology of the model (the fit to the average conditions) and its intrinsic variability (such as the frequency and amplitude of tropical variability). The resulting model is pretty much used ‘as is’ in hindcast experiments for the 20th Century.

  • Why are there ‘wiggles’ in the output?

    GCMs perform calculations with timesteps of about 20 to 30 minutes so that they can capture the daily cycle and the progression of weather systems. As with weather forecasting models, the weather in a climate model is chaotic. Starting from a very similar (but not identical) state, a different simulation will ensue – with different weather, different storms, different wind patterns – i.e. different wiggles. In control simulations, there are wiggles at almost all timescales – daily, monthly, yearly, decadally and longer – and modellers need to test very carefully how much of any change that follows a change in forcing is really associated with that forcing, and how much might simply be due to the internal wiggles.
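
    The point that slow wiggles arise even with no forcing at all can be sketched with a toy red-noise "control run" (an AR(1) process standing in for a slow ocean integrating fast weather noise; the coefficients are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(2)
n_years = 200

# Toy "control run": fast white noise ("weather") integrated by a slow
# component (think ocean heat capacity) produces wiggles on all timescales
# even though nothing is forcing the system
T = np.zeros(n_years)
for i in range(1, n_years):
    T[i] = 0.9 * T[i - 1] + 0.1 * rng.standard_normal()

# Short windows show sizeable apparent "trends" that are pure internal wiggle
decadal_trends = np.array(
    [np.polyfit(np.arange(10), T[i:i + 10], 1)[0] for i in range(0, n_years - 10, 10)]
)
long_trend = np.polyfit(np.arange(n_years), T, 1)[0]

print("largest spurious decadal trend:", round(np.abs(decadal_trends).max(), 3))
print("full 200-yr trend:", round(long_trend, 4))
```

    The decadal "trends" dwarf the long-term one, which is why attributing a change to a forcing requires comparing it against the model's internal variability.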

  • What is robust in a climate projection and how can I tell?

    Since not every wiggle is significant, modellers need to assess how robust particular model results are. They do this by seeing whether the same result appears in other simulations and with other models, whether it makes physical sense, and whether there is evidence of similar behaviour in the observational or paleo record. If a result is seen in multiple models and multiple simulations, it is likely to be a robust consequence of the underlying assumptions; in other words, it probably isn’t due to any of the relatively arbitrary choices that mark the differences between different models. If the magnitude of the effect makes theoretical sense independent of these kinds of models, that adds to its credibility, and if the effect matches what is seen in observations, that adds more. Robust results are therefore those that quantitatively match in all three domains. Examples are the warming of the planet as a function of increasing greenhouse gases, or the change in water vapour with temperature. All models show basically the same behaviour, which is in line with basic theory and observations. Examples of non-robust results are the changes in El Niño as a result of climate forcings, or the impact on hurricanes. In both of these cases, models produce very disparate results, the theory is not yet fully developed, and observations are ambiguous.

  • How have models changed over the years?

    Initially (ca. 1975), GCMs were based purely on atmospheric processes – the winds, radiation, and with simplified clouds. By the mid-1980s, there were simple treatments of the upper ocean and sea ice, and cloud parameterisations started to get slightly more sophisticated. In the 1990s, fully coupled ocean-atmosphere models started to become available. This is when the first Coupled Model Intercomparison Project (CMIP) was started. This has subsequently seen two further iterations, the latest (CMIP3) being the database used in support of much of the model work in the IPCC AR4. Over that time, model simulations have become demonstrably more realistic (Reichler and Kim, 2008) as resolution has increased and parameterisations have become more sophisticated. Nowadays, models also include dynamic sea ice, aerosols and atmospheric chemistry modules. Issues like excessive ‘climate drift’ (the tendency for a coupled model to move away from a state resembling the actual climate), which were problematic in the early days, are now much reduced.

  • What is tuning?

    We are still a long way from being able to simulate the climate with a true first-principles calculation. While many basic aspects of physics can be included (conservation of mass, energy etc.), many need to be approximated for reasons of efficiency or resolution (i.e. the equations of motion need estimates of sub-gridscale turbulent effects, radiative transfer codes approximate the line-by-line calculations using band averaging), and still others are only known empirically (the formula for how fast clouds turn to rain, for instance). With these approximations and empirical formulae, there is often a tunable parameter or two that can be varied in order to improve the match to whatever observations exist. Adjusting these values is described as tuning and falls into two categories. First, there is the tuning of a single formula in order for that formula to best match the observed values of that specific relationship. This happens most frequently when new parameterisations are being developed.

    Secondly, there are tuning parameters that control aspects of the emergent system. Gravity wave drag parameters are not very constrained by data, and so are often tuned to improve the climatology of stratospheric zonal winds. The threshold relative humidity for making clouds is often tuned to get the most realistic cloud cover and global albedo. Surprisingly, there are very few of these (maybe a half dozen) that are used in adjusting the models to match the data. It is important to note that these exercises are done with the mean climate (including the seasonal cycle and some internal variability) – and once set, they are kept fixed for any perturbation experiment.
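
    A minimal sketch of the first kind of tuning, using a hypothetical autoconversion-style formula (the formula, the parameter range and the "observations" are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical single-formula tuning: an invented autoconversion-style rule,
# rain_rate = k * cloud_water**2, where k is the tunable parameter
cloud_water = np.linspace(0.1, 1.0, 40)
observed_rain = 1.3 * cloud_water**2 + 0.05 * rng.standard_normal(40)  # synthetic "obs"

def rms_error(k):
    return np.sqrt(np.mean((k * cloud_water**2 - observed_rain) ** 2))

# Tune: scan a plausible range of k and keep the value that best matches obs
candidates = np.linspace(0.5, 2.0, 151)
best_k = candidates[np.argmin([rms_error(k) for k in candidates])]
print("tuned k:", round(float(best_k), 2))
```

    Once a value like this is set against the observed relationship, it is held fixed for the perturbation experiments; it is not re-adjusted to make the trend come out "right".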

  • How are models evaluated?

    The amount of data that is available for model evaluation is vast, but falls into a few clear categories. First, there is the climatological average (maybe for each month or season) of key observed fields like temperature, rainfall, winds and clouds. This is the zeroth-order comparison to see whether the model is getting the basics reasonably correct. Next comes the variability in these basic fields: does the model have a realistic North Atlantic Oscillation, or ENSO, or MJO? These are harder to match (and indeed many models do not yet have realistic El Niños). More subtle are comparisons of relationships in the model and in the real world. This is useful for short data records (such as those retrieved by satellite) where there is a lot of weather noise one wouldn’t expect the model to capture. In those cases, looking at the relationship between temperature and humidity, or cloudiness and aerosols, can give insight into whether the model processes are realistic or not.

    Then there are the tests of climate changes themselves: how does a model respond to the addition of aerosols in the stratosphere such as was seen in the Mt Pinatubo ‘natural experiment’? How does it respond over the whole of the 20th Century, or at the Maunder Minimum, or the mid-Holocene or the Last Glacial Maximum? In each case, there is usually sufficient data available to evaluate how well the model is doing.
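
    A zeroth-order evaluation of the kind described above boils down to a few summary statistics; here is a sketch comparing a simulated monthly climatology against observations, with synthetic stand-ins for both:

```python
import numpy as np

rng = np.random.default_rng(4)

# Zeroth-order check: compare a simulated monthly temperature climatology
# against (synthetic, stand-in) observations for the 12 calendar months
months = np.arange(12)
obs = 15.0 + 10.0 * np.sin(2 * np.pi * (months - 3) / 12)  # "observed" seasonal cycle
model = obs + 0.5 + 1.5 * rng.standard_normal(12)          # model with a bias and errors

bias = (model - obs).mean()
rmse = np.sqrt(((model - obs) ** 2).mean())
corr = np.corrcoef(model, obs)[0, 1]
print(f"bias={bias:.2f} K  rmse={rmse:.2f} K  corr={corr:.2f}")
```

    Real evaluations do this field by field, season by season, and region by region, but the ingredients (bias, RMS error, pattern correlation) are the same.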

  • Are the models complete? That is, do they contain all the processes we know about?

    No. While models contain a lot of physics, they don’t contain many small-scale processes that more specialised groups (of atmospheric chemists, or coastal oceanographers for instance) might worry about a lot. Mostly this is a question of scale (model grid boxes are too large for the details to be resolved), but sometimes it’s a matter of being uncertain how to include it (for instance, the impact of ocean eddies on tracers).

    Additionally, many important bio-physical-chemical cycles (for the carbon fluxes, aerosols, ozone) are only just starting to be incorporated. Ice sheet and vegetation components are very much still under development.

  • Do models have global warming built in?

    No. If left to run on their own, the models will oscillate around a long-term mean that is the same regardless of what the initial conditions were. Given different drivers, volcanoes or CO2 say, they will warm or cool as a function of the basic physics of aerosols or the greenhouse effect.

  • How do I write a paper that proves that models are wrong?

    Much more simply than you might think since, of course, all models are indeed wrong (though some are useful – George Box). Showing a mismatch between the model output and the observational data is made much easier if you recall the signal-to-noise issue we mentioned above. As you go to smaller spatial and shorter temporal scales, the amount of internal variability increases markedly, and so the number of diagnostics which will be different from the expected values from the models will increase (in both directions of course). So pick a variable, restrict your analysis to a small part of the planet, and calculate some statistic over a short period of time and you’re done. If the models match through some fluke, make the space smaller and use a shorter time period and eventually they won’t. Even if models get much better than they are now, this will always work – call it the RealClimate theory of persistence. Now, appropriate statistics can be used to see whether these mismatches are significant and not just the result of chance or cherry-picking, but a surprising number of papers don’t bother to check such things correctly. Getting people outside the, shall we say, more ‘excitable’ parts of the blogosphere to pay any attention is, unfortunately, a lot harder.
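
    The "persistence" recipe can be illustrated with a quick Monte Carlo sketch (illustrative numbers): even when model and reality share exactly the same underlying trend, the spread of apparent trends, and hence the chance of finding an impressive-looking mismatch, grows as the window shrinks:

```python
import numpy as np

rng = np.random.default_rng(5)
TRUE_TREND = 0.02  # deg/yr, shared by "model" and "reality" in this sketch

def apparent_trend(n_years):
    """Trend estimated from n_years of data with year-to-year internal variability."""
    t = np.arange(n_years)
    series = TRUE_TREND * t + 0.15 * rng.standard_normal(n_years)
    return np.polyfit(t, series, 1)[0]

# Spread of apparent trends for long, medium and short windows
spreads = {n: np.std([apparent_trend(n) for _ in range(2000)]) for n in (50, 20, 8)}
for n, s in spreads.items():
    print(f"{n:2d}-yr windows: 1-sigma trend spread = {s:.4f} deg/yr")
```

    Shrinking the window inflates the spread of apparent trends many times over, so some window somewhere will always look like a dramatic model-data mismatch unless the significance is checked properly.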

  • Can GCMs predict the temperature and precipitation for my home?

    No. There are often large variations in the temperature and precipitation statistics over short distances because the local climatic characteristics are affected by the local geography. The GCMs are designed to describe the most important large-scale features of the climate, such as the energy flow, the circulation, and the temperature in a grid-box volume (through the physical laws of thermodynamics, the dynamics, and the ideal gas law). A typical grid-box may have a horizontal area of ~100×100 km², but the size has tended to shrink over the years as computers have increased in speed. The shape of the landscape (the details of mountains, coastline etc.) used in the models reflects the spatial resolution, hence the model will not have sufficient detail to describe local climate variations associated with local geographical features (e.g. mountains, valleys, lakes, etc.). However, it is possible to use a GCM to derive some information about the local climate through downscaling, since the local climate is affected both by the local geography (a more or less fixed constant) and by the large-scale atmospheric conditions. The results derived through downscaling can then be compared with observed local climate variables, providing a further (and more stringent) assessment of the combined model-downscaling technique. This is, however, still an experimental technique.
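
    A bare-bones sketch of empirical-statistical downscaling (all numbers synthetic): regress a local station variable on the large-scale value a GCM can resolve, train the relation on an overlap period, then apply it to new large-scale output:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic "large-scale" grid-box temperature (what a GCM resolves)...
large_scale = 10.0 + 3.0 * rng.standard_normal(500)
# ...and a local station tied to it by (made-up) local geography
station = 0.8 * large_scale - 2.0 + 0.5 * rng.standard_normal(500)

# Train the downscaling relation on an "observed" overlap period
a, b = np.polyfit(large_scale[:400], station[:400], 1)

# Apply it to new large-scale values, as one would to GCM output
pred = a * large_scale[400:] + b
rmse = np.sqrt(np.mean((pred - station[400:]) ** 2))
print(f"downscaling relation: local = {a:.2f} * large_scale + {b:.2f}; rmse = {rmse:.2f}")
```

    Real downscaling uses circulation patterns, multiple predictors and careful validation rather than a single regression, but the underlying idea is this: the geography sets the relation, the GCM supplies the large-scale input.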

  • Can I use a climate model myself?

    Yes! There is a project called EdGCM which has a nice interface and works with Windows and lets you try out a large number of tests. ClimatePrediction.Net has a climate model that runs as a screensaver in a coordinated set of simulations. GISS ModelE is available as a download for Unix-based machines and can be run on a normal desktop. NCAR CCSM is the US community model and is well-documented and freely available.

464 Responses to “FAQ on climate models”

  1. 351
    Mark says:

    further on 351. The events CAN be substituted on a statistical model.

    Have we turned it back to the thread topic now, Hank?

  2. 352
    Nico says:

    I noticed the Hansen et al. article today about 350 ppm already being in the danger zone. What do you guys at RealClimate say about it, how does it differ from other models/articles saying that 450 ppm is the critical threshold, and what, in your opinion, are the implications for the stabilization efforts of the EU, (possibly) the US and other countries?


    [Response: We discussed this a while back. - gavin]

  3. 353
    Hank Roberts says:

    – A FAQ about the importance of rate of change (and rate of change OF rate of change, and the need to at least understand what calculus can explain if not how to do it).

    – A FAQ item for ocean acidification/ph change.
    Relevant to the need for cross-disciplinary work, too.

    This may help:

    “… expect to see changes in pH that are three times greater and 100 times faster than those experienced during the transitions from glacial to interglacial periods. Such large changes in ocean pH have probably not been experienced on the planet for the past 21 million years.”

    “In May 2004, SCOR and UNESCO-IOC co-hosted an international symposium to address these issues …. the two organizations agreed to make this symposium a regular event to be held every 4 years.

    “… joined by two new international organizations: the International Atomic Energy Agency’s Marine Environmental Laboratory and the International Geosphere-Biosphere Programme, enhancing links to the UN system and to interdisciplinary Earth science.

    “This web-site is a follow-up of the first symposium and is meant to provide a central source of information for ocean scientists on research activities in this area.”

  4. 354
    David Wojick says:

    Regarding the FAQ “How have models changed over the years?” Saying that “model simulations have become demonstrably more realistic” and citing Reichler and Kim 2008 to prove it is overstating the case, to say the least.

    The Reichler paper is a study in strange statistics. They combine a large number of climate related state variables into a single number, like the Dow index. Then they show that one of the latest ensembles comes marginally closer to matching this number, for a single observed state of the climate, than did some previous generation ensembles. This index number bears no relation to the climate parameters of actual interest, other than containing them. Nor is there any consideration of trend matching.

    The reason they did not look at matching important model values with observation is because there has been no improvement. Same for trends. As for forecasts, the models are still all over the map. The fact that the range of temperature forecasts has not converged is actually a major research area. Many people think the unrealistic, high end forecasts are due to runaway positive feedbacks, in the models not in reality. But of course it is just these feedbacks that yield the forecasts of dangerous warming. All things considered the claim that the models have become “demonstrably more realistic” is unrealistic. Ironically, the Reichler paper begins by citing some papers to this effect.

  5. 355
    David Wojick says:

    The discussion of chaotic “wiggles” in the FAQ is puzzling and probably wrong. The FAQ begins with a correct observation, namely that “there are wiggles at almost all timescales – daily, monthly, yearly, decadally and longer…” But it then goes on to talk as though these wiggles are something to ignore, or to average out using ensembles. Neither is true.

    Climate is a chaotic system and the actual climate is not an ensemble. Decadal and longer internal oscillations could explain the temperature record of the last 100 years. The oscillations were relatively uniform — temps rose for the first third of the last century, then fell for the second third, then rose again for the third third. Now they are falling again. Chaotic systems are notoriously immune to forcing and the climate is one. These wiggles are something the modelers should be studying, not brushing aside.

    [Response: It just ain't so. How does that chaotic system respond to the seasonal forcing? Or is that just a figment of our imagination as well? Or orbital cycles? Or volcanoes? There are loads of examples demonstrating that there is a predictable forced response of the climate to forcings. That isn't to say that the wiggles aren't worth looking at - of course they are - but your implication that the presence of chaos implies that there can't be a predictable forced component is simply fallacious. - gavin]

  6. 356
    jcbmack says:

    Final recommended readings: Global Warming and Global Politics by Matthew Paterson and Climate Change in Prehistory by William James Burroughs.

  7. 357
    Stefano says:

    Ray Ladbury wrote:

    It seems to me that the precautionary principle applies to the scientific side of the argument–we know human civilization, with all its complicated infrastructure, functions with CO2 in the 280-350 ppmv range. If you are going to move outside of that range, you damned well better show that it will not affect infrastructure catastrophically.
    What is more, we know for an absolute fact that energy infrastructure MUST change due to peak oil.

    Ray, CO2 is just one variable. On the subject of catastrophes, what about overpopulation? This is the mother of all problems. What if we need to quickly stop population growth? People in the most developed countries have the fewest children. It follows that to stop population growth we need the developing world to grow their standard of living as fast as possible–in as few generations as possible. You don’t want four kids having four kids having four kids. You want four kids to have two, and them to have just one. In the meantime to fuel that growth, CO2 levels may go up quite a bit, as coal and oil are quick to obtain, but you’ll have done something arguably far more important, which is to urgently and rapidly reach the point where we can reduce our numbers. Then everything starts returning to balance.

    Can we focus on emissions cuts and rapid development at the same time? Many governments seem to think not. Many think they should use their coal. There is a Japanese saying; “chase two rabbits, catch none”.

  8. 358
    jcbmack says:

    Seismologists are trying to gain a second or two of warning of the S wave from earthquakes, to assist people in getting under the table. Japan has better warning systems, as most of their earthquakes start offshore. Early detection is a tricky thing where earthquakes are concerned.

  9. 359
    Rod B says:

    Mark (327), ice albedo (change) and ocean currents strike me as more feedback stuff than forcings. Though I suppose a modeler can define any forcing he chooses and still be correct within his context and algorithms. Nor do I see that it matters, given that my question re grouping all forcings into two neat families was pretty much answered as unhelpful, likely wrong, and probably silly. I do understand feedback vs. forcing. What led you to believe I didn’t? And what on earth is the contention here? I’m totally missing our “disagreement”…

  10. 360

    How about a bibliographic FAQ?–people always want sources, and while the bibliography is doubtless out there, the layperson has a tough time finding the really crucial ones amidst the merely workmanlike. Maybe top 25 (or 40? 100?) historical climatological papers? It could certainly help demonstrate the historical depth of the science.

    (Captcha seems to think it’s a hot idea: “Combustion waves.”)

    [Response: The AIP: Discovery of global warming site has a good bibliography (linked on the sidebar). But there might be something to this... I'll think about it. - gavin]

  11. 361
    Phil Scadden says:

    Stefano – if you don’t cut CO2, then climate change is going to deliver a number of unpleasant ways to bring population down. If you want to give developing countries time to breathe, then the developed world is the place that needs to bring down CO2 emissions, fast and low.

  12. 362

    Can someone please answer the following questions about extreme climate events?
    Do you know if anyone has performed rough calculations of the effect on Earth’s average temperature of the following?

    1. Direct effect on Earth’s average temperature of complete permanent loss of Arctic ice-cap for half the year.

    2. Release of all CH4 and CO2 stored in permafrost both on land and under the Arctic Ocean bed (order of magnitude).

    3. I understand that there is not a linear relationship between CO2 concentration and Earth’s average temperature; that as CO2 concentration increases, the marginal effect on temperature decreases due to saturation of spectral lines. Does this mean that, under positive feedback processes that release very large quantities of CO2 into the atmosphere, there is a limit to the increase in the average temperature of Earth? If so, is this limit of the order of 10, 100 or 1000 degrees C? If the limit is of the order of 10 degrees C, then how is it possible that Venus has such high temperatures, much higher than its distance from the Sun would explain?

  13. 363
    Lawrence Coleman says:

    I just heard on the news yesterday that the latest oceanographic research (I didn’t catch the source unfortunately - my 3 y/o son was having a tantrum) says global crustacean stocks will collapse by 2030 if the acidification and warming of the oceans continue to rise. That’s 50 years ahead of the IPCC forecasts.
    If that were to occur then, in my understanding, global fisheries would also collapse, leading to world-wide famine, especially in the Asian countries. Is this so?

    [Response: Famine might be a little strong, but the health of many fisheries is certainly in a parlous state. - gavin]

  14. 364
    Hank Roberts says:

    Dr. D., try these for the answers to your third question.
    (found with Google, searching: runaway climate realclimate )

    Lessons from Venus:

    Recall Earth was apparently whacked hard by something about the size of Mars early on; that changed a primordial near twin of Venus, with a more typical amount of CO2 in its planetary atmosphere, to what we live on (Earth) and under (the Moon). Quite different.

  16. 366
    Chris Colose says:

    Dr. Mark Diesendorf

    I can only answer bits of 2 and 3

    Regarding #2, this paper has 1.6 trillion metric tons of carbon locked up in NH permafrost, but that is one area (deep soils in northeastern Siberia). They also say, “global soil C stocks from 0 to 3m depth (peatlands not included) have been estimated to be 2300 Pg (Jobbágy and Jackson 2000).” A “Pg” (petagram) is a billion metric tons.

    On #3: at very large concentrations of CO2 (such as when most of the atmosphere is CO2) the logarithmic behaviour no longer holds and sensitivity is increased; the logarithmic rule is mainly applicable to conditions relevant on Earth. What’s more, at very high pressures, collision-induced broadening becomes a relevant factor, which is different from the normal absorption associated with the spectral lines. Adding more greenhouse gases (even on Venus) will always produce some extra warming insofar as the temperature continues to decrease in the vertical (the lapse rate).

    In any case, the Komabayashi-Ingersoll limit gives a quantitative outlook on the runaway greenhouse, which depends on the acceleration due to gravity at the surface, how much sunlight you’re getting, etc. The thing is that when you put sufficient greenhouse gases in the atmosphere, the saturation vapor pressure puts a limit on how much stays there before it condenses out. On Earth, water vapor cannot accumulate indefinitely since it will just form a cloud and precipitate as snow or rain. Increasing temperature from extra CO2 does raise the saturation pressure and thus you get a bit of extra water vapor (a positive feedback), but it’s small compared to the Venus situation. Unlike Earth, where most of the water is liquid (the ocean), on Venus any input of extra CO2 or water vapor stays in the atmosphere (or goes to space). This kind of situation where the ocean boils away cannot happen here until the sun brightens sufficiently, which would take a billion years or longer, and the CO2 reservoir will be released until its supply is diminished. And it will be very much like Venus….
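
    For readers who want numbers: in the Earth-like regime where the logarithmic rule does hold, the widely used simplified forcing expression ΔF ≈ 5.35 ln(C/C0) W/m² (Myhre et al. 1998), combined here with an assumed, illustrative climate sensitivity parameter of 0.8 K per W/m², shows why each successive doubling adds roughly the same warming:

```python
import numpy as np

# Simplified CO2 forcing dF = 5.35 * ln(C/C0) W/m^2 (Myhre et al. 1998),
# valid in the Earth-like regime, combined with an ASSUMED sensitivity
# parameter of 0.8 K per W/m^2 (an illustrative round number)
def warming(c_ppm, c0_ppm=280.0, lam=0.8):
    return lam * 5.35 * np.log(c_ppm / c0_ppm)

# Each doubling adds roughly the same increment: the marginal effect shrinks
for c in (280, 560, 1120, 2240):
    print(f"{c:5d} ppm -> {warming(c):5.2f} K above preindustrial")
```

    As the comment above explains, this logarithmic accounting breaks down at Venus-like concentrations and pressures, so it says nothing about an upper limit there.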

  17. 367
    Mark says:

    RodB, 359. I thought it strange. You normally pick something less silly to be contrarian about.

    If you meant that your (and others’) question about “what is a forcing” was unhelpful, wrong and probably silly, you’ve hit the problem right on its head.

    You’re right as well that, in a model without sea ice modelled within it, sea ice has to be a forcing. It’s why the Sun is taken as one: the forcing is purely one way. A warmer earth doesn’t heat the Sun. I know there are newer models that model sea ice and likely land ice too.

    Hope that clears it up.

    The disagreement was over the idea that “what is a forcing” is an unhelpful question. It can only be answered for a specific model, and the answer can’t usefully be taken outside that model.

    NB: My goodness, the oracle does seem to be strange today: Household Nostrils?

  18. 368
    Mark says:

    Stefano, 357, which is why CO2 equivalent is the “450 maximum” limit. Includes the climatological effects of other gasses.

    As to overpopulation, only a command government like China can tell people to stop boffing. NO government will start a cull.

    So how do you suggest we get people to die off?

    Heck, economically that is stupid. Who is going to pay for the pensions? That only comes from a higher GDP being paid off to people who no longer work. If you don’t have more people, that means inflationary economy is mandatory unless we kill people off on retirement.

    Worse, the West is more densely populated in general and definitely more of a stress on the capabilities of the region they live in to support them (water aquifers being drained and not replenished is more common in the modernised west so indicating that they are exceeding the resources available). So the cull MUST take place here in the West rather than in the more populous (but less stressful on resources) poorer nations.

    Fancy shuffling off this mortal coil for the sake of the planet?

    Didn’t think so. Population control is for “other people” not you. Problem is, everyone thinks that.

  19. 369
    Mark says:

    Glen, regarding the past colder sun, read comment #31 in “Adapting to Amsterdam”.


  20. 370
    Vincent van der Goes says:

    A simple question – apologies if it has already been asked.

    On what timescales and scenarios are the models typically tested? For example, some interesting scenarios I could think of are:

    - simulating the events from 1900 up to now
    - simulating the events from 100,000 years ago up to the start of the industrial revolution
    - simulating the climate over a timescale of several hundred million years

    Have model runs so far been mainly concerned with one type of scenario?

    [Response: Transient runs from the 19th Century are common. From 1000AD to the present is becoming more so. For earlier periods (last ice age etc), models generally use time-slice experiments - these have been done for many different periods. - gavin]

  21. 371
    Barton Paul Levenson says:

    Hank Roberts writes:

    Recall Earth was apparently whacked hard by something about the size of Mars early on; that changed a primordial near twin of Venus, with a more typical amount of CO2 in its planetary atmosphere, to what we live on (Earth) and under (the Moon). Quite different.

    I think the runaway greenhouse on Venus is generally attributed to its proximity to the sun. Narrow models of the sun’s continuously habitable zone (CHZ) put the inner boundary at 0.95 AUs. Kasting’s most generous estimate puts it at 0.85 AUs. Venus’s semimajor axis is 0.72333 AUs.

  22. 372
    Ray Ladbury says:

    Stefano #357: Saying we have to address EITHER climate or development is a false dichotomy. Both are facets of the same underlying issue–sustainability. If we address climate in the industrial (or post-industrial?) world, but ignore developing countries, they will burn whatever they can get their hands on to meet their needs and the climate will warm anyway. If we address development at the expense of climate, agricultural production will likely collapse and with it any economic gains from our efforts.
    The fact of Peak Oil means that things have to change. The only question is whether things change so that we address all the threats that confront us, or do we screw it up and do a half-assed job by ignoring development, climate change, water quality, soil depletion, or any of a number of threats that confront us.

  23. 373
    simon abingdon says:

    gavin #347 “I think you are overthinking” Which do you recommend?

    [Response: I get it now. This is obviously a Turing test.... - gavin]

  24. 374
    JCH says:

    Ray, peak oil means things eventually have to change, but it does not mean that mankind will not burn most of the rest of it in a big hurry. Oil is easy to use. Oil, even at $147 a barrel, is cheap.

    Straws in the ground; suck that milkshake.

    At $60 a barrel, expect oil consumption behaviors to expand rapidly. The local GM plant is on full overtime. What do they build? Big pickups and SUVs.

  25. 375
    Ray Ladbury says:

    JCH, Yeah, I know. I really don’t think humanity is smart enough to avoid the “empirical” approach to answering the question of how climate change will affect civilization. Still, it’s a far cry from using up the rest of the oil to using all the oil + coal + tar sands + oil shale…. The latter three are less easy to use and require significant changes in infrastructure. The intelligent thing to do would be to invest in new infrastructure that also addresses climate concerns, but then, it probably ought to tell us something that alien civilizations have given our planet a wide berth in their search for intelligent life in the cosmos.

  26. 376
    David B. Benson says:

    Vincent van der Goes (370) — There is a paper by Abe et al. describing a GCM run from the Eemian interglacial (125,000 ya) to the peak of the Holocene (about 7000 ya).

    I found this most impressive.

  27. 377
    JCH says:

    The mantra of getting off foreign oil means the tar sands in Canada and Venezuela are going to be exploited with gusto. Venezuela and Canada already ship significant quantities, and Venezuela is offering new partnerships at this very moment. So the straws in the tar sands already exist, and lots of new ones are about to be plunged. It’s just an extra-thick milkshake.

    ExxonMobil and Venezuela are embroiled in a nasty legal battle. And every so often a ship full of Venezuelan tar shows up at a remote, well-hidden ExxonMobil refinery, built especially for that extra-thick milkshake, and the two enemies put down their motions and subpoenas and refine money.

  28. 378
    David Wojick says:

    Regarding the FAQ “What is robust in a climate projection and how can I tell?” Given that the results differ a lot from model to model, it follows from this FAQ that none of these results is robust. That the planet will warm some with large GHG increases may be relatively robust, but whether such warming will be significant is not, because the models disagree. Moreover the paleo record is ambiguous here, given that temperature seems to lead GHG rise. Nor is theory unequivocal because there is reason to believe that the models lack negative feedbacks found in nature. And of course if one or more of the alternative hypotheses currently under investigation turns out to be true then the model results are just wrong, robust or not.

  29. 379
    Hank Roberts says:

    That doesn’t make sense, David.

    What definition of “robust” are you using?

    I don’t see a definition on your “debate” blog, but you’re not using the standard meaning (that any one of a group of different data sources can be removed without changing the result of the study).
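That leave-one-out sense of “robust” can be sketched in a few lines of Python (the trend values and the `is_robust` helper below are invented for illustration, not taken from any study):

```python
# Leave-one-out robustness check: a conclusion is "robust" if it holds on
# the full set of sources and on every subset with one source removed.
def is_robust(values, conclusion):
    subsets = [values] + [values[:i] + values[i + 1:] for i in range(len(values))]
    return all(conclusion(s) for s in subsets)

trends = [0.15, 0.20, 0.12, 0.18]           # hypothetical trends from four sources
warming = lambda vs: sum(vs) / len(vs) > 0  # the conclusion: mean trend is positive

print(is_robust(trends, warming))  # prints True: no single source drives the result
```

If one source were doing all the work (say, one large positive trend among negatives), removing it would flip the conclusion and the check would return False.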

  30. 380
    Mark says:

    David, #378.

    There are robust outcomes:

    GW is mostly anthropogenically caused.
    It is getting warmer.

    Oddly enough, throughout the denialosphere, there are no robust projections, or even proposed causes, that meet the standard of being workable. I guess this means that you’ll be ignoring the denialists until they turn up with some concrete and robust outcomes themselves.

  31. 381
  32. 382
    Hank Roberts says:

    This could well go under “Start Here”

    John Mashey on how to learn about science

    Category: Global Warming
    Posted on: August 21, 2008 3:03 PM, by Tim Lambert

    Another post on John Mashey’s virtual blog. Everything that follows is from comments posted here by Mashey, lightly edited.

    This long essay grew from a dialog in this thread into something that may be a more general resource than just some answers to Mr Manny.

    I’m sure it’s mentioned, but not sure where.
    Excellent. Should be an e-book or something.

  33. 383
    Mary says:

    In case those of you living in the USA have not noticed, your computers and laptops are powered by coal. Your homes are heated by gas and your electricity is generated by the same. There’s a slim chance that, if our nation pulled in one direction, we could get a decent portion of our electricity from new nuke plants 6-10 years from now if we start building like crazy now. There is no “alternative” power source that will have any impact on providing the juice for your HD TV for 20 years. China commissions a new coal plant every week, and you are not going to overrule the denialists in China to change that. Let’s all just chill and accept that we’re going to pump out CO2 for the next 25-50 years until our economies are ready to transition away from petro as a power source.

  34. 384
    jcbmack says:

    Unfortunately Mark, that is true in light of the current economic scheme of things: the technology exists, and the ability to apply it as well, but it takes money, time, some companies losing out in the process, and quirks in a newly applied system. At least Pickens is doing something.

  35. 385
    jcbmack says:

    Oops, I meant Mary. But keep in mind that the technology, and the ability to engineer it, already exists and has for quite a few years now.

  36. 386
    Hank Roberts says:

    Mary, conservation is the biggest, fastest approach. You’re paraphrasing very pessimistic sources — where are you getting your numbers?

    Most ‘knowledgeable’ sources were shown to have grossly underestimated people’s willingness and readiness to conserve when called on for help.

    This isn’t hard and it saves rather than costs money:

    In related news:

    Governor Schwarzenegger today issued an executive order (S-13-08) requiring state agencies to assess and plan for rising sea levels caused by climate change….

    The nation’s oldest continuously operating sea level gauge, located at Fort Point in San Francisco, logged a seven-inch rise during the last century…. with a lax response to climate change, the Pacific could rise three times that much this century.

    The order goes out three days before Schwarzenegger hosts a Governors’ Climate Summit in Beverly Hills.

  37. 387
    Jim Eager says:

    Re David Wojick @378: “Moreover the paleo record is ambiguous here, given that temperature seems to lead GHG rise.”

    The paleo record is not at all ambiguous: as temperature rose in response to natural increases in insolation more GHGs were released into the atmosphere (mainly CO2, CH4, and H2O), and then those GHGs induced yet more warming, which released yet more GHGs, etc., until equilibrium–and a warmer climate–was reached.

    Today we’re just skipping that first natural warming step by injecting GHGs directly into the atmosphere, which is already inducing warming, which will result in yet more GHGs being released naturally, etc., until equilibrium–and a warmer climate–is reached.
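    The loop described above converges because each round of feedback warming is smaller than the last: with a feedback factor f below one, the total is the geometric series direct/(1 − f). A toy sketch in Python (the 1.0 degree of direct warming and the 0.4 feedback factor are invented illustration numbers, not real climate sensitivities):

```python
# Iterate the feedback loop: each increment of warming triggers
# feedback_factor times as much additional warming, until the
# increments become negligible (equilibrium).
def equilibrium_warming(direct_warming, feedback_factor, tol=1e-9):
    total = 0.0
    increment = direct_warming
    while increment > tol:
        total += increment
        increment *= feedback_factor
    return total

# 1.0 degree of direct warming, feedback factor 0.4:
print(round(equilibrium_warming(1.0, 0.4), 3))  # prints 1.667 (= 1 / (1 - 0.4))
```

    The same code run with a feedback factor at or above 1 would never converge, which is the runaway case; the observed paleo record reaching equilibrium is consistent with a net feedback below that threshold.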

    In fact, we’re already starting to see rising emissions of GHGs from natural sources in the form of CO2 and CH4 from thawing permafrost and methane clathrates, and a reduction in the ability of the ocean to absorb CO2. No ambiguity whatsoever.

    “Nor is theory unequivocal because there is reason to believe that the models lack negative feedbacks found in nature.”

    On the contrary, the known negative feedbacks (clouds, aerosols) are in fact included in the models, although some are not yet modeled as well as they should be. In any case, the theory does not in any way rest on the models.

    “And of course if one or more of the alternative hypotheses currently under investigation turns out to be true then the model results are just wrong, robust or not.”

    And you are sure of this how? If one of the hypotheses currently under investigation should prove to have merit, there is no way to predict in advance how much of a factor it might be, and in any case it would not alter the ability of GHGs to absorb and emit LWR one iota.

    Inconvenient, that.

  38. 388
    Hank Roberts says:

    > sure of this how?

    ReCaptcha: 3/4 pocketbook

  39. 389
    Hank Roberts says:

    Just as an example of the information currently available:

    50 percent reduction scenarios:

    found here:

    Environ. Res. Lett. 3 (October-December 2008) 044002

    What do recent advances in quantifying climate and carbon cycle uncertainties mean for climate policy?

  40. 390
    John Mashey says:

    re: #386 Hank
    re: Schwarzenegger & sea level

    This, of course, has been going on for a while.
    A one-day conference earlier this year brought together representatives of local SF Bay Area governments on the topic, to hear about the science, discuss current and near-term plans, and run through longer-term planning exercises. The latter offered useful experience of the political difficulties that will be encountered, given the amount of infrastructure at sea level and the unsurprising wish of everyone that *their* property be behind any dikes that get built.

  41. 391
    Mark says:

    David, #378.

    And how did the paleo creatures burn a trillion tons of carbon sequestered in fossil fuels? Because WE have done that, so if you want to use past data to refute the current situation, you have to account for any differences.

    Also, the lag there is 600 years. So where is that 600-year-old change in temperature that is “causing” this increase in CO2 concentrations?

    I wonder where that “skeptic” RodB and his pals are, to unload their skepticism on your poor science?

  42. 392
    Hank Roberts says:

    Thank you John. I’ve been watching the landfill in the Bay west of Berkeley and Albany grow and thinking, whether intentional or not, the dikes are already being developed.


  43. 393
    Rod B says:

    Mark, I could probably ask David some probing questions, but your shots are doing just fine…

    Do the quotes imply I’m just a pseudo-skeptic? And what pals???

  44. 394
    Malcolm Tattersall says:

    Can I use a climate model myself? Not so far as I know, though I would like to, because I use a Mac. If you know of any that run on OS X, please include them.
    The Climate Institute has a model for calculating the effects of emission reductions, which may also be of interest:
    - but it also is Windows-only.

    [Response: Macs run on a Unix background, and so ModelE will run. You may need to find/purchase an appropriate F90 compiler. - gavin]

  45. 395
    Jim Cross says:

    There was an invitation given to us to ask questions which might be relevant to add to this FAQ.

    I have seen numerous questions offered.

    Is there a time frame for an updated FAQ?

    [Response: Soon! I am collating the questions that I think are interesting (along with some of the responses already offered), and will have a part II post up soon. Real-world priorities have taken precedence over getting to this. Thanks for your patience! - gavin]

  46. 396
    Hank Roberts says:

    Malcolm, I haven’t tried this but you might look into it:
    EdGCM, or the Educational Global Climate Model

    (Google turned up a thread complaining it worked fine on the Macs but wasn’t behaving on Windows, for this one user; maybe info in there):

  47. 397
    Hank Roberts says:

    This presentation may suggest a way other than a posting_plus_comments format to present FAQ items.

    This one is read-only — a similar layout could link to old and new comment threads as well as to references and sources:

  48. 398
    Ron Taylor says:

    Mark, I find it almost humorous that you describe the efforts of the highly qualified climate scientists on this site as “poor science.” Would you care to explain specifically what you mean by that? Otherwise, please understand that you have just made yourself look very foolish. Hint: what you have said in your post #391 does not even come close.

  49. 399

    Malcolm, if you want to run a climate model at home on your Mac, you will see from ClimatePrediction’s (CPDN’s) applications page that three of the four model types currently on offer should run on it. You’d first need to download the BOINC platform that the models run on from here: Models typically run for between a couple of weeks and three or four months if the computer processes them 24/7.

  50. 400
    Mark says:

    Ron Taylor, the “poor science” is from people like David and others. “It’s the cosmic rays I tells ya!”. Or “It’s been cooling these past eight years!”.

    Please explain how the paleoclimate, in which no fossil fuels were burnt and there was a 600-year lag from temperature rise to CO2 rise, applies to the present day, when we can measure fossil-fuel CO2 in the atmosphere and there is no 600-year-old temperature rise that explains a CO2 change of this size.

    This, was what my post was about.

    That’s the poor science.

    Now, did you read or did your knee get in the way?
