
FAQ on climate models

Filed under: — group @ 3 November 2008 - (Svenska)

We discuss climate models a lot, and from the comments here and in other forums it’s clear that there remains a great deal of confusion about what climate models do and how their results should be interpreted. This post is designed to be a FAQ for climate model questions – of which a few are already given. If you have comments or other questions, ask them as concisely as possible in the comment section and if they are of enough interest, we’ll add them to the post so that we can have a resource for future discussions. (We would ask that you please focus on real questions that have real answers and, as always, avoid rhetorical excesses).

Part II is here.

Quick definitions:

  • GCM – General Circulation Model (sometimes Global Climate Model) which includes the physics of the atmosphere and often the ocean, sea ice and land surface as well.
  • Simulation – a single experiment with a GCM
  • Initial Condition Ensemble – a set of simulations using a single GCM but with slight perturbations in the initial conditions. This is an attempt to average over chaotic behaviour in the weather.
  • Multi-model Ensemble – a set of simulations from multiple models. Surprisingly, an average over these simulations gives a better match to climatological observations than any single model.
  • Model weather – the path that any individual simulation will take has very different individual storms and wave patterns than any other simulation. The model weather is the part of the solution (usually high frequency and small scale) that is uncorrelated with another simulation in the same ensemble.
  • Model climate – the part of the simulation that is robust and is the same in different ensemble members (usually these are long-term averages, statistics, and relationships between variables).
  • Forcings – anything that is imposed from the outside that causes a model’s climate to change.
  • Feedbacks – changes in the model that occur in response to the initial forcing that end up adding to (for positive feedbacks) or damping (negative feedbacks) the initial response. Classic examples are the amplifying ice-albedo feedback, or the damping long-wave radiative feedback.
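The weather/climate distinction in these definitions can be illustrated with a toy chaotic system rather than a real GCM. The sketch below (using the Lorenz-63 equations as a stand-in for "the model"; all numbers are illustrative) runs a small initial condition ensemble: any two members quickly decorrelate (the "model weather"), while their long-term statistics agree closely (the "model climate").

```python
import numpy as np

def deriv(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz-63 equations."""
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def run(x0, n_steps=20000, dt=0.01):
    """Integrate with fourth-order Runge-Kutta; return the x(t) series."""
    s = np.array([x0, 1.0, 1.0])
    xs = np.empty(n_steps)
    for i in range(n_steps):
        k1 = deriv(s)
        k2 = deriv(s + 0.5 * dt * k1)
        k3 = deriv(s + 0.5 * dt * k2)
        k4 = deriv(s + dt * k3)
        s = s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
        xs[i] = s[0]
    return xs

# Initial condition ensemble: tiny perturbations to the starting state.
ensemble = [run(1.0 + 1e-6 * k) for k in range(4)]

# 'Model weather': once chaos has amplified the perturbation, the
# late-time wiggles of any two members are essentially unrelated.
weather_corr = np.corrcoef(ensemble[0][-5000:], ensemble[1][-5000:])[0, 1]

# 'Model climate': long-term statistics are nearly identical across members.
climate_stds = [x[5000:].std() for x in ensemble]
```

The same logic applies to GCM ensembles: individual storms differ, but averaging over members isolates the robust response.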


  • What is the difference between a physics-based model and a statistical model?

    Models in statistics, or in many colloquial uses of the term, often imply a simple relationship that is fitted to some observations: a linear regression of temperature against time, or a sinusoidal fit to the seasonal cycle, for instance. More complicated fits are also possible (neural nets, for instance). These statistical models are very efficient at encapsulating existing information concisely and, as long as things don’t change much, they can provide reasonable predictions of future behaviour. However, they aren’t much good for prediction if you know the underlying system is changing in ways that might affect how your original variables interact.

    Physics-based models, on the other hand, try to capture the real physical cause of any relationship, which is hopefully understood at a deeper level. Since those fundamentals are not likely to change in the future, the expectation of a successful prediction is higher. A classic example is Newton’s second law of motion, F=ma, which can be used in multiple contexts to give highly accurate results completely independently of the data Newton himself had on hand.

    Climate models are fundamentally physics-based, but some of the small scale physics is only known empirically (for instance, the increase of evaporation as the wind increases). Thus statistical fits to the observed data are included in the climate model formulation, but these are only used for process-level parameterisations, not for trends in time.
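As an illustration of a statistical fit used only at the process level, here is a sketch with synthetic data and an invented coefficient: fitting an evaporation-versus-wind-speed relationship that a model could then call as a parameterisation. Nothing here is fitted to any trend in time.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical field observations: evaporation increases with wind speed.
# (Real bulk formulae look like E ~ rho * C_E * U * (q_sat - q_air); this
# toy version fits only the linear dependence on wind speed U.)
U = rng.uniform(1.0, 15.0, 200)                     # wind speed, m/s
true_coeff = 0.12                                   # assumed 'true' slope
E_obs = true_coeff * U + rng.normal(0, 0.05, 200)   # noisy observations

# Least-squares fit through the origin gives the coefficient the model uses.
coeff = np.sum(U * E_obs) / np.sum(U * U)

def evaporation(wind):
    """Process-level parameterisation: evaporation as a function of wind."""
    return coeff * wind
```

The fitted relationship encodes the process; the model's trends in time then emerge from the physics, not from this fit.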

  • Are climate models just a fit to the trend in the global temperature data?

    No. Much of the confusion on this point stems from the misunderstanding addressed above. Model development does not use the trend data in tuning (see below). Instead, modellers work to improve the climatology of the model (the fit to the average conditions) and its intrinsic variability (such as the frequency and amplitude of tropical variability). The resulting model is then used pretty much ‘as is’ in hindcast experiments for the 20th Century.

  • Why are there ‘wiggles’ in the output?

    GCMs perform calculations with timesteps of about 20 to 30 minutes so that they can capture the daily cycle and the progression of weather systems. As with weather forecasting models, the weather in a climate model is chaotic. Starting from a very similar (but not identical) state, a different simulation will ensue – with different weather, different storms, different wind patterns – i.e. different wiggles. In control simulations, there are wiggles at almost all timescales – daily, monthly, yearly, decadally and longer – and modellers need to test very carefully how much of any change that happens after a change in forcing is really associated with that forcing and how much might simply be due to the internal wiggles.
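A toy version of that test: treat red (AR(1)) noise as the model's internal wiggles, build a null distribution of 100-year trends from an unforced "control run", and ask whether the trend in a "forced run" falls outside it. All the numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def ar1(n, phi=0.8, sigma=0.5):
    """Red noise: a crude stand-in for a model's internal 'wiggles'."""
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + rng.normal(0, sigma)
    return x

years = np.arange(100)

# Control run: no forcing, wiggles only.
control = ar1(5000)

# Null distribution of 100-year trends from overlapping control segments.
null_trends = np.array([
    np.polyfit(years, control[i:i + 100], 1)[0] for i in range(0, 4900, 50)
])

# 'Forced' run: the same kind of wiggles plus an imposed 0.05/yr trend.
forced = ar1(100) + 0.05 * years
forced_trend = np.polyfit(years, forced, 1)[0]

# Does the forced trend exceed what internal variability alone produces?
exceeds = forced_trend > np.percentile(null_trends, 97.5)
```

Shrink the imposed trend or shorten the record, and the forced signal sinks back into the wiggles, which is exactly the signal-to-noise problem described above.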

  • What is robust in a climate projection and how can I tell?

    Since every wiggle is not necessarily significant, modellers need to assess how robust particular model results are. They do this by seeing whether the same result is seen in other simulations and with other models, whether it makes physical sense, and whether there is some evidence of similar things in the observational or paleo record. If a result is seen in multiple models and multiple simulations, it is likely to be a robust consequence of the underlying assumptions – or, in other words, it probably isn’t due to any of the relatively arbitrary choices that mark the differences between different models. If the magnitude of the effect makes theoretical sense independent of these kinds of models, then that adds to its credibility, and if in fact this effect matches what is seen in observations, then that adds more. Robust results are therefore those that quantitatively match in all three domains. Examples are the warming of the planet as a function of increasing greenhouse gases, or the change in water vapour with temperature. All models show basically the same behaviour, which is in line with basic theory and observations. Examples of non-robust results are the changes in El Niño as a result of climate forcings, or the impact on hurricanes. In both of these cases, models produce very disparate results, the theory is not yet fully developed, and observations are ambiguous.

  • How have models changed over the years?

    Initially (ca. 1975), GCMs were based purely on atmospheric processes – the winds, radiation, and with simplified clouds. By the mid-1980s, there were simple treatments of the upper ocean and sea ice, and cloud parameterisations started to get slightly more sophisticated. In the 1990s, fully coupled ocean-atmosphere models started to become available. This is when the first Coupled Model Intercomparison Project (CMIP) was started. This has subsequently seen two further iterations, the latest (CMIP3) being the database used in support of much of the model work in the IPCC AR4. Over that time, model simulations have become demonstrably more realistic (Reichler and Kim, 2008) as resolution has increased and parameterisations have become more sophisticated. Nowadays, models also include dynamic sea ice, aerosols and atmospheric chemistry modules. Issues like excessive ‘climate drift’ (the tendency for a coupled model to move away from a state resembling the actual climate), which were problematic in the early days, are now much reduced.

  • What is tuning?

    We are still a long way from being able to simulate the climate with a true first-principles calculation. While many basic aspects of physics can be included (conservation of mass, energy etc.), many need to be approximated for reasons of efficiency or resolution (i.e. the equations of motion need estimates of sub-gridscale turbulent effects, radiative transfer codes approximate the line-by-line calculations using band averaging), and still others are only known empirically (the formula for how fast clouds turn to rain, for instance). With these approximations and empirical formulae, there is often a tunable parameter or two that can be varied in order to improve the match to whatever observations exist. Adjusting these values is described as tuning and falls into two categories. First, there is the tuning of a single formula in order for that formula to best match the observed values of that specific relationship. This happens most frequently when new parameterisations are being developed.

    Secondly, there are tuning parameters that control aspects of the emergent system. Gravity wave drag parameters are not well constrained by data, and so are often tuned to improve the climatology of stratospheric zonal winds. The threshold relative humidity for making clouds is often tuned to get the most realistic cloud cover and global albedo. Surprisingly, there are very few of these (maybe half a dozen) that are used in adjusting the models to match the data. It is important to note that these exercises are done with the mean climate (including the seasonal cycle and some internal variability) – and once set, they are kept fixed for any perturbation experiment.
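A minimal sketch of this second kind of tuning, with an invented one-parameter "cloud scheme": sweep the threshold relative humidity and keep the value whose implied planetary albedo best matches the observed ~0.30. Every number here other than that target is made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy grid of relative humidity values standing in for model grid boxes.
rh = rng.uniform(0.3, 1.0, 10000)

TARGET_ALBEDO = 0.30  # observed planetary albedo we want to match

def planetary_albedo(rh_crit, clear=0.15, cloudy=0.45):
    """Toy diagnostic: boxes at or above the RH threshold are cloudy."""
    cloud_fraction = np.mean(rh >= rh_crit)
    return clear + cloud_fraction * (cloudy - clear)

# Tuning: sweep the threshold and keep the best match to observations.
candidates = np.linspace(0.5, 0.99, 50)
errors = [abs(planetary_albedo(c) - TARGET_ALBEDO) for c in candidates]
rh_crit = candidates[int(np.argmin(errors))]
```

As in the real exercise, the tuning targets the mean state; once `rh_crit` is chosen it would stay fixed in any perturbation experiment.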

  • How are models evaluated?

    The amount of data available for model evaluation is vast, but it falls into a few clear categories. First, there is the climatological average (maybe for each month or season) of key observed fields like temperature, rainfall, winds and clouds. This is the zeroth-order comparison to see whether the model is getting the basics reasonably correct. Next comes the variability in these basic fields – does the model have a realistic North Atlantic Oscillation, or ENSO, or MJO? These are harder to match (and indeed many models do not yet have realistic El Niños). More subtle are comparisons of relationships in the model and in the real world. These are useful for short data records (such as those retrieved by satellite) where there is a lot of weather noise one wouldn’t expect the model to capture. In those cases, looking at the relationship between temperatures and humidity, or cloudiness and aerosols, can give insight into whether the model processes are realistic or not.

    Then there are the tests of climate changes themselves: how does a model respond to the addition of aerosols in the stratosphere such as was seen in the Mt Pinatubo ‘natural experiment’? How does it respond over the whole of the 20th Century, or at the Maunder Minimum, or the mid-Holocene or the Last Glacial Maximum? In each case, there is usually sufficient data available to evaluate how well the model is doing.
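The zeroth-order comparison described above amounts to computing a few summary statistics between a model climatology and an observed one. A sketch with synthetic fields (the "observations" here are an idealised warm-tropics/cold-poles temperature profile, not real data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 'observed' zonal-mean temperature climatology on a coarse grid.
lat = np.linspace(-90, 90, 37)
obs = 30.0 * np.cos(np.radians(lat)) - 10.0

# A 'model' climatology: right pattern, but with a warm bias plus noise.
model = obs + 2.0 + rng.normal(0, 1.0, lat.size)

# Zeroth-order evaluation: mean bias, RMSE and pattern correlation.
bias = np.mean(model - obs)
rmse = np.sqrt(np.mean((model - obs) ** 2))
pattern_corr = np.corrcoef(model, obs)[0, 1]
```

A high pattern correlation alongside a nonzero bias is common in real evaluations: the model captures the structure while missing the absolute level.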

  • Are the models complete? That is, do they contain all the processes we know about?

    No. While models contain a lot of physics, they don’t contain many small-scale processes that more specialised groups (of atmospheric chemists, or coastal oceanographers for instance) might worry about a lot. Mostly this is a question of scale (model grid boxes are too large for the details to be resolved), but sometimes it’s a matter of being uncertain how to include it (for instance, the impact of ocean eddies on tracers).

    Additionally, many important bio-physical-chemical cycles (for the carbon fluxes, aerosols, ozone) are only just starting to be incorporated. Ice sheet and vegetation components are very much still under development.

  • Do models have global warming built in?

    No. If left to run on their own, the models will oscillate around a long-term mean that is the same regardless of what the initial conditions were. Given different drivers, volcanoes or CO2 say, they will warm or cool as a function of the basic physics of aerosols or the greenhouse effect.

  • How do I write a paper that proves that models are wrong?

    Much more simply than you might think since, of course, all models are indeed wrong (though some are useful – George Box). Showing a mismatch between the model output and the observational data is made much easier if you recall the signal-to-noise issue we mentioned above. As you go to smaller spatial and shorter temporal scales, the amount of internal variability increases markedly, and so the number of diagnostics which will be different to the expected values from the models will increase (in both directions of course). So pick a variable, restrict your analysis to a small part of the planet, and calculate some statistic over a short period of time and you’re done. If the models match through some fluke, make the space smaller, and use a shorter time period, and eventually they won’t. Even if models get much better than they are now, this will always work – call it the RealClimate theory of persistence. Now, appropriate statistics can be used to see whether these mismatches are significant and not just the result of chance or cherry-picking, but a surprising number of papers don’t bother to check such things correctly. Getting people outside the, shall we say, more ‘excitable’ parts of the blogosphere to pay any attention is, unfortunately, a lot harder.

  • Can GCMs predict the temperature and precipitation for my home?

    No. There are often large variations in the temperature and precipitation statistics over short distances because the local climatic characteristics are affected by the local geography. The GCMs are designed to describe the most important large-scale features of the climate, such as the energy flow, the circulation, and the temperature in a grid-box volume (through the physical laws of thermodynamics, the dynamics, and the ideal gas law). A typical grid box may have a horizontal area of ~100×100 km², but the size has tended to shrink over the years as computers have increased in speed. The shape of the landscape (the details of mountains, coastlines etc.) used in the models reflects the spatial resolution, hence the model will not have sufficient detail to describe local climate variations associated with local geographical features (e.g. mountains, valleys, lakes, etc.). However, it is possible to use a GCM to derive some information about the local climate through downscaling, since the local climate is affected by both the local geography (a more or less given constant) and the large-scale atmospheric conditions. The results derived through downscaling can then be compared with local climate variables, and can be used for further (and more stringent) assessments of the combined model-downscaling technique. This is, however, still an experimental technique.
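One simple flavour of statistical downscaling is just a regression from the grid-box value to the local variable, trained on an observed period and then applied to model output. A sketch with synthetic data (the station's offset and sensitivity to the grid box are invented):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic example: a GCM grid-box temperature and a co-located station.
# The station sits in a valley, so it is systematically warmer and more
# sensitive than the ~100x100 km grid-box average (assumed values).
gridbox = 10.0 + 5.0 * rng.standard_normal(500)          # large-scale signal
station = 2.0 + 1.3 * gridbox + rng.normal(0, 1.0, 500)  # local response

# Fit the downscaling relation on a 'training' period...
a, b = np.polyfit(gridbox[:400], station[:400], 1)

# ...and apply it to grid-box values from an independent period.
predicted = a * gridbox[400:] + b
rmse = np.sqrt(np.mean((predicted - station[400:]) ** 2))
```

The residual error reflects the local weather noise that no large-scale predictor can capture, which is one reason downscaled projections need their own evaluation.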

  • Can I use a climate model myself?

    Yes! There is a project called EdGCM which has a nice interface and works with Windows and lets you try out a large number of tests. ClimatePrediction.Net has a climate model that runs as a screensaver in a coordinated set of simulations. GISS ModelE is available as a download for Unix-based machines and can be run on a normal desktop. NCAR CCSM is the US community model and is well-documented and freely available.

464 Responses to “FAQ on climate models”

  1. 1
    tamino says:

    You are so right that statistical models “… aren’t much good for predictions if you know the underlying system is changing …” The ability of statistics to interpolate within the range of known behavior is impressive, but extrapolation into unknown territory is very risky business.

  2. 2
    Bud Ward says:

    What are the major differences between climate models and weather models? Strengths and limitations of both?

    [Response: Conceptually they are very similar, but in practice they are used very differently. Weather models use as much data as there is available to start off close to the current situation and then use their knowledge of physics to step forward in time. This has good skill for a few days and some skill for a little longer. Because they are run for short periods of time only, they tend to have much higher resolution and more detailed physics than climate models (but note that the Hadley Centre for instance, uses the same model for climate and weather purposes). Weather models develop in ways that improve the short term predictions – but aren’t necessarily optimal for long term statistics. – gavin]

  3. 3
    Clive van der Spuy says:


    On what basis then do I consider or assess expensive climate policy proposals that are supposedly justified by (putative/speculative) climate impacts on my home?

    I.O.W. if I do not know what will happen in 100 years time around here where I live, how do I know what to do about it? Or whether I even should do something?

  4. 4
    Clive van der Spuy says:

    Question: Can GCMs predict the temperature and precipitation for my home?

    Answer: No

    On what basis then do I consider or assess expensive climate policy proposals that are supposedly justified by (putative/speculative) climate impacts on my home?

    I.O.W. if I do not know what will happen in 100 years time around here where I live, how do I know what to do about it? Or whether I even should do something?

  5. 5
    Clive van der Spuy says:

    Sorry but I am having difficulty copying and pasting from your site – it does not come through on the final post that appears. My posts also do not seem to reflect.

    Post 1 was in response to:

    FAQ: Can GCMs predict the temperature and precipitation for my home?
    Answer: No

  6. 6
    RickA says:

    Very nice FAQ. I have wondered how the inputs are handled as the models step forward into the future. Do they use the length of the day for the solar input? How do they handle future solar variability as an input? What about the distance of the Earth from the Sun – is that an input (or is that too long-term to model in the climate models)? Do the models input the PDO or ENSO, or does that sort of fall out of the models themselves? Thanks in advance for any response.

    [Response: The effect of the orbit is fully taken into account – and indeed variations in the orbit on multi-thousand year timescales are one of the key tests of the model in matching past climate data. Variations in the tropical Pacific arise naturally as part of the model solutions, but getting that variability to match observations is still a huge challenge. – gavin]

  7. 7
    Steve Horstmeyer says:

    As a television meteorologist I frequently encounter the comment that because climate models are using “bad” data they are biased and do not reflect reality, therefore cannot be trusted, and global warming either does not exist or exists and there is no anthropogenic influence.

    My approach is to explain that the models are not dependent on observed data. I also explain that given any set of initial weather conditions (within reason) a good model will eventually reproduce realistic climate patterns.

    To simplify I restrict the initial conditions under consideration to global temperature only. My example usually sets global temperature to a uniform 10C (50F) with all other variables at realistic values. I tell the viewer that given this scenario, if run for a sufficient amount of time (both model time and computational time of course), the model will reproduce realistic global temperature distributions.

    My experience with the curious but uneducated in climate science reinforces the “keep it simple” approach.

    Ten celsius (50F) works well because that temperature seems to be too cool for equatorial regions and too warm for polar regions in the mind of the average viewer.

    I then explain that a climate model, as the computations are made will warm equatorial regions and cool polar regions.

    The point then is that the model is not dependent on the observed data and because it is a representation of the physical processes governing the climate system the model eventually gets earth’s climate to where it is.

    Often this is as far as I can go without getting a blank stare. For those who remain with me, this point is a great jumping off point to discuss how the same thing can be done with greenhouse gas concentrations.

    If initial conditions are a realistic representation of 1960 earth and the only change made is by increasing greenhouse gasses by the amount that can be attributed to human activity the planet warms.

    I am still met with skepticism in some cases but I feel I have at least exposed members of the public to a very basic look at how this all works.

    1. Anyone see a flaw in the reasoning above?
    2. Anyone have suggestions for improving the approach?
    3. Anyone have an idea for taking this approach further?

    Recall the average American is barely functionally literate in science so simple is important.

    Thanks for your time,
    Steve Horstmeyer

  8. 8
    auntiewiggly says:

    “GCM – General Circulation Model (sometimes Global Climate Model) which includes the physics of the atmosphere and often the ocean, sea ice and land surface as well.”

    ‘THE’ physics? Hardly. Your bias is showing. ‘Some’, ‘much’ ‘ or simply ‘physics’ will do.

  9. 9
    auntiewiggly says:

    “Initial Condition Ensemble – a set of simulations using a single GCM but with slight perturbations in the initial conditions.”

    Bad definition. An ‘initial condition ensemble’ is exactly what it says – a collection of initial conditions which may or may not be close. You are in fact trying to define something akin to the microcanonical ensemble of statistical physics – a set of states consistent with the external constraints imposed on the system.

  10. 10

    Here’s another question.

    Is it true that the climate models do not reproduce the abrupt climate changes that happened in the Northern Hemisphere at the start of the Bølling-Allerød interstadial and at the start of the Holocene Epoch?

    If so, how can we be sure that a similar event will not happen in the near future?

    Cheers, Alastair.

  11. 11
  12. 12
    Chris Colose says:

    hey something that works on windows!! Now I have a new toy to keep me busy for a while. Great post.

  13. 13
    Ed says:

    Do any of the GCM models include a negative feedback mechanism as described in the attached link?

    The study found that the cooling effect of the trees is attributed to the release of chemicals that react to form aerosols and convert water vapour into clouds.

    Perhaps the slight warming we’ve seen over the last 100 years is attributable to all the forests that have been cleared all over the world rather than CO2.

  14. 14
    Samson says:

    when I am discussing with climate skeptics, they often refer to the third report of the IPCC (page 774): “In climate research and modelling, we should recognise that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible.”
    I don’t know what to answer. Perhaps you can help me and explain how the quote is to be correctly understood?


    [Response: There are at least two aspects to this question. First, how well do we know the forcing into the future? We can’t do a very good job at estimating the future trajectory of technology for instance, or economic development, and so regardless of how well we understand climate, our ability to predict exactly what will happen is limited. Secondly, we don’t have full information about the current conditions, and so, like for weather forecasts, if there are aspects of climate change that are chaotic, we can’t predict those over the long term. However, it is worth pointing out that the statement does not imply that we can’t know anything about the climate system in the future. We know that if there is a big volcano, the climate will cool – and many aspects of the resulting changes will have been predictable. The same is true for increasing GHGs – the climate will warm. Models can’t tell you exactly what will happen where, but there is a lot they can say. – gavin]

  15. 15
    Chris says:

    My question concerns the interplay of forcings and feedbacks to produce a “climate sensitivity”:

    As I understand it, the warming resulting from a doubling of [CO2] is around 1.1 °C from an increased forcing near 4 W/m². This warming will be amplified by feedbacks (assuming a net positive feedback). If the enhanced atmospheric warming from a CO2-induced temperature rise of 1 °C results in enhanced water vapour that gives an additional warming of say x °C, the overall warming (doubled CO2 + water vapour feedback; leaving out other feedbacks for now) will be something like 1.1 × (1 + x + x² + x³ + …) = 1.1/(1 − x).

    QUESTION: Is there a good estimate for the value of “x”? e.g. “x is greater than 0.2 °C and less than 0.5 °C”? Is that a meaningful question to ask?

    Presumably the water vapour feedback in models is dealt with by determining/estimating/calculating the radiative forcing from water vapour and then making some assumption about the water vapour response to atmospheric warming (e.g. assuming constant relative humidity). If so, then an estimate for “x” above should be an accessible/calculable number (?)

    What I’m trying to get at is some simplistic estimate of the water vapour feedback that results from an enhanced CO2-induced warming of say 1.1 °C from the CO2 RF of around 4 W/m². One of the things that people (particularly from an engineering background) have trouble with is the idea that the feedback from a small amount of warming can give rise to a much larger amount of warming, and this seems, from an “engineering perspective” on the meaning of “feedback”, to result in an uncontrolled “runaway” response.

    My understanding is that many of the feedbacks in relation to atmospheric physics are better considered as “amplifications”, since the response doesn’t necessarily feedback into the input (i.e. the feedbacks resulting from raised [CO2] don’t result in very much in the way of further enhanced CO2 concentrations although this happens a bit and is important during Milankovitch warming where enhanced [CO2] is itself an important feedback). Therefore one can essentially add up the “feedbacks”/amplifications.

    Is that a good way of thinking about climate feedbacks in relation to climate sensitivity? ..anyone know of a better way of explaining this to recalcitrant engineers?
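For what it’s worth, the series in the question can be checked numerically: for |x| < 1 it converges to the familiar 1/(1 − x) amplification rather than running away, which is the point that trips up the engineering intuition. A sketch (the x = 0.5 value is purely illustrative, not an estimate):

```python
# Each degree of direct CO2 warming produces a further x degrees via
# water vapour, which itself produces x*x more, and so on. For |x| < 1
# the series converges to direct/(1 - x): amplified, but finite.

def amplified_warming(direct, x, n_terms=100):
    """Sum the feedback series direct * (1 + x + x^2 + ...)."""
    return direct * sum(x ** k for k in range(n_terms))

direct = 1.1  # no-feedback warming for doubled CO2, deg C

# With the illustrative x = 0.5, the total is 1.1 / (1 - 0.5) = 2.2 deg C.
total = amplified_warming(direct, 0.5)
```

Only for x ≥ 1 would the series diverge into a true runaway; an amplifying feedback with x < 1 simply rescales the response.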

  16. 16
    David B. Benson says:

    Alastair McDonald (8) — Those abrupt warmings are related to the collapse of massive ice sheets:

    and, it seems, not global nor as rapid as in Greenland.

    In any case, there are no massive ice sheets going to rapidly collapse at any time in the readily foreseeable future.

  17. 17
    James C Wilson says:

    Question: How are clouds parametrized in the models? How are the parameterizations evaluated? The IPCC reports considerable uncertainty associated with cloud and aerosol forcings. What is being done to address them?

  18. 18
    Mark says:

    Sampson, #12. The response is “how do you know that any errors are there to make the problem disappear?”.

    I may have an error in measuring out the weight of produce of 5% but this doesn’t mean it only ever goes 5% in my favour.

    Then ask if they would have complained about how these scientists were hiding problems with the model if they didn’t say this statement, whether they knew if there were errors or not.

  19. 19
    Mark says:


    Try doing the maths.

    How much would these clouds cool the atmosphere? Remember that a clear night is colder by far than a cloudy one, and whether a cloud is cooling or warming depends on how high it is: high clouds radiate back out into space, low clouds are just “high ground” as far as warming the air is concerned.

    Then if you can’t be bothered to do the sums and put them here for checking, take as read that no, they don’t make the difference.

  20. 20
    Mark says:

    re #8.

    How do we know that an asteroid won’t hit the earth, splitting it apart and removing the earth from its orbit?

    After all, there’s no model for how this CAN’T happen.

    Check out the phrase “clutching at straws”.

  21. 21

    3. Steve Horstmeyer: Maybe go even simpler: Start with the 19th century laboratory measurements of the absorption of infrared by CO2 at various pressures. Cite the MIT Wavelength Tables, an encyclopedia-sized resource. Tell them about the thermal runaway on Venus. Talk about the measurement of CO2 concentration that has been going on in Hawaii since 30 or 40 years ago. It is really hard to understand how anybody could avoid the idea that the absorption bands of every gas have been cataloged and re-measured about a jillion times. Perhaps all high school students should be required to take 4 years of physics in which they spend about a semester measuring the spectra of CO2 and other things.

    We have known since the 19th century that CO2 absorbs infrared. Oxygen and nitrogen have windows where CO2 does not. That is how we know that it is CO2 and other gasses that are causing global warming.

    We have satellites that are measuring the solar energy coming in and the heat radiation going out. More is coming in than going out, so the earth has to be getting warmer.

    Maybe talk about how much less ice there is in the Arctic Ocean recently. You have to tell them that the “Normal” temperature displayed on your 5 day weather prediction is a running average of only the last 10 years. You have to say this is because the George W. Bush Administration won’t let you use all of the data since 18?? when that data was first recorded in your area. In other words, you have to admit that the W. government requires you to lie to cover up global warming. You have to remind them that it is global not local, a small number of degrees, and an average not an absolute. Then say: “The GCMs use the laboratory data on the absorption of infrared by CO2 and a whole lot more science to try to make climate predictions. The problems are that computers aren’t ‘big’ enough to take everything into account, and the butterfly problem happens.”

    Tell them the following: Global Warming Has Already Happened. In the mid 19th century, the Mississippi river froze over in the winter so you could drive on it at St. Louis. That’s how St Louis became known as the gateway to the west. Now the Mississippi river is ice-free at Davenport, Iowa. If you want to drive on the river, you have to go at least as far North as Minnesota. Cattaraugus County New York [Olean, Little Valley] got 450 inches [37.5 feet] of snow per year in the 1950s and 1960s. Now it gets only 96 inches of snow per year. At Barrow, Alaska, the grave yard washed away because the fast[ened to the land] sea ice melted. We humans have caused 1.3 degrees Fahrenheit of global warming since we started burning coal in 1750.

    In other words, you have to try to teach a little basic science to people who weren’t smart enough to be able to take science courses when they were in high school. In addition to the IQ problem and the amazing ignorance problem, some of them may be coal miners or other coal industry involved people who want to keep on mining coal.

    You could give them a list of URLs, such as:

    Remind them that the coal industry puts up a lot of phony web sites to distract them.
    You could list some books, but my guess is that you are talking to people who have marginal reading skills and no plans to read books.

  22. 22
    Mark says:

    Bud, the differences are:

    Weather. Small scale transients. The result of input operating under chaos.
    Climate. Large scale time invariants. The result of inputs operating under forcings.

    Think of rolling a rugby ball from the top of a hill. There are dips, bumps, hedges, bushes and the occasional dog on the slope.

    Where is it going: Downhill.
    What path will it take: Uh, depends, really.

    Climate=where is it going
    Weather=what path will it take

    Just because you think it *might* hit that bush and go *that* way but with low confidence and none further down the hill doesn’t mean the ball will roll uphill.
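The rolling-ball picture can be sketched numerically as a toy random walk with a downhill drift (everything here is illustrative, not from any real model): each "roll" takes a different path, but the ensemble-mean destination is set by the drift.

```python
import random

def roll(seed, steps=1000, drift=-0.01):
    # One simulated "roll": a steady downhill drift (the forcing)
    # plus chaotic bumps along the way (the weather).
    rng = random.Random(seed)
    x = 0.0
    path = []
    for _ in range(steps):
        x += drift + rng.gauss(0.0, 0.1)
        path.append(x)
    return path

# An initial-condition ensemble of 20 rolls:
ensemble = [roll(seed) for seed in range(20)]
finals = [p[-1] for p in ensemble]
mean_final = sum(finals) / len(finals)

# Individual paths ("weather") differ widely, but the ensemble-mean
# endpoint ("climate") sits near drift * steps = -10.
print(mean_final)
```

Any single path can wander far from the mean; it is only the ensemble statistic that is robust.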

  23.
    Hank Roberts says:

    Anything on paleo climate models, especially the proprietary ones used in petroleum exploration, would be welcome. Tidbit here:

  24.
    Patrick 027 says:

    VERY VERY good post! Thanks!

    Is there any sense of how, how much, if, or which parameters ‘tuned’ for emergent properties in the average climate might change with climate change? Could comparing different months, seasons, and years help in that?

    Could the climate forcing itself, such as increasing GHGs, affect parameterizations independently of the larger scale climate changes (for example, by changing thermal damping of various kinds of waves, or by changing the differences of radiative effects between different amounts and kinds of clouds)?

    Would there be any advantage to having separate models at separate scales, where smaller-scale processes would be modelled at fine resolution, and the larger-scale model would, for each grid cell and time step, search the results of the smaller model based on similarity of input conditions, perhaps interpolating and if necessary using a randomly chosen result based on probability distributions, and on the occasions where the results of the smaller model are too sparse, telling the smaller-scale model to do a new run (as time goes on this would happen less often)?

    Has there been any success with models that would attempt to model climate ‘directly’ via some theoretically or otherwise justified parameterization of all weather?

    To what extent can model results be explained conceptually via cause and effect relationships (for example, the storm tracks move here because the tropopause did this…) and where could I go to find more of that?

  25.
    Bob North says:

    Simple logistics question – With today’s higher speed computers, about how long does it take to complete a single GCM model run simulating 100-150 years?

    [Response: It’s always taken about the same amount of time (roughly a month). As computers get faster, we add in more stuff. – gavin]

  26.
    jcbmack says:

    Well said. The models are increasing in complexity and are at least approaching more true-to-life climate conditions. To echo tamino, statistical methods cannot factor in all the real-time changes, and the signal-to-noise ratio can be a confounding thing indeed when analyzing the raw statistics. Still, besides empirical observations and good inferences, the models are what is available to work with; they serve several purposes quite well and are the basis for improved models in the future.

  27.
    Dietrich Hoecht says:

    I have stumbled across the recorded global temperature increase chart for the last century, with the wiggles smoothed by five-year averaging. The graph shows three things clearly, and it can be divided into roughly three equal parts. The first is a straight-line increase. The second is a temperature plateau, even decreasing slightly. The third is another straight-line increase at the same gradient as the first.

    What caused the plateau? This is a significant matter, since the century is central to the manmade carbon dioxide increase. We all know the hockey-stick curve's steep increase of CO2, and its long residence in the atmosphere. Since the general proclamation is a direct causal, albeit somewhat delayed, rise in global temperature, where does the plateau come from? This 30-plus-year phenomenon is not just an insignificant kink in the straight-line curve; something bent the curve drastically. With all the theories, clamor, and explanations of global temperature rise, I have not found a single reasoned explanation for this plateau.

    Why is this important? Well, aren't there real doomsday numbers being credibly disseminated, of 5 to 10 degrees Celsius increase within the next 100 years? Could we not see another plateau, a stoppage, a real drop-off? Please, can someone give me a simple explanation, and not a "definite maybe" obfuscation?

  28.
    Robert Smart says:

    I presume we are now running models which assume a lot more open water in the Arctic. How do the models then play out? Not obvious to me. Does the reduced albedo of the ocean mean that more energy is absorbed instead of reflected? Or does the increased evaporation cause more snow on the neighboring land and hence more reflection? This question seems important to me because we notice that temperatures rose through the previous interglacial (which was warmer than this one), then crashed: so it seems natural to look for some switch that got flipped. Snow and ice on land can build up, unlike ice on water.

  29.
    Ike Solem says:

    Nice summary post. Re#8, any changes in climate over glacial-interglacial timescales have to take into account an additional component: the biogeochemical cycling of atmospheric gases.

    The climate models as described here won’t produce glacial/interglacial cycles if run for a long time, and that is because they treat the atmospheric content of trace IR-absorbing gases (CO2, methane and N2O) as external forcings. Thus, for a 100-yr climate model run, the year-by-year atmospheric CO2 content is set by the researcher.
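A prescribed CO2 pathway is usually converted to a radiative forcing before it influences the model's energy budget. A minimal sketch, using the standard simplified expression F = 5.35 ln(C/C0) from Myhre et al. (1998); the 1%/yr pathway here is just an illustrative researcher-set scenario:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified radiative forcing (W/m^2) for a CO2 concentration,
    relative to a pre-industrial baseline (Myhre et al. 1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# A researcher-prescribed concentration pathway, e.g. 1%/yr compounding growth:
years = range(100)
co2 = [280.0 * 1.01 ** t for t in years]
forcing = [co2_forcing(c) for c in co2]

# Doubled CO2 gives the familiar value of about 3.7 W/m^2:
print(co2_forcing(560.0))
```

The logarithm is why each successive doubling of CO2 adds roughly the same forcing increment.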

    There are many reasons that the atmospheric gas content can change. Purely physical processes like wind-driven mixing can increase the uptake of CO2 by the oceans, but biological processes also play an important role, as does the temperature difference between the air and the water:

    This kind of biogeochemical analysis is needed to understand the overall role of CO2 and methane in the global climate. It is also needed to evaluate the effectiveness of any carbon trading programs designed to limit CO2 buildup in the atmosphere.

    For example, a biogeochemical model can be used to show that dumping iron in the oceans will have no effect on atmospheric CO2, as any increase in algal growth will be accompanied by increases in remineralization of algal biomass in the water column. The only way to permanently remove the biomass is to bury it beneath sediment.

    You also need a biogeochemical approach to answer questions like “what gases will the melting permafrost add to the atmosphere?” The main control over microbial activity is temperature, so as northern soils and permafrost warm, there should be a flux of CO2 and/or methane to the atmosphere. If it is very wet, that encourages methane formation. The process is entirely analogous to slowly defrosting a freezer full of food – it starts to outgas as microbial decomposition sets in.

    Any estimate of the effect that preserving a forest has on the atmospheric CO2 also must be based on some kind of biogeochemical approach. In northern forests, the average age of leaf litter might be 5 years, but in tropical regions, that might just be six months. The difference is mostly temperature – microbial decomposition never lets up in the tropics. A standing forest, in that respect, is a bit like a living coal field – or a coal field is a fossilized forest. In either case, the carbon accounting is done the same.
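The leaf-litter ages above can be read as first-order turnover times. A toy sketch of that bookkeeping (the turnover values are the illustrative numbers from this comment, not from any particular biogeochemical model):

```python
import math

def litter_remaining(t_years, turnover_years):
    # First-order decay: fraction of litter carbon remaining after t years,
    # where turnover_years = 1/k is the mean age of the litter pool.
    return math.exp(-t_years / turnover_years)

# Illustrative turnover times from the comment above:
boreal_turnover = 5.0    # years (mean litter age ~5 yr)
tropical_turnover = 0.5  # years (~6 months)

# After one year, boreal litter is mostly still there,
# while tropical litter is mostly decomposed:
print(litter_remaining(1.0, boreal_turnover))
print(litter_remaining(1.0, tropical_turnover))
```

Warming shortens the turnover time, which is exactly why soil and permafrost carbon matters for the atmospheric budget.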

    This is all discussed in the IPCC reports, but is usually left out of news reports, which typically fail to discuss the fundamental differences between CO2 emissions from living biomass, deforestation emissions, permafrost emissions, and fossil fuel CO2 emissions. Not all carbon dioxide emissions are the same, not by a long way!

    Fossil fuel carbon last saw the atmosphere millions of years ago, and that’s why burning it leads to a net accumulation of CO2 in the atmosphere. It’s a basic but widely misunderstood point.

  30.
    Jim Cross says:

    How are solar variations represented in the models?

  31.
    Dave Andrews says:

    Guess what? You have described very well the economists'/bankers' mathematical models that predicted everything was rosy in this best of all possible worlds. Trouble is, it turned out to be a house of cards, and there is no evidence that climate models are any better, and even less evidence that they should be used as the basis for policy decisions.

  32.
    Neil McEvoy says:

    Thank you – the FAQ has added to my knowledge considerably.

    Two questions:

    (1) I believe that global mean temperature has increased less quickly (if at all) over the last 10 years or so, compared to the previous two decades. Have any of the models been adjusted in the light of measurements from the last 10 years? Do newer models fit that data better than older ones?

    (2) What proportion of model runs from a multi-model ensemble produce global mean temperatures at or below (on average) the actual measurement for the last 10 years?

  33.
    Mae Frantz says:

    My Climatology professor says GCMs that predict global warming are based on a doubling of CO2 from 350-700 PPM in the next 50-100 years. Is this correct, and if so, why is a doubling assumed when CO2 has increased approximately 100 PPM since pre-industrial times? If someone can point me to a source that explains why 700 PPM is a valid prediction and input, I would be much obliged.

    [Response: Look up the IPCC SRES scenarios to get a handle on what CO2 emissions could plausibly lead to. – gavin]

  34.
    Garry S-J says:

    Samson (#12 3 November 2008 at 12:29 PM):

    You could try quoting the very next sentence to them: “The most we can expect to achieve is the prediction of the probability distribution of the system’s future possible states by the generation of ensembles of model solutions.”

    I’d have thought a probability distribution of the system’s future possible states would be enough to inform rational decision-making, though of course some people will never be satisfied.

  35.
    David B. Benson says:

    Dave Andrews (27) — The important difference is that the climate models are based on physics whilst the economic ones are not.

  36.
    Patrick 027 says:

    23 Clarification:

    “(for example, by changing thermal damping of various kinds of waves, or by changing the differences of radiative effects between different amounts and kinds of clouds)?”

    Regarding the second part, I didn’t mean to imply that that would be parameterized; but it’s an example of what could change. As for the first part, I got the impression that it might in some cases be parameterized, but I’m not sure if there’s much of an effect.

    Other questions:

    Do climate models currently take into account the solar UV -stratosphere-ozone aspect of solar forcing?

    What might change (educated guesses) in model output if the dynamics of the upper atmosphere were better resolved?

  37.
    Craig Allen says:

    Dave Andrews:

    GCMs are able to do hindcasts – i.e. they are able to fairly accurately replicate both recent and ancient past climates and climate shifts.

    I would be fascinated to hear of any economic models that are capable of doing this – e.g by replicating the great depression, the Dutch tulip bubble or the impacts on each national economy of the abolition of slavery.

  38.
    jcbmack says:

    The data ultimately dictates everything, but the models are very useful. No matter what statistics are used, the accuracy is eventually determined, and the models are coming along nicely. Even if one were to use a lower-end estimate, global warming is a major issue and not in great dispute. Even non-peer-reviewed data is useful; it is just that the peer-reviewed data and model designs have been shown to have a lot of relevance.

  39.
    Chris Colose says:

    #14 Chris

    Since climate feedbacks form a converging series, the value of x as you set it up must lie on an interval from zero to one. Though the effect is generally expressed in units of W/m^2/K to express the change in the top-of-atmosphere radiative balance, where the longwave and shortwave components of the water vapor feedback are roughly 1.13 and 0.27, respectively (Soden et al. 2008). The water vapor feedback itself roughly doubles the sensitivity of climate, but note that its magnitude also depends on other feedbacks. Whether or not relative humidity is conserved is going to affect its magnitude, but this is not an assumption as you put it; it is an emergent property of models.
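The converging-series point can be made concrete: if a feedback returns a fraction f (with 0 ≤ f < 1) of each temperature increment, the total amplification is the geometric series 1 + f + f² + … = 1/(1−f). A minimal sketch (f values illustrative):

```python
def feedback_gain(f, terms=200):
    # Sum the converging series 1 + f + f^2 + ... for 0 <= f < 1.
    total, term = 0.0, 1.0
    for _ in range(terms):
        total += term
        term *= f
    return total

# f = 0.5 doubles the no-feedback response, consistent with water vapor
# roughly doubling climate sensitivity:
print(feedback_gain(0.5))     # ≈ 2.0
print(1.0 / (1.0 - 0.5))      # closed form, same answer
```

For f ≥ 1 the series diverges, which is the runaway case; observed feedbacks sum to well below that.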

  40.
    jcbmack says:

    Edward # 20, well said! It is appalling how many websites and papers are put out by company-funded projects in the name of "science."

  41.
    jcbmack says:

    Chris # 14: Airplane engineers deal with forces of great magnitude all the time. For example, wind shear can be quite powerful, and engineers rely upon data and careful calculations to design a plane whose cockpit will not snap off. In this sense, the airplane may be "sensitive" to a force of some magnitude and vector (consider also tail winds, cross winds and head winds). Usually engineers do not work in values of x. The Earth is a natural system with both natural and artificial influences, but inputs can be greatly magnified; if one were to engineer a sound system that amplifies volume (amplitude) and enhances pitch (a function of frequency), does the volume run away? The climate sensitivity of the planet can be greatly increased by positive feedbacks being greater or more abundant than negative ones, as well as slower to develop. Constant relative humidity? Could you expand on that?

  42.
    Craig Allen says:

    PS to Dave Andrews re economic vs climate models:

    Also, James Hansen successfully predicted in 1981 the trend of the past several decades of global warming, including a good approximation of the noise around the trend. And he did so with a climate model that was far less sophisticated than those of today.

    Can you provide an example of any economic model that has been anywhere near as successful?

  43.
    Bob North says:

    Gavin – thanks for your quick response to my question (#24). That is actually a good bit longer than I expected.

  44.
    Lawrence Brown says:

    Since model grid cells are about 100×100 km horizontally, or even somewhat less, a grid cell would contain elements like forested and/or farm areas, cloud cover and urban areas that are smaller than the grid but that affect the climate of the grid cell as a whole. How do models generally treat the entire grid cell to account for these effects?
    My take is that the statistical downscaling described by Rasmus in his post on Regional Climate Projections doesn't apply to entire grid cells.
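One common way land-surface schemes handle sub-grid heterogeneity is a "tile" (mosaic) approach: compute properties or fluxes per surface type, then area-weight them over the cell. A minimal sketch for albedo (the function name, fractions, and albedo values are all illustrative, not from any particular model):

```python
def grid_cell_albedo(tiles):
    """Area-weighted average of a property over sub-grid surface 'tiles'.
    tiles: list of (fraction_of_cell, albedo) pairs; fractions must sum to 1."""
    assert abs(sum(frac for frac, _ in tiles) - 1.0) < 1e-9
    return sum(frac * alb for frac, alb in tiles)

# One ~100x100 km cell containing forest, cropland, and an urban patch:
cell = [(0.6, 0.12), (0.3, 0.20), (0.1, 0.15)]
print(grid_cell_albedo(cell))  # 0.147
```

Clouds are handled differently (via parameterizations of sub-grid cloud fraction), but the weighted-average idea for surface types is the same.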

  45.
    jcbmack says:

    On a short note I have to say this site is the best for information and blogging on GW issues.

  46.
    Bob North says:

    Re: Edward Greisch #21

    Ed – you have a lot of useful information and good links, why mix in some falsehoods and BS? Reported normals on weather forecasts are for 30 year averages, not 10 years, and have been for a long time. Dubya has nothing to do with it and isn’t requiring any weather people to lie to cover up Global warming.

    Although I can’t say whether the Mississippi froze at Saint Louis regularly during the 19th century, I am sure that is not how St. Louis got the nickname of “Gateway to the West”. I don’t know where you got the idea that Cattaraugus County in NY routinely got 450 inches of snow during the 50s and 60s, since the record annual snowfall in NY is 379.5″ at Hooker 12 NNW (in 1979), which is nowhere near Cattaraugus county, but in the snowbelt east of Lake Ontario.

    Just stick to the facts and don’t embellish.

  47.
    Jason says:

    Regarding response to #33.

    Gavin, are you aware of the complete disconnect between SRES estimates of fossil fuel reserves, which are based on a single review paper by Rogner in 1997, and more recent views regarding peak oil, peak gas, and peak coal? Have you read about this stuff, e.g., Dave Rutledge’s or James Hansen’s work?

    Not that I think it matters too much…there’s far too much co2 in the air already, but SRES is a joke in my view. Economic growth forever? Come on.

    [Response: I was simply answering the question. Scenarios are being updated all the time, but the only thing that is clear is that there is plenty enough carbon to take us significantly above where we are now. Oil, coal, shales, hydrates or whatever. Peak oil is not going to save us – and I know that Hansen doesn’t think so. – gavin]

  48.
    William says:

    Re #33

    If CO2 production grows at 2% annually, that is a doubling about every 35 years (by the rule of 70). But take a moment to think about what such exponential doubling really means – it took me a while to get my head around this, and I think most people don’t appreciate the magnitude of such growth – the effect is that, once the growth is well established, every 35-year period results in about as much CO2 as the ENTIRE PREVIOUS HISTORY (since the exponential growth began).
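This claim checks out numerically: a few doublings into exponential growth, each doubling period emits roughly as much as everything before it. A quick sketch (emissions in arbitrary units, starting at 1 unit/yr):

```python
def cumulative(rate, years):
    # Total emitted over `years`, with annual emissions starting at
    # 1 unit/yr and growing by `rate` each year.
    return sum((1 + rate) ** t for t in range(years))

growth = 0.02   # 2% per year
period = 35     # (1.02)**35 is almost exactly 2, so ~35 yr per doubling

# Compare the 5th doubling period with everything emitted before it:
prior = cumulative(growth, 4 * period)
fifth = cumulative(growth, 5 * period) - prior
ratio = fifth / prior
print(ratio)  # just over 1: roughly as much as all previous history combined
```

Early in the growth the ratio is larger (the very first doubling period after the start emits about twice the prior total), and it approaches 1 from above as the doublings accumulate.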

  49.
    jim norvell says:

    How do you model clouds?

  50.
    tharanga says:

    Thank you for the chance to ask some basic questions.

    This has bothered me in the past. You say that the hindcasts are true hindcasts, and that parameters are not fit to match the historical trends. That’s all very well and good.

    So can you help me understand what is happening here?

    All the GCMs do a decent job of the hindcast, yet diverge in the forecast. They’re all using the same scenario here, so the forcings ought to be the same.

    If the GCMs have slightly different physics, why would that show in the future, but not the past?

    [Response: One issue is the inertia in the oceans which means that the 20th century changes are not in equilibrium. Therefore, the changes so far aren’t impacted by the full equilibrium sensitivity. Transient sensitivities are much closer across the models than the equilibrium value. But as the signal gets larger, the models will diverge a little more. – gavin]
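The ocean-inertia point can be illustrated with a toy one-box energy balance model, C dT/dt = F(t) − λT (all parameter values here are illustrative, not from any GCM): two "models" with quite different equilibrium sensitivities (different λ) show much closer transient warming over a century.

```python
def warming(years, feedback, heat_capacity, forcing_rate=0.04):
    """Toy one-box energy balance model: C dT/dt = F(t) - lambda * T,
    with forcing F ramping up linearly at forcing_rate W/m^2 per year."""
    T = 0.0
    for t in range(years):  # 1-year Euler time steps
        F = forcing_rate * t
        T += (F - feedback * T) / heat_capacity
    return T

# Two toy "models" with the same ocean heat capacity but different
# feedback parameters, i.e. different equilibrium sensitivities:
C = 30.0  # W yr m^-2 K^-1, a deep-ocean-ish heat capacity
low = warming(100, feedback=1.2, heat_capacity=C)   # less sensitive model
high = warming(100, feedback=0.8, heat_capacity=C)  # more sensitive model

print(low, high)             # transient warming after a century: fairly close
print(4.0 / 1.2, 4.0 / 0.8)  # equilibrium warming for F = 4 W/m^2: far apart
```

Because the ocean is still absorbing heat, neither toy model has reached its equilibrium response at year 100; as the forcing keeps growing, the two curves separate further, which is the divergence seen in the multi-model forecasts.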