RealClimate

Comments

  1. Ok Gavin, all well and good, and I am sure that this “new archive” will continue to confirm your previous “findings” of Global Warming based on human activities. I put it to you that all of your models may in fact be based on a major flaw: the assumption that the earth is round and not flat. Why don’t you do a complete recalculation based on the flat earth thesis and then we will see. Of course, as a sceptic with lots of opinions and no scientific training, I cannot be expected to do any actual real work on this matter, but I do expect you to turn your life’s work upside down to answer my argument (that is how the game is played, isn’t it?).

    Besides, even if the world warms and the Arctic, Antarctic and Greenland ice caps all melt, it will not be an issue. Like I just told you, the earth is flat and the excess water will just fall off the edge. What stops the existing water from flowing off, I hear you ask? Well, you are the scientist; why don’t you work it out?

    Actually I am continually amazed at the amount of well reasoned detail presented on this site and your seemingly endless patience in explaining and re-explaining the basics of the science.

    Thank you

    Gerald

    Comment by gerald — 4 Feb 2008 @ 12:40 PM

  2. Very clear, except:

    What is WMO?

    [Response: World Meteorological Organization. -rasmus]

    Comment by David B. Benson — 4 Feb 2008 @ 1:00 PM

  3. Thanks, Gavin, for this interesting insight. I’m somewhat intrigued by the performance of the ensemble averages. Has anyone done any sort of jackknifing or bootstrapping analysis that looks at performance eliminating one or more models from the average? Since you don’t really know how things are distributed, this might be the only way to explore the variability (a rough leave-one-out sketch follows below).

    Comment by Ray Ladbury — 4 Feb 2008 @ 2:02 PM
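
A minimal sketch of the leave-one-out idea raised in #3, using invented placeholder numbers rather than any actual CMIP model output or observations:

```python
import numpy as np

# Hypothetical data: each row is one model's global-mean anomaly series and
# 'obs' is a matching observed series. Both are synthetic stand-ins here.
rng = np.random.default_rng(0)
n_models, n_years = 17, 50
obs = np.cumsum(rng.normal(0.02, 0.1, n_years))
models = obs + rng.normal(0.0, 0.15, (n_models, n_years))

def rmse(x, y):
    return np.sqrt(np.mean((x - y) ** 2))

# Jackknife: recompute the multi-model-mean skill with each model left out in turn.
for i in range(n_models):
    subset = np.delete(models, i, axis=0)
    print(f"without model {i:2d}: RMSE = {rmse(subset.mean(axis=0), obs):.3f}")

print(f"all {n_models} models:  RMSE = {rmse(models.mean(axis=0), obs):.3f}")
```

Models whose removal lowers the ensemble-mean RMSE are, by this crude measure, hurting the average; bootstrapping would instead resample the models with replacement many times to put a spread on the ensemble-mean skill.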

  4. Archives are very good, yes. However, as a science communicator / technical illustrator, I find the “Terms of Use” in these model comparison projects to be toxic:

    These data are for use in research projects only. A ‘research project’ is any project carried out by an individual or organized by a university, a scientific institute, or similar organization (private or public) for non-commercial research purposes only. Results based on these data must be submitted for publication in the open literature without any delay linked to commercial objectives.

    I can’t so much as make a Wikipedia plot without violating those terms, since those plots aren’t part of any “research project” intended for academic publication. And I certainly couldn’t use the data to make plots for books, magazines, or other commercial publications.

    Computer climate modeling is (usually) funded by public funds, so it seems strange to create any strong restrictions on the further use of that data.

    You mention:

    the archive will soon be available with no restrictions and hopefully that setup can be maintained for other archives in future

    Does that include relaxing the non-commercial, academic paper oriented requirements?

    [Response: Non-commercial will probably stay. That is because many of the national modelling groups (not in the US though) have mandates to make money as well as do research and would not contribute to these archives if they felt their paid work would be undermined. I imagine that they are worried about someone else downscaling their projections and selling it as regional climate forecasts. However, I see no reason for other restrictions. Open access should definitely allow, nay encourage, more casual use of the data. I will query the organisers and see what is planned. - gavin]

    Comment by Robert A. Rohde — 4 Feb 2008 @ 2:10 PM

  5. I’d like to thank the climate science community in general for the high degree of free access to data. I’m a data analysis junkie, after all, and I could spend several lifetimes just analyzing the data I’ve already saved on my computer.

    It would be lifetimes well spent.

    Comment by tamino — 4 Feb 2008 @ 2:15 PM

  6. Gavin,

    I think it’s a bit of an understatement to say that not everything at the archive is done to everyone’s satisfaction.

    In fact, there’s some mysterious vetting process that comes into play when you register at the archive. If you pass, you’re granted access. If not, you’re invited to apply again. And again. And again.

    An example: I applied for access to the archive last year. My goal was to add this valuable data to an archive that I maintain that makes it relatively easy (I hope) to download and/or visualize geophysical data. I was repeatedly denied access. It took four months and the threat of a formal FOIA request before I was allowed any access at all. And I still only have access to the US model data.

    Any idea when this archive will truly be open?

    [Response: The problem was that some groups were not ok with third parties hosting the data. This was not a problem for any of the US groups though. The fully open access was proposed a couple of months back and no-one (AFAIK) seemed to mind. Thus I would anticipate that it is imminent. If someone who has actual knowledge wants to let me know, they can email or leave a comment here. - gavin]

    Comment by Joe — 4 Feb 2008 @ 4:09 PM

  7. Since you ask for suggestion(s):

    ‘AMIP’ runs are atmospheric model runs that impose the observed sea surface temperature conditions instead of calculating them with an ocean model, optionally using other forcings as well and are particularly useful if it matters that you get the timing and amplitude of El Niño correct in a comparison.

    Wouldn’t it be interesting to run a large ensemble of coupled runs and cherry-pick the ones whose SST’s match observations? You could compare the range of weather patterns with that of both the “un-picked-over” ensemble and the ‘AMIP’ runs.

    Comment by AK — 4 Feb 2008 @ 4:19 PM

  8. David B. Benson @ 2: WMO = World Meteorological Organization, http://www.wmo.ch/

    Comment by Nick Barnes — 4 Feb 2008 @ 7:19 PM

  9. Re. 2 “What is WMO”:

    World Meteorological Organization
    http://www.wmo.ch/pages/about/index_en.html

    “The World Meteorological Organization (WMO) is a specialized agency of the United Nations. It is the UN system’s authoritative voice on the state and behaviour of the Earth’s atmosphere, its interaction with the oceans, the climate it produces and the resulting distribution of water resources.”

    Comment by Gerry Beauregard — 4 Feb 2008 @ 7:41 PM

  10. “The other way to reduce download speeds is to make sure that you only download what is wanted. ”

    I think you meant “the other way to increase download speeds…”
    minor point.

    [Response: true. I've changed it to 'download time'. - gavin]

    Comment by Chad — 4 Feb 2008 @ 8:18 PM

  11. re # 2

    My Google must be dumber than yours. I get “World Meteorological Organization (WMO) Homepage – Organisation …World Meteorological Organization – Official United Nations’ authoritative voice on weather” over and over. I cannot find enough different possible choices to turn it into a good question.

    Comment by Aaron Lewis — 4 Feb 2008 @ 8:20 PM

  12. Have any statisticians experienced in meta-analysis looked at these archives? Meta-analysis, especially using Bayesian approaches, has been used a lot in medical fields.

    The differences in the plausibility of models to me seems to invite a Bayesian approach. I would suggest seeking out some statisticians with the appropriate expertise.

    Comment by Lloyd Flack — 4 Feb 2008 @ 9:00 PM

  13. #2
    World Meteorological Organization

    Comment by Chris Colose — 4 Feb 2008 @ 9:10 PM

  14. Re: #1 (gerald)

    I nominate you for “best comment ever.”

    Comment by tamino — 4 Feb 2008 @ 9:50 PM

  15. This post/thread offers some interesting and helpful insights. A couple of questions: 1) I didn’t fully comprehend “unfunded”. Does this mean that no funds were from IPCC, WGCM, or WMO or other UN-related body and that all funding came from the enterprises employing the scientists and buying the computers? Surely the guys and gals did not work for no pay…??? Question is: who paid the bills?

    2) I too am bothered by the raw averaging process for the meta-ensemble. But is there any other process that could have been clearly better? Also, you imply that everyone ‘lucked out’ when the meta-averaging came out better than any one model (I assume by comparing against historical information…???). But nonetheless, shouldn’t there still be a pile of concern/nervousness/interest? There seems to be at least a small thread that says you have validated the models with a process a little like playing the slots.

    Comment by Rod B — 4 Feb 2008 @ 10:16 PM

  16. Lloyd, search: +climatologist +Bayesian

    See also: http://www.cell2soul.org/issues/article.php?issue_dir=v2/i2&article_num=a18

    The Patient from Hell: How I Worked with My Doctors to Get the Best of Modern Medicine and How You Can Too by Stephen H. Schneider, with Janica Lane

    Da Capo Lifelong Books (2005); 300 pages;
    ISBN: 0738210250
    http://www.patientfromhell.org

    “… applied subjective probability analysis (Bayesian updating) based on knowledge, experience, and intuition when conclusive hard data is lacking; examined historical data to calculate risk, from mild to catastrophic (risk = probability + consequence); and repeatedly determined whether to push for a Type I risk, where one spends the money and acts to prevent a bad outcome despite lack of surety of the symptom, or whether to accept a Type II risk, where one saves the money and doesn’t act, accepting that the consequences may be disastrous after all….”

    Comment by Hank Roberts — 4 Feb 2008 @ 10:49 PM

  17. Re: Gavin’s response #4,

    If there have to be restrictions to appease some national modeling groups, then those restrictions really should be displayed at the dataset level rather than having a blanket policy covering the entire portal. Appeals to the most restrictive preferences are inherently counter-productive. Not to mention that even within modeling groups, some data is treated more freely than others (e.g. the Hadley Centre policy on summary datasets).

    [Response: Agreed. - gavin]

    Comment by Robert A. Rohde — 4 Feb 2008 @ 11:43 PM

  18. gavin:

    Expectations from a meta-ensemble are therefore low. But, and this is a curious thing, it turns out that the meta-ensemble of all the IPCC simulations actually outperforms any single model when compared to the real world. That implies that at least some part of the model differences is in fact random and can be cancelled out.

    Isn’t this just a result, and an indication, of the fact that all models contain flaws, but mostly different flaws for each different model? Then, when you construct a meta-ensemble, you just ‘dilute’ each flaw with all the not-so-flawed other models. No statistics/stochastics/randomness involved.

    Do I miss something?

    [Response: Yes. But that is a statistical effect. The flaws (in some measure) must be statistically independent. - gavin]

    Comment by Martin Vermeer — 5 Feb 2008 @ 2:01 AM
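
A toy illustration of the point in the inline response to #18: if each model shares the same signal but carries its own independent error, averaging cancels a large part of those errors, roughly as 1/sqrt(N). The arrays below are synthetic, purely to show the statistical effect:

```python
import numpy as np

rng = np.random.default_rng(1)
n_models, n_points = 17, 200
signal = np.sin(np.linspace(0, 6, n_points))          # stand-in "truth"
errors = rng.normal(0.0, 0.3, (n_models, n_points))   # independent model flaws
models = signal + errors

def rmse(x):
    return np.sqrt(np.mean((x - signal) ** 2))

print(f"typical single-model RMSE: {np.mean([rmse(m) for m in models]):.3f}")
print(f"multi-model mean RMSE:     {rmse(models.mean(axis=0)):.3f}")
# Shared (correlated) flaws would not cancel this way, which is why the real
# multi-model mean improves on single models only partially.
```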

  19. Many thanks for this, Gavin, very useful.

    I would add to some of the comments above about the importance of considering other statistical approaches and modes of presentation. In my own work, I use metamodels and bootstrapping techniques, and am moving towards adopting some of the Bayesian methods noted above e.g. based on Kennedy and O’Hagan, 2001 (although I don’t myself think they are always the best approach or the most suitable – depends a lot upon the aim of the intended analysis and type of research question).

    One thing I am also working on is the presentation of alternative predicted futures and associated risk, using a modified kind of contingency table. By this method, one takes into account the equifinality inherent in most, if not all, of the types of model commonly used in the geosciences; I would imagine that climate models suffer similar problems, so it would be useful at some point (when I can make some time, dammit!) to have a go at these, too. Hence an archive like this is of great value.

    More of the same!

    Comment by Nick O. — 5 Feb 2008 @ 5:56 AM

  20. Rod B. asks about the issue of underfunding:
    “Surely the guys and gals did not work for no pay…??? Question is: who paid the bills?”

    Actually, that is probably precisely what it means–a lot of unpaid overtime. This is quite common in the sciences. I typically work 60 hours in an average week. During crunch times I work 80. I am not atypical. We rationalize this often by saying our work is our hobby as well as our day job. The fact of the matter is that grants rarely cover all the work that needs to be done to publish papers. The government learned long ago that most scientists are more motivated by a challenging problem than by a paycheck.

    He then asks about the averaging process for the ensemble and whether there might be a better way of doing the average.
    One possibility might be to use weights based on how the models rank according to various information criteria (e.g. AIC, BIC, TIC, etc.). This would allow models of different complexities to be compared and ensure that overfit models were downweighted appropriately (a rough sketch of Akaike weighting follows below). Note that the AIC would (of course) have to be based on the information used to calibrate the model, not how well the model predicted the phenomenon under study. Actually, model averaging has been found to outperform the results of any single model, especially when models are appropriately weighted. What this may be telling you is that no particular model is vastly superior to any other. Also note that such a process is compatible with Lloyd’s suggestion of a Bayesian meta-analysis.

    Comment by Ray Ladbury — 5 Feb 2008 @ 7:51 AM
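
A rough sketch of the Akaike-weighting idea mentioned in #20, following the standard w_i proportional to exp(-delta_i/2) recipe from Burnham and Anderson; the AIC values here are invented for illustration:

```python
import numpy as np

# Hypothetical AIC values for a handful of candidate models (invented numbers).
aic = np.array([102.3, 100.1, 105.7, 101.0])

# Akaike weights: difference from the best (lowest) AIC, then normalise.
delta = aic - aic.min()
weights = np.exp(-0.5 * delta)
weights /= weights.sum()

for i, (a, w) in enumerate(zip(aic, weights)):
    print(f"model {i}: AIC = {a:6.1f}  weight = {w:.3f}")

# A weighted ensemble forecast is then sum(weights[i] * prediction[i]); models
# with many extra parameters pay an AIC penalty and get downweighted.
```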

  21. I don’t believe the IPCC – it estimates that aviation is responsible for around 3.5% of anthropogenic climate change?!
    http://www.climateactionprogramme.org/news/article/reducing_airline_emissions_can_it_be_done/

    Comment by Peter — 5 Feb 2008 @ 8:05 AM

  22. Three things:

    1) Next time round we are planning on a distributed archive, which will allow download of subsets (in space and time). Planning for this is actively underway. It’s not obvious that download speeds will be actively enhanced by the distribution though, because the data volumes this time around are likely to minimise the number of copies which can exist.

    2) Regrettably it is likely that much data in the archive will have more restricted access conditions than the American participants can allow. Gavin’s explanation is correct.

    3) Funding: In general the UN and IPCC can’t spend money, they ask the nation states to do things, which then get funded “locally”. The AR4 archive that Gavin is describing was funded by internal US funds. As I understand it, it was unfunded in the sense that the archive hosts moved money from another task to support the archive. For the next assessment report we hope that a number of other nations will be contributing effort and sharing the load …

    Comment by Bryan — 5 Feb 2008 @ 8:41 AM

  23. Speaking of model meta-analysis – is there a comparison available of the slow response times/lags in the system, among the different models? I was curious, for example, how much further different metrics would change (and how quickly) if magically, all greenhouse gas emissions stopped completely tomorrow. Or, alternatively, if the CO2 concentration instantaneously jumped from 280 to 380 ppmv, how long it would take for metrics to approach a new equilibrium.

    [Response: The simulation that everyone did was to fix concentrations at 2000 levels ('commitment runs'). It takes a few decades to get 80% or so of the way to equilibrium, and much longer for the remaining 20%. Sea level rise (due to thermal expansion) continues for centuries. There was a paper by Meehl et al (2005) that discussed this. - gavin]

    Comment by tharanga — 5 Feb 2008 @ 9:06 AM

  24. Gavin,
    In your response to #23 you talk of the decades and more it takes to reach equilibrium once CO2 concentrations stabilize. I presume this passage of time is related to the “heat in the pipeline” and the lag in heating the oceans. While this is logical (to me) and presumably based on sound science, I don’t see a lag in temperature behavior as depicted by the GISS land and ocean surface temperature record here:

    http://data.giss.nasa.gov/gistemp/graphs/

    The ocean surface temperatures go up and down (at a smaller magnitude) at the same time as the land surface temperatures. I see no evidence of a lag going back to the beginning of the record in 1880. Why is this?

    [Response: In Fig A4 the ocean temperatures are clearly damped compared to Land. There are multiple timescales here though - some are short (seasonal and interannual) which make many short term anomalies line up, but the long time scales come into the problem for the long term trend and they are the ones that come into play for the "in the pipeline" effect. - gavin]

    Comment by B Buckner — 5 Feb 2008 @ 11:02 AM

  25. Peter says: “I don’t beleive IPCC – it estimates that aviation is responsible for around 3.5% of anthropogenic climate change?!”

    And since your “belief” is not based on any evidence or even any facts that you have cited, it is relevant to the discussion exactly…how?

    Comment by Ray Ladbury — 5 Feb 2008 @ 11:06 AM

  26. An excellent initiative! Please keep pressing for maximum access, Gavin.

    Comment by Nick Gotts — 5 Feb 2008 @ 11:16 AM

  27. Is there any way to establish a ranking of the 17 modelling groups, by their success in predicting changes in climate – if that is what they are working on? If their goal is something different, how would success in that endeavor be measured? Thank you.

    [Response: The best you can do so far is assess the skill of the climatology (no trends) - Reichler and Kim have done some work on that (see here). Success in projections will need some more time. - gavin]

    Comment by Dodo — 5 Feb 2008 @ 11:29 AM

  28. Ray, I like your notion of wanting the “exact” information on relevant issues. Speaking towards this, please tell me exactly how the IPCC gets to 2.5-3 degrees C temperature increase from doubled CO2.
    I thank you in advance for your assistance.

    [Response: We've gone over why the 'best guess' climate sensitivity is 3 deg C a dozen times. It's not a secret. - gavin]

    Comment by Gaelan Clark — 5 Feb 2008 @ 11:52 AM

  29. A few more comments:

    1) Funding: I’m not sure how much funding (if any) the archive hosts received for the AR4 simulations. However, they apparently have received about $13.5 million (over five years) for the next round of simulations. I don’t know if any of this will go to the modeling centers to defray the costs of preparing data for archival.

    2) Last time I checked, none of the AR4 data (including US data) were available for public download via ftp or http. Even though the USA data *are* freely available via OPeNDAP, there is no mention of this anywhere at the archive Web site. There’s no reason why there should be any restrictions on access to the US data.

    3) One of the WGCM members told me a few months ago that rumor had it (he wasn’t actually at the meeting) that at least one modeling group (Hadley) was opposed to opening up the AR4 archive.

    4) The rules for who is or is not allowed to access the archive need to be clearer. For instance, who decides whether an applicant is permitted to use the archive? What are the ground rules for making this decision?

    5) It’s really unfortunate that the next archive will be as restricted as the present one. I think the modeling and archive centers have a lot to learn from the open source software community.

    [Response: I can't speak for other modelling groups, but much of the GISS data is available on our own servers. GISS received no additional money over our standard model development grants to do simulations for AR4, and most groups were in the same boat. Discussion of how the next archive will be set up is ongoing, and you should be vocal in sharing your concerns. None of these things appear insuperable. - gavin]

    Comment by Joe — 5 Feb 2008 @ 12:11 PM

  30. The aviation figure does seem high to me — has anyone checked whether it makes sense? What are the respective magnitudes involved? I don’t think you could get that much from the well-mixed greenhouse gases alone produced by jet engines; is there an effect from creating high-altitude cirrus clouds?

    Comment by Barton Paul Levenson — 5 Feb 2008 @ 12:12 PM

  31. Thanks to all who responded regarding WMO. I ought to have guessed the answer. Anyway, the link was of interest.

    Comment by David B. Benson — 5 Feb 2008 @ 12:54 PM

  32. Gaelan, I’m not sure I understand your question. I presume you can read English. You are as capable of going to the summaries as I am. Or did you just want to say something clever and this was the best you could do on short notice?

    Comment by Ray Ladbury — 5 Feb 2008 @ 1:23 PM

  33. Re #21

    A Boeing 747-400ER starts with 63,500 US gal of fuel for a long flight, which is about 50% of the take-off weight. The new superjumbo A380 has a capacity of 83,500 US gal of fuel. The fuel burn rate of a Boeing 737-400 is about 3,000 liters per hour. If you add up all the emissions from all classes of aviation (i.e., private, commercial, government and military), I wouldn’t be surprised if the total is much higher.

    Hydrocarbon fuels will always be used by boats, planes, freight trains and trucks, construction, mining, forestry and agricultural machinery, all military vehicles and mobile weaponry (e.g., tanks), almost all cars and light trucks, diesel-electric generating systems which are used extensively throughout the world (e.g., in small countries and at gold and diamond mines), etc., because these fuels have high energy densities. They are readily prepared from crude oil by fractional distillation and blending, low-energy processes that do not involve the breaking and making of chemical bonds. These fuels are highly portable and can be stored indefinitely in sealed containers or tanks under an inert nitrogen atmosphere. Hydrocarbon fuels are chemically inert (except to oxygen, halogens and certain other highly reactive chemicals) and do not corrode metals or attack rubber and other materials (e.g., gaskets) used for construction of engines. Since gasoline has a flash point of -40 deg C, it can be used in very cold climates.

    Some other heavy hitters that will always use megagobs of fossil fuels are lime and cement kilns, metal smelters (especially steel mills), foundries making engine blocks, pipe, tools, big nuts and bolts, rail car wheels, etc., all factories that manufacture ceramic materials and products (e.g., bricks, blocks, tiles, pottery, dishes, glass sheets and bottles, etc.), and all food preparation (e.g., large bakeries) and processing (e.g., sterilization). Fossil fuels will always be used for residential space and water heating, especially in cold climates.

    The chemical process industries require petroleum feedstocks and use lots of energy for the manufacture of an amazing array of materials such as carpet and cloth fibers, paint, plastics, exotic materials like silicones and Teflon, and so forth.

    I could go on listing all the human activities that will always use fossil fuels because there never will be any suitable and economical substitutes with the requisite physical and chemical properties.

    The bottom line is this: There never ever will be a reduction in the consumption of or the phase out fossil fuels and consequently no reduction in the emission of greenhouse gases.

    Comment by Harold Pierce Jr — 5 Feb 2008 @ 1:33 PM

  34. Re Harold Pierce @ 21: “Hydrocarbons fuels will always be used by boats, planes, freight trains and trucks, construction, mining…”

    Mainline electrified freight railroad mileage was once quite extensive in North America and there is no reason why it could not be again. Of course, there would be no net gain unless the electrical power is produced without burning fossil carbon. Ocean shipping was once entirely wind-powered and there is great potential for at least reducing the amount of fossil carbon fuel use by augmenting with sail power.

    “Some other heavy hitters that will always use megagobs of fossil fuels are lime and cement kilns, metal smelters (especially steel mills), founderies making engine blocks, pipe, tools, big nuts and bolts, rail car wheels, etc…”

    Perhaps you’ve not heard of electric arc furnaces, which have extensively displaced open hearth furnaces in steel making and smelting, with the electricity sometimes even produced by carbon-free hydroelectric plants.

    Never say never, it’ll trip you up every time.

    Comment by Jim Eager — 5 Feb 2008 @ 2:02 PM

  35. Ray (20), Interesting; thanks.

    Comment by Rod B — 5 Feb 2008 @ 2:09 PM

  36. re #33 (Harold Pierce Jr) “The bottom line is this: There never ever will be a reduction in the consumption of or the phase out fossil fuels and consequently no reduction in the emission of greenhouse gases.”

    Pretty big claim, there, Harold, don’t you think? What assumptions are you *really* making? Over what timescales? And what uncertainties are there around your assumptions, I wonder?

    re # 20 (Ray Ladbury) “This would allow models of different complexities to be compared and ensure that overfit models were downweighted appropriately.”

    Ray, would you just clarify for me what you mean here by ‘overfit’? I may be confusing this with ‘overfitted’ i.e. the possibility that the underlying model functionality is more complex than necessary to explain the variance and trends in the system. (It’s a nice problem to have to untangle). I’m also interested in the weighting procedure, and to what extent this applies to functional parameters (process rate coefficients, exponents, thresholds, that sort of thing) as well as models as a whole. Any core refs here, for example?

    Comment by Nick O. — 5 Feb 2008 @ 2:20 PM

  37. RE #33 [Harold Pierce Jr.] “The bottom line is this: There never ever will be a reduction in the consumption of or the phase out fossil fuels and consequently no reduction in the emission of greenhouse gases.”
    Well, you live and learn. I’ve been thinking the Earth (and therefore the supply of fossil fuels) was finite!

    Comment by Nick Gotts — 5 Feb 2008 @ 2:21 PM

  38. No Ray, I intend nothing clever. Indeed, the question is pretty straightforward and requires no inference as to intention. Just an answer as to the exposition of the supposed temperature increase for the doubling of atmospheric CO2. If you don’t have the answer just say so.—I cannot understand that you do not understand.
    Dr. Schmidt has been kind enough to advise that this is a “best guess”, which is ok if we are gambling with our own money. But when the gamble is with the collective pocket books of the entire planet, I posit we need better than a “guess”.—A “guess”, by the way, that is not clearly stated anywhere in the IPCC literature of possible doomsday scenarios that, according to the IPCC, have a very high probability of occurring—–that does not sound like a “guess” to me.
    So, Ray, when you ask others for their “exact” information, why is it so hard for you to reference yours?

    Comment by Gaelan Clark — 5 Feb 2008 @ 2:23 PM

  39. Harold Pierce #33 is right. The six or seven billion people in the world burn stuff every day of their lives. Do we really think we can regulate them all to change their way of life? If the production of CO2 is going to change the climate, then we ought to prepare for it. It makes more sense to get off the tracks than it does trying to stop the train.

    Comment by Steve Case — 5 Feb 2008 @ 2:50 PM

  40. Harold Pierce Jr (33) — Au contraire, visit

    http://biopact.com/

    to discover the biofuel alternatives being developed for all the uses you mention, except aviation, where 50% biofuel appears to be the goal.

    Comment by David B. Benson — 5 Feb 2008 @ 2:54 PM

  41. Harold Pierce writes:

    [[I could go on listing all the human activities that will always use fossil fuels because there never will be any suitable and economical substitutes with the requisite physical and chemical properties.
    The bottom line is this: There never ever will be a reduction in the consumption of or the phase out fossil fuels and consequently no reduction in the emission of greenhouse gases.
    ]]

    You do realize the supply of the stuff is FINITE, right?

    Comment by Barton Paul Levenson — 5 Feb 2008 @ 4:15 PM

  42. Gaelan Clark writes:

    [[Dr. Schmidt has been kind enough to advise that this is a “best guess”, which is ok if we are gambling with our own money. But when the gamble is with the collective pocket books of the entire planet, I posit we need better than a “guess”.—A “guess”, by the way, that is not clearly stated anywhere in the IPCC literature of possible doomsday scenarios that, according to the IPCC, have a very high probability of occurring—–that does not sound like a “guess” to me.]]

    Here are 61 estimates:

    http://members.aol.com/bpl1960/ClimateSensitivity.html

    Comment by Barton Paul Levenson — 5 Feb 2008 @ 4:17 PM

  43. Steve Case writes:

    [[Harold Pierce #33 is right. The six or seven billion people in the world burn stuff every day of their lives. Do we really think we can regulate them all to change their way of life?]]

    Yes. And, more importantly, switch to other sources of energy.

    [[ If the production of CO2 is going to change the climate, then we ought to prepare for it. It makes more sense to get off the tracks than it does trying to stop the train.]]

    Not if the train is five miles away and you have the engineer’s cell phone number.

    Comment by Barton Paul Levenson — 5 Feb 2008 @ 4:18 PM

    Re #39 [Steve Case] “If the production of CO2 is going to change the climate, then we ought to prepare for it. It makes more sense to get off the tracks than it does trying to stop the train.”

    In this case, getting “off the tracks”, means getting off the Earth. How do you propose we go about it?

    Comment by Nick Gotts — 5 Feb 2008 @ 4:24 PM

  45. [I’ve been thinking the Earth (and therefore the supply of fossil fuels) was finite!]

    Well, there is some dispute about that. Fossil fuels may not be the correct term for oil as we know it and it may not be finite.

    http://planetgore.nationalreview.com/post/?q=OTY0NzIzMzQ0YTA1NWJkOWQ1ZmYwNzE2NTU1YjQ2Mjg=

    [Response: This is a typical over-reaction. It is well known that not all hydrocarbons are biogenic (methane on Mars and Titan anyone?). Showing that this can be seen on Earth is a long way from showing that all oil is produced that way (it's not), let alone that it implies that supplies are effectively infinite. - gavin]

    Comment by bert — 5 Feb 2008 @ 4:34 PM

  46. Re #43 Barton,

    If you have got the engineer’s cell phone number could you please let me have it so I can ring him and get him to stop?

    Of course if that is the number of the White House then I have to warn you that it does not seem to work. They tell me there is a speed fiend in the cab who will stop for no one :-(

    Comment by Alastair McDonald — 5 Feb 2008 @ 4:45 PM

  47. Is there such a thing as a point of diminishing returns in using meta-ensembles – a kind of Tower of Babel effect?
    Will we go from storage needs of hundreds of terabytes to quadrillions of bytes and higher? This feels like it could lead to complex logistical problems.

    Comment by Lawrence Brown — 5 Feb 2008 @ 4:48 PM

  48. B Buckner (#24) wrote:

    The ocean surface temperatures go up and down (at a smaller magnitude) at the same time as the land surface temperatures. I see no evidence of a lag going back to the beginning of the record in 1880. Why is this?

    Ocean surface temperatures go up and down at a smaller magnitude, in sync with land in the short run, due to the oceans’ greater thermal inertia. They are being warmed or cooled at the same time, but given a general, long-term warming trend they will take longer to warm up to the point that the earth’s system achieves radiation balance — with their warming pushing that balance a little further away, because increased water vapor pressure coming off their surfaces further enhances the greenhouse effect via the water vapor content of the atmosphere.

    But the thermal inertia is just part of the problem. There is also ocean circulation, primarily the thermohaline circulation but also ocean waves, which redistributes the heat to deeper waters. In the case of the thermohaline circulation, it takes an individual molecule on average about 3,500 years to make a complete circuit. It will take a great deal of time for the ocean to achieve the eventual quasi-equilibrium heat distribution – which is why the last 20% that Gavin mentions in his inline response takes so long.

    Comment by Timothy Chase — 5 Feb 2008 @ 5:03 PM

  49. gavin,

    I thought I would let you know that the ice coverage for the Arctic Ocean is back to normal. I knew you were concerned about it last month, but now we can all sleep soundly, knowing the polar bears will last a few more months, at least.

    http://arctic.atmos.uiuc.edu/cryosphere/

    [Response: Only if your definition of normal is 1 million km2 less than climatology. - gavin]

    Comment by Russell — 5 Feb 2008 @ 5:22 PM

  50. Re: #41

    You do realize the supply of the stuff is FINITE, right?

    And if the “stuff” weren’t, then the oxygen to burn it is. Even ignoring for the sake of argument our own breathing needs.

    (Reductio ad absurdum… the Earth will become uninhabitable in any one of a broad palette of ugly ways, long before even getting close to this limit.)

    Comment by Martin Vermeer — 5 Feb 2008 @ 5:33 PM

  51. Re Steve Case @ 39 “If the production of CO2 is going to change the climate, then we ought to prepare for it. It makes more sense to get off the tracks then it does trying to stop the train.”

    You “it’s inevitable and we can adapt” guys keep forgetting that a warmer climate is only one result of increased CO2. It will also result in a more acidic ocean and a greatly altered marine food chain, so “adapting” also means that part of the Human population that relies on seafood for its protein source will have to find something else or do without.

    Comment by Jim Eager — 5 Feb 2008 @ 6:48 PM

  52. Gaelan, I will take the best guess of the climate scientists–since it is based on evidence–over, say, yours, which is not. The fact of the matter is that if you have a climate sensitivity much less than 3 degrees per doubling, a lot of paleoclimate and other results (e.g. effects of volcanoes, the “cooling” of the mid 40s to mid 70s) become quite difficult to explain. Perhaps you are unfamiliar with the way a scientist “guesses”.

    Comment by Ray Ladbury — 5 Feb 2008 @ 8:47 PM

  53. Regarding train analogies, I suggest that the old movie “Runaway Train” might be appropriately cautionary.

    But, given this is about archiving, I applaud those who are doing it: it’s hard work, and is a (mostly) thankless task.

    Comment by John Mashey — 5 Feb 2008 @ 8:50 PM

  54. Trying not to be too off-topic, but I’ve been wondering this for awhile and this most recent effort by Gavin includes some potentially relevant material:
    “that is, the weather pattern on Jan 31, 1967 in one realisation will be uncorrelated to the weather pattern on Jan 31, 1967 in another realisation, even though each run has the same climate forcing …. … there is certainly a degree of unforced variability even at decadal scales (and possibly longer).”
    This makes sense to me. But relating it to a nagging question about initial conditions, etc, after we obtain enough observations subsequent to Jan 31, 1967, is it possible to infer the initial conditions? That is, is it possible to get enough historical data such that the unforced variability can be incorporated and all models of future climate change can have the same starting point?
    Now you know why I haven’t asked this question before. Perhaps I’ll find a way to ask it intelligibly in the future.

    Comment by Steve L — 5 Feb 2008 @ 8:55 PM

  55. Hi Nick, Overfitted is probably a better way to say it as “overfit” conjures a picture of a model on steroids.

    The weighting procedure works for “statistical models” more than dynamical models, but you can infer parameters in dynamical models. Here’s an interesting reference:
    http://www2.fmg.uva.nl/modelselection/presentations/AWMS2004-Burnham-paper.pdf

    I’ve been looking at some of this stuff in assessing reliability of microelectronics in high-radiation environments of space.

    Comment by Ray Ladbury — 5 Feb 2008 @ 9:00 PM

  56. Re: #49

    Russell says: “… ice coverage for the Arctic Ocean is back to normal. …”

    Russell, the ice area has obviously increased during the winter, but it’s still 1 million square kilometers lower than the 1979-2000 average. And, more importantly, the new ice is much thinner than the old ice (the thick ice that had been there for many years), so that next summer the melting will be faster and deeper. We are nowhere near back to normal, even if the area had gone back to the 1979-2000 average, and you shouldn’t feel so good about it. Even if we had an average area of very thin ice, that would still not be good enough, because this young ice would melt very quickly next summer. You can only feel good about the polar bears if we go back to 79-00 average coverage for many years in a row, to guarantee that the thick, multi-year ice covers the same area it did before.

    Comment by Rafael Gomez-Sjoberg — 5 Feb 2008 @ 9:37 PM

  57. I thought I would let you know that the ice coverage for the Arctic Ocean is back to normal. I knew you were concerned about it last month, but now we can all sleep soundly, knowing the polar bears will last a few more months, at least.

    Someone else has discovered winter. It’s amazing how many people were unaware of it until this year. Even more amazing is how so many denialists seem convinced that CLIMATE SCIENTISTS are unaware of winter.

    Comment by dhogaza — 5 Feb 2008 @ 10:06 PM

  58. #49, It is not “normal”; one word demonstrates this: volume. Another point is plainly visible to anyone with remote sensing knowledge: there are more leads amongst a vastly wider area of first-year ice. First-year ice is vulnerable to re-melting, much more so than ice twice or thrice as thick. The big question is whether under-ice biological and inorganic gaseous feedbacks are unleashed right now, with little effect in darkness, or whether there will be a massive release this spring when sunshine triggers photochemical cloud and fog coverage, reducing the impact of sunlight. The latter scenario will be the only saving grace from a greater melt.

    High Arctic sunrise from the long night just happened, in very cold surface air, but aloft there is a thick layer of warm air, giving a weak refraction boost over the ice of Barrow Strait. It’s too early to say how strong and pervasive this warming is, but similar weak refraction starts occurred in 2005 and 2007.

    Comment by wayne davidson — 5 Feb 2008 @ 10:12 PM

  59. Gavin,
    Thank you; Meehl, et al (Science 307, 1769) gives exactly the comparison of transient responses I was looking for.

    Along similar lines, could somebody refer me to a good paper on the observed radiation imbalance? I’m seeing that it’s on the order of 1 W/m^2; this is relatively small. Can it be measured with confidence?

    Comment by tharanga — 5 Feb 2008 @ 11:50 PM

  60. wayne, how does it look from your high vantage point? The recent satellite photos of ice breaking up in the Beaufort sea were quite unsettling.

    Comment by Nick Barnes — 6 Feb 2008 @ 5:07 AM

  61. Re #49

    I thought that climate models predicted the loss of summer sea ice and not winter sea ice for many years to come. Any loss of winter sea ice would be worrying; after all, it is dark for 3 to 6 months, so ice is bound to form in open water.

    Comment by pete best — 6 Feb 2008 @ 5:13 AM

  62. Re #57

    Careful where you point that sarcasm! The recent improvement in the NH sea ice anomaly is not just typical winter re-growth, because the 1979-2000 baseline for the anomaly includes seasonal variation – it’s not just a constant value. That’s why, if you look at the “tale of the tape” plot, you don’t see the huge seasonal cycle of ~10 million square kilometers.

    (That said, the arguments about thin ice being at risk next summer seem quite plausible to me. I’ve noticed river basins can appear to have completely frozen over in a 24 hr span, but that ice can be lost a lot faster than cover that’s a result of many days or weeks of cold.)

    Comment by David Warkentin — 6 Feb 2008 @ 8:04 AM

  63. RE#49
    And another comment on ice extent in winter. Of course the extent is back to ‘normal’. There is another reason why the anomaly in winter extent is less than in summer: the extent is limited by land. Without Siberia and Canada, maybe the extent would have been 19 mln km2 in the old days, and now 15 mln in a warmer climate. But due to the land mass these extents are cut off to less than 14. Have a look at the ‘regional graphs’ at Cryosphere Today.

    Comment by Hans Kiesewetter — 6 Feb 2008 @ 8:27 AM

  64. #61 Pete Best,

    “I thought that climate models predicted the loss of summer sea ice and not winter sea ice for many years to come.”

    You’re right. Michael Winton of GFDL has done an interesting paper on the stability of a seasonal Arctic ice cap in models. From his homepage: http://www.gfdl.noaa.gov/~mw/ it’s the pdf under “New Publication” – Sea ice-albedo feedback and nonlinear Arctic climate change.

    As an aside…
    Cecilia Bitz has some conference slides here: http://dels.nas.edu/basc/CRC0507/CRC%20Presentations%2011-07/Bitz_NRC_Nov07_corr%5B1%5D.pdf
    Bearing in mind I’m working from the pdf – I have not heard the lecture…

    Bitz argues that next year’s minimum will be around the established linear trend for 1979-2006 (page 12); in other words, last year (2007) will not be the start of a new trend.

    On page 11, she shows a negative correlation for a 1-year lag.
    What’s being done here is that the data series of detrended September extent minima is correlated (checked for matches) against itself, with a lag applied. So where there’s no lag (lag=0) you get a correlation of 1, because you’re correlating the sequence with itself, but for a 1-year lag you get almost -0.2, i.e. essentially no relationship (a small sketch of that detrend-and-correlate step follows after this comment).
    This can be seen on page 9, where successive record minima (circled in blue) since 1979 have always had more than 1 year between them.

    Dr Bitz concludes that:
    1) “PDF of anomalies becomes wider as ice thins, so expect bigger anomalies in future.”
    2) “Zero one-lag autocorrelation, so no reason to expect big anomaly in 2008 (in my opinion)”
    Which seems reasonable given her argument.

    However, as I’m looking at perennial ice as a damping factor on ice-albedo feedback, I find myself wondering if the perennial ice is now so reduced that such reference to past behaviour is no guide to the future. The QuikScat images seem to show a significant reduction in thick ice between this year and the previous two, e.g. bottom of this page, 4 Jan 2006, 2007 & 2008. http://ice-glaces.ec.gc.ca/App/WsvPageDsp.cfm?id=11892&Lang=eng
    If I had to bet, I’d go for a summer minimum below 5 million km^2; either way, we’ll know by October 2008.

    That said,
    Cecilia Bitz is an accomplished physicist, expert in modelling in the polar environment.
    I’m just a bloke with a degree in electronics…
    ;)

    Comment by Cobblyworlds — 6 Feb 2008 @ 8:55 AM
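
A small sketch of the detrend-and-correlate step described in #64 (the lag-1 autocorrelation of September minima); the extent values below are invented placeholders, not the NSIDC record:

```python
import numpy as np

# Placeholder September sea-ice extent minima in million km^2 (synthetic).
years = np.arange(1979, 2008)
rng = np.random.default_rng(2)
extent = 7.5 - 0.05 * (years - 1979) + rng.normal(0, 0.4, years.size)

# Remove the linear trend, then correlate the residuals with themselves at lag 1.
trend = np.polyval(np.polyfit(years, extent, 1), years)
resid = extent - trend
lag1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
print(f"detrended lag-1 autocorrelation: {lag1:+.2f}")
# A value near zero says last year's anomaly tells you little about next year's.
```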

  65. “I thought that climate models predicted the loss of summer sea ice and not winter sea ice for many years to come. Any loss of winter sea ice would be worrying; after all, it is dark for 3 to 6 months, so ice is bound to form in open water. …”

    It’s called winter for a reason. I guess I’m not getting it, but pointing out how much ice reforms in the Arctic in the wintertime seems just plain silly.

    Want a man-sized bet? Bet on whether perennial ice recovers or declines.

    Comment by JCH — 6 Feb 2008 @ 9:35 AM

  66. In the subject post, Gavin states:

    “……… So any comparison of climate models and data needs to estimate the amount of change that is due to the weather and the amount related to the forcing. In the real world, that is difficult because there is certainly a degree of unforced variability even at decadal scales (and possibly longer).”

    Should the daily, seasonal, or annual weather be divorced from the forcing in this way? After all, the daily, monthly and annual averages and extremes of weather data, like temperature, precipitation, and ice and snow accrual or loss, taken over many tens of thousands of days, are what climate is composed of.
    Many of the weather events we’ve been experiencing, especially in the last few decades, have likely been so intertwined with forcing that it appears it’s not only difficult to separate natural variability from forcing but something beyond our present grasp.

    Comment by Lawrence Brown — 6 Feb 2008 @ 9:58 AM

  67. re: #57 & #58

    My understanding of an average is that some examples will be less and some more. If an element is less than average, yet still well within the bounds of “normal”, it is not a cause for concern.
    My point to gavin was that we have a poor understanding of what constitutes an “abnormal situation” and what constitutes “less than average, yet still within historical norms”. If we had 5000 years of arctic sea ice records, it would be pretty obvious. We have about 40 years, with no records pre-AGW.
    A reduction in the average over time is more where the truth lies, but that will require some patience. Something in very short supply.

    Comment by Russell — 6 Feb 2008 @ 11:03 AM

  68. Re: Sea Ice

    In fact we do have data on sea ice prior to the satellite era; it supports the hypothesis that sea ice has been in serious decline in the northern hemisphere, and in decline in the southern hemisphere as well. I posted on the topic here.

    Comment by tamino — 6 Feb 2008 @ 11:53 AM

  69. Sorry Gavin,

    Multitasking day at the office. Could you expand on your reply to the doubting Thomas? Thanks.

    Re: Only if your definition of normal is 1 million km2 less than climatology.

    I’m thinking of coining a few more bon mots to Chris Booker.

    http://www.telegraph.co.uk/news/main.jhtml?xml=/news/2008/02/03/nbook103.xml&posted=true&_requestid=365933

    Best regards

    Mike

    Comment by Mike Donald — 6 Feb 2008 @ 12:26 PM

  70. Tharanga–

    I think the information on radiative forcings you want is here: http://data.giss.nasa.gov/modelforce/

    I used those, modified to include monthly data on stratospheric aerosols after volcanic eruptions, to fit a simple zero-dimensional, transient energy balance model to data. I get plots like the one shown here.

    It’s not entirely clear what my fits mean since the zero-dimensional model is a huge simplification. (It’s the same model Schwartz 2007 used in his paper.)

    Still, assuming something can be learned from this fit (and I don’t find any blunders), my current results suggest the climate time constant, based on temperature measurements at the planet’s surface, is about 8-10 years. (No error analysis done yet. I’m also doing some consistency checks.)

    If correct, this would suggest that if we stopped adding CO2 or other GHGs to the planet now, and there were no major volcanic eruptions or other dramatic events, the surface would reach 80% of the equilibrium value in roughly 16 years (the arithmetic is sketched below). (I can’t remember off hand how high it climbs; it’s enough that we’d notice.)

    Comment by lucia — 6 Feb 2008 @ 12:47 PM
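
A back-of-envelope check on the numbers in #70, assuming the same one-box picture lucia describes: after a step forcing, a single-time-constant system approaches equilibrium as 1 - exp(-t/tau). The tau values are simply the 8-10 year range quoted above, not a fit to any data:

```python
import numpy as np

# One-box relaxation toward equilibrium after a step forcing.
for tau in (8.0, 10.0):
    t80 = -tau * np.log(0.2)          # time to reach 80% of equilibrium
    f20 = 1.0 - np.exp(-20.0 / tau)   # fraction of equilibrium reached at 20 yr
    print(f"tau = {tau:4.1f} yr: 80% after {t80:4.1f} yr, {100 * f20:.0f}% after 20 yr")
```

With tau = 10 yr the 80% mark comes at about 16 years, matching the figure in the comment; the slower multi-century tail in the coupled models (see the inline response to #23) is what a single-box fit cannot capture.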

  71. #67 Your understanding is flat wrong; what is there not to comprehend? Less ice volume means more open water in the summer. What is normal was dictated well beyond modern surveying days: the Bowhead whales of the Atlantic and Pacific are genetically distinct, meaning that the ice barrier between the North Atlantic and Pacific spans multiple thousands of years. A shorter time span brings us back to the 16th century, when the first known attempts to find the North East and North West passages were made, none successful until a very smart Norwegian, Amundsen, slowly found the Northwest Passage in the early 1900s, after two years fraught with danger and boredom. Nowadays the same path can be done in a couple of weeks.

    Comment by wayne Davidson — 6 Feb 2008 @ 2:24 PM

  72. #69 Mike Donald,

    If you’ll forgive me for butting in to take the load off someone busier than I…

    1) From Cryosphere Today: http://arctic.atmos.uiuc.edu/cryosphere/IMAGES/current.area.jpg
    Looking at that graph shows a preponderance around 14 million km^2, with a sag in winter maximum to around 13 million km^2 since 2003, i.e. 1 million km^2 below the previously typical extent.

    2) Seasonal trends are here: http://arctic.atmos.uiuc.edu/cryosphere/IMAGES/seasonal.extent.1900-2007.jpg

    Chris Booker….
    After the day I’ve had at work I needed a good giggle, thanks for that link Mike. :)

    Cryosphere Today Homepage: http://arctic.atmos.uiuc.edu/cryosphere/ source of the other links in this post.

    Comment by CobblyWorlds — 6 Feb 2008 @ 3:33 PM

  73. In 62 David writes “(That said, the arguments about thin ice being at risk next summer seem quite plausible to me. I’ve noticed river basins can appear to have completely frozen over in a 24 hr span, but that ice can be lost a lot faster than cover that’s a result of many days or weeks of cold.)” This has nothing to do with climate change, but may be of interest. If I have interpreted what David means, let me offer an explanation. When fresh water ice freezes, the ice crystals are vertical. When the ice thaws, some of these vertical crystals tend to stay intact. If you examine fresh water lake ice just before break up, the ice can be up to a foot thick, and at a distance appears to be solid. However, on close examination the ice is “rotten”; a whole series of vertical crystals with nothing much holding them together. In Canada, the lake where I have my cottage typically is completely ice covered one day. Then for some reason a small break in the ice occurs. The wind unsettles the water, and in a matter of a few hours, the entire lake is ice free, with a few chips of ice around the shore. The very rapid break up of fresh water ice in the spring is a very well known phenomenon. HTH.

    Comment by Jim Cripwell — 6 Feb 2008 @ 4:27 PM

  74. Re 56. Is there any reference for the assertion that “new ice” melts faster than old? One could also think that, as old ice is dirtier and more porous than new, it would melt faster. (As we know, the old ice is on top, so it is first exposed to the sun.)

    It is also possible that the age of the ice is not as important as the density of the ice, which is largely determined by the speed of freezing and wind conditions during freezing. But of course, the decisive factor is total thickness: the thicker the cover, the better it withstands summer.

    But I may be quite wrong. Could somebody from the RC group tell us something about how the models handle these variables?

    Comment by Dodo — 6 Feb 2008 @ 4:27 PM

  75. Re #41 There are about 15 trillion barrels of oil equivalent in heavy and extra heavy crude oils, oil shale, and tar sands. Coal can be readily converted to liquid hydrocarbons. South Africa gets about 40% of its liquid hydrocarbons from coal. Google SASOL.

    World supply of oxygen: Over every square foot there is a column of oxygen that weighs about 440 lbs. There is additional oxygen dissolved in water. We will never ever run out of oxygen, which by the way is a renewable resource.

    To all skeptics and deniers of fossil fuel facts and reality, I stand by my post (#33).

    Comment by Harold Pierce Jr — 6 Feb 2008 @ 5:27 PM

  76. Re Gavin’s comment in 23
    I consider SLR from thermal expansion to be de minimis. We can engineer our way around a meter-per-century SLR. It would not be cheap or pretty, but it is doable.

    The real questions are about sea level rise from permafrost melt and ice sheet dynamics. How fast and how soon could these occur?

    A seasonally ice-free Arctic Ocean can be expected to provide more latent heat for rain in permafrost areas, which could accelerate permafrost melt. Moreover, other Arctic surface water could drain to the ocean through pseudo karst formed by melting permafrost. In a warming Arctic, this could be a rapid process. And, as late as last November, broad swaths of Greenland received rain. That astonished (and worried) me much more than the sea ice melt last summer. I have done a bit of ice climbing, and our rule of thumb was always, “Ice that gets rained on falls apart.”

    I am starting to see Arctic storms that sit on boundaries between (warm) open water and ice, and I conclude that these storms are driven by the temperature differential between that water and the ice. This provides an effective (but local) mechanism for the transfer of heat from the ocean to the ice. It is good physics, but I do not see it in the GCMs. Then, that heat can be rapidly advected into the ice. Given the complex phase transitions of ice under high pressure as it approaches 273 K, there is the potential for rapid, progressive, structural failure of big ice by mechanisms not contemplated in the GCMs.

    In terms of percentage of volume, how much of the Arctic sea ice has melted in the last decade? Show me a calculation that demonstrates that a combination of unusual weather and changing ocean currents (as in the recent Arctic) could not produce a similar percentage volume change in the WAIS in the next decade. Only a fraction of a degree of warming would be required to move from “stable” to melting. A hint of the current situation is at http://www.osdpd.noaa.gov/PSB/EPS/SST/data/anomnight.2.4.2008.gif, which shows water in the area warmer than it was in 1979-2000. Such a change in WAIS volume would change sea level.

    I worry when I see models (e.g. University of Maine) that suggest that much of the base of the WAIS is at 0 C. I know that under pressure, ice melts at less than 0 C. What is required to raise the temperature of some fault plane in the ice some fraction of a degree and have a chunk of ice crumble and flow into the sea? At some point, as the ice warms, the potential energy of the ice itself will provide the activation energy. It will be sudden. Will it be a surprise?
    ( http://nsidc.org/iceshelves/larsenb2002/animation.html )

    Gavin, would you swear by your FORTRAN manual that these issues have been investigated, and that a progressive collapse of any ice sheet could not occur in the next 30 years? Would you like to show me a good model of “pseudo karst formation in permafrost structures” in your Archive? Would you say that NOW we have a much better understanding of ice sheet dynamics than we had of Arctic sea ice melting in the fall of 2004? All known factors considered, I do not think we have a good estimate of how fast sea level change is likely to occur. The fact that the literature on the topic is de minimis does not give me confidence. I would love for you to remind me what an ignorant amateur I am by pointing me to the literature. Come on Hank, tell me my Google is brain dead, and that there is in fact a big pile of recent literature on dynamic ice sheet modeling that I am just not finding. Please tell me that I am misreading ( http://ralph.swan.ac.uk/glaciology/adrian/luckman_GRL_feb06.pdf )

    “The period of continued warming and thinning appears to have primed these glaciers for a step-change in dynamics not included in current models. We should expect further Greenland outlet glaciers to follow suit.”

    (And, that does not even allow for the kind of serious rain that Greenland has seen in the last couple of years. My Google still finds a few weather reports.)

    In short, unless the summer rains in Greenland cease promptly, and the Arctic Ocean returns to its old levels of sea ice, and the waters around Antarctica cool rapidly, we must consider the possibility of rapid (more than 10 cm/decade) sea level rise in the near future (30 years). Frankly, I expect very rapid SLR, and sooner rather than later. Absence of prediction in the literature does not mean it cannot happen.

    Comment by Aaron Lewis — 6 Feb 2008 @ 5:51 PM

  77. Aaron, the modelers say, quite credibly, that they can’t model this area without actual physical observations, and the International Polar Year scientists are out there on the ice and oceans right now, frantically busy collecting observations.

    The old assumption that nothing was going to change is clearly wrong, but I doubt that lets the modelers change anything in particular in a useful way.

    I’d love to be surprised; I’m sure work’s going on that amateur readers like me don’t know about. I’m sure Dr. De Quere et al, and others working on the ecology/biology/climate models, have a lot of input that depends on sea ice seasonal changes, for example.

    Comment by Hank Roberts — 6 Feb 2008 @ 6:28 PM

  78. http://www.sciam.com/article.cfm?id=the-unquiet-ice

    Comment by Hank Roberts — 6 Feb 2008 @ 6:30 PM

  79. Harold Pierce,
    No one denies the usefulness of fossil fuels. In fact, one could quite easily argue that they are too valuable to burn. However, I take issue with your contention that things must always be the way they are now. Brazil has come a long way with biofuels, for instance. In the ’50s one would have argued that farming in the desert was fantasy, but Israel and other countries have made the desert bloom with desalinized water.
    If ever-increasing fossil fuel consumption equates to the end of civilization, one would expect rational people to work toward alternatives. That we may fail in that effort is not indicative of the impossibility of the task, but rather of the lack of rationality in our species.

    Comment by Ray Ladbury — 6 Feb 2008 @ 8:18 PM

  80. “GISS received no additional money over our standard model development grants to do simulations for AR4, and most groups were in the same boat.”

    I’m interested in some further clarification of this concept of “unfunded” work on simulations for AR4.

    Perhaps this question is totally naive, but what is US funding of GCM development for, if not for contribution to IPCC assessment reports? Don’t US science agencies who fund this activity fully expect these models to be used for AR4? And if they do not, why should they tolerate the IPCC dictating how their expensive computing facilities will be used?
    Thanks

    Ryan

    [Response: Funding is for general research into climate forcings, responses, processes etc. The US groups were not mandated to do simulations in support of AR4, but decided to do so because they felt that would further the research. I think that was definitely the right decision. If AR4 hadn't come along, we would have probably done similar kinds of things, so adapting to their specifications wasn't completely orthogonal to our research goals. - gavin]

    Comment by Ryan — 7 Feb 2008 @ 12:39 AM

  81. Very interesting article. Scientists by nature are a very pedantic lot; that’s their strength but also their weakness. By the time we get enough data to show all and sundry that the level of forcing, as opposed to natural variability, is beyond doubt, the party’s over: the spilt beer and broken champagne glasses have been cleaned from the floor, the last of the partiers has dozed off in the corner and the cockroaches are out to play. It’s the same ol’ boiling frog scenario over and over again. It’s up to the climate scientists to take an advanced course in assertiveness and scare the pants off their respective ministers and senators into immediate action. Also to keep the pressure on the world media to keep reporting climate change stories. What do you guys think?

    Comment by Lawrence Coleman — 7 Feb 2008 @ 3:03 AM

  82. re #79

    It’s not the end of civilization that matters, Ray, but the warming of the globe that seems to be the issue. In fact it is not doubted anymore that the end of cheap oil (its peak) is upon us and that we must now seek out the more expensive heavy oils in order to meet world demand.

    By heavy oil I mean the Athabasca oil sands of Alberta, a colossal resource of up to 2.5 trillion barrels, of which currently some 200 billion barrels are known to be recoverable; after that it needs new technology and gets more expensive to deliver. The same goes for the potential 1.2 trillion barrels of heavy oil in Venezuela, and there is some more in Brazil and the USA. If these resources come online then expect much more warming of the planet, beyond 2C.

    Projected oil consumption of 115 million barrels per day is required by 2030 (a 2% annual increase), and the likelihood of this being met by conventional oil resources is not good, as their peak is here. Therefore only unconventional heavy oils can potentially meet this demand. Biofuels can possibly help out, but not the way they do it in Brazil by growing sugar cane; that will not scale to global proportions. Switchgrass and algae methods are the current second-generation leaders, but they will not come online any time soon in any great quantity to stop the oil companies from attempting to dig up Alaska and Iraq, persuading OPEC to pump more, etc.

    Comment by pete best — 7 Feb 2008 @ 5:52 AM

  83. #72 CobblyWorlds

    Ta for those excellent links. I especially like the side by side comparison of present and previous ice coverage. At any rate, the key parameter should be ice mass – not areal extent. I reckon a remake of Ice Station Zebra will be very, very boring. [smiley face]

    Comment by Mike Donald — 7 Feb 2008 @ 6:01 AM

  84. Re #75, 15 trillion: that is just a figure you have pulled out of the air regarding oil sands and heavy oils. It’s 5 trillion at most: 2.5 in Athabasca, 1.2 trillion in Venezuela (Orinoco belt) and 1.5 elsewhere in the world, unless you know of some more massive deposits unknown at the present time.

    Comment by pete best — 7 Feb 2008 @ 6:20 AM

  85. Re ice stability: a quick and dirty calculation* seems to indicate that thinner, one-year ice will allow a larger phytoplankton population beneath it. Breaks in the thinner ice will allow DMS produced by that higher population to leak out and form more fogs and low level cloud, increasing albedo and preventing excessive melting. Very Gaia.

    I hope someone is checking DMS levels.

    JF
    *= guess

    Comment by Julian Flood — 7 Feb 2008 @ 8:09 AM

  86. Lawrence,

    You wrote – “It’s up to the climate scientists to take an advanced course in assertiveness and scare the pants off their respective ministers and senators into immediate action. Also to keep the pressure on the world media to keep reporting climate change stories. What do you guys think?”

    I certainly agree that the climate scientists should take an advanced course in assertiveness, but trying to scare the politicians has been tried and failed. The politicians only answer to their electors. It is the public that has to be convinced. When the voters speak then the politicians will jump.

    The IPCC has been producing reports for nearly 20 years now, but what effect have they had? In 1990, when the first IPCC report was issued, oil consumption was 66.6 million barrels per day. It is estimated that in 2010 it will be 91.6. Yet AR4 still begins with a “Summary for Policymakers.” The scientists just don’t seem to realise that it is not the politicians but the public who decide how much oil is burnt, even if they are not conscious of it. The politicians cannot stop their constituents flying off on vacation, nor jet skiing when they get there, and still get re-elected.

    As long ago as 1989 Stephen Schneider wrote:

    “On the one hand, as scientists we are ethically bound to the scientific method, in effect promising to tell the truth, the whole truth, and nothing but — which means that we must include all the doubts, the caveats, the ifs, ands, and buts. On the other hand, we are not just scientists but human beings as well. And like most people we’d like to see the world a better place, which in this context translates into our working to reduce the risk of potentially disastrous climatic change. To do that we need to get some broadbased support, to capture the public’s imagination. That, of course, entails getting loads of media coverage. So we have to offer up scary scenarios, make simplified, dramatic statements, and make little mention of any doubts we might have. This ‘double ethical bind’ we frequently find ourselves in cannot be solved by any formula. Each of us has to decide what the right balance is between being effective and being honest. I hope that means being both.” (Quoted in Discover, pp. 45–48, Oct. 1989; for the original, together with Schneider’s commentary on its misrepresentation, see also American Physical Society, APS News August/September 1996.) [my emphasis]

    http://en.wikipedia.org/wiki/Stephen_Schneider

    In the most recent IPCC AR4 the true extent of the sea level rise has been suppressed. The rise due to melting ice has been omitted, with a note saying this was because of uncertainty of how much it would be. What they should have done is published the largest estimate, with a note saying this might be an over-estimate since the amount of melted ice was uncertain. That is being perfectly truthful, not being misleading as the AR4 version is.

    But it won’t happen. The scientists won’t admit that their strategy is incorrect. Then they would have to admit that they were wrong, and no-one will do that because it means losing face.

    They will continue to play safe, but endanger the world!

    Don’t you agree Gavin?

    Comment by Alastair McDonald — 7 Feb 2008 @ 11:16 AM

  87. I very much appreciate the summary of the collected models and the discussion that follows. Two issues occurred to me:
    1. Nowhere was the actual size of data sets specified, except for ‘very large’ and ‘Terrabytes’. I’d like to know the size of an individual data set, group of same model sets, and whole ensemble of open access sets.
    2. The reason for question one is that a number of people, myself included, have > 1T local storage and non-trivial amounts of compute capacity, and are already involved in climateprediction.net, about 50,000 active hosts currently. What is the possibility that this cross-model analysis could be written to run under BOINC control and distributed to many individual systems for a substantial increase in run capacity?
    3. Have you considered asking Amazon or Google to host the open part of the datasets for no charge to assist in this process?
    Three issues occurred to me, that’s three issues. :-}

    Note: For those interested, I created the PacificNorthwest CPDN team in 2004 as part of my ‘spreading the word’ effort. Currently with 79 members, the team is #15 worldwide. I have given climate presentations at a number of SF conferences and am in the process of updating my out of date web site to include more info.

    Comment by Bill Nicholls — 7 Feb 2008 @ 11:33 AM

  88. Re 85. It looks like we are all just guessing how thick “the ice” might be. (There was no answer from the RC crew to my earlier Q74.) But it is not certain at all that “new ice” is “thinner” than old. It all depends on the severity of the winter.

    Take this Arctic winter as an example: the starting point was very little ice, but freezing in early winter was quite rapid, so one could assume that average thickness by March, when melting starts, is not exceptional, although the ice is “new”.

    Sorry about all those quotation marks – I have no idea what they are supposed to convey. But let’s return to the topic next September.

    [Response: New ice is always thinner than multi-year ice. You can get ~1 meter or so of sea ice growth in a season, which then ridges and compresses to give multi-meter ice by the end of the winter. Multi-year ice is generally thicker - up to 4m - higher in some pressure ridges, but it takes time to build that up. Most of the ice area that melts in summer is thin first year ice. - gavin]

    Comment by Dodo — 7 Feb 2008 @ 11:51 AM

  89. A very interesting discussions and links, thank you.

    Anyone in the know, please:

    Does the “Northern Hemisphere sea ice” statistic include areas outside of the Arctic Sea proper and the Northern Atlantic? Such as the Bering sea, coastal seas off the Asian eastern shores and the Baltic sea.

    Northern Atlantic has been warm this winter, and a constant southwesterly flow has kept the Northern Europe readings some 5 degrees warmer than usual, all winter long. In fact, one of the recent features is the complete absence of colder spells over here. The Baltic remains ice free, which is quite unusual. There is very little ground frost, so I presume our bacterial flora has been busy manufacturing more CO2 all winter long.

    My sporadic checks on a few Siberian sites were mostly warm as well, except Verkhoiansk which has shown the usual -50 – -60 degC range. More to the west, Norilsk for instance did not live up to its reputation (only -25 – -30 degC).

    Arctic seems primed for another large summer melt if the seasonal weather patterns are nearly the same as last year. Most important is the observation that winds have continued to push multi-year ice out to the Atlantic, which means still more first-year ice cover. In addition to being thinner, I believe first-year ice is rather brittle as it includes more occlusions of saturated brine than the slowly formed multi-year ice.

    Comment by Pekka Kostamo — 7 Feb 2008 @ 11:56 AM

  90. ” … Most of the ice area that melts in summer is thin first year ice.” – gavin

    No doubt that is true, but the way it reads sort of masks something.

    When was the last accrual year for perennial ice? From whatever year scientists consider its modern zenith, what percentage of perennial ice has been lost?

    Comment by JCH — 7 Feb 2008 @ 1:25 PM

  91. Lawrence Coleman asks: “It’s up to the climate scientists to take an avanced course in assertiveness and scare the pants off their respective ministers and senators into immediate action. Also to keep the pressure on the world media to keep reporting climate change stories. What do you guys think?”

    Under no circumstances should it ever be to “scare” anyone. While I am also frustrated with the inability of policy makers to look at threats that extend beyond the expiration of their term in office, the only weapon science has is the credibility of its practitioners. It is not inappropriate to do “how bad can it be” analyses, but our emphasis should be on preparing for credible threats rather than just getting people off their fat, complacent asses.

    As Mark Twain said, “If you tell the truth, you’ll eventually be found out.”

    Comment by Ray Ladbury — 7 Feb 2008 @ 1:29 PM

  92. A quick thought in reply to Robert Rohde’s comment about the limits on use — seems to me the Globalwarmingart pages are ideal material for a ‘research project intended for academic publication’ specifically on how information can be presented so as to be understood.

    There are plenty of college first year students each year who have little exposure to scientific information.

    Certainly enough to divide into groups and test comprehension of basic understanding of how the world works. Start with Piaget’s basics (grin).

    Robert, I suggest circulating a memo to the psychology/sociology or even (gasp!) political science and economics departments.

    Someone must be working on — or would like to work on — studying how information can be presented graphically so as to be understood.

    To prepare that material you’d certainly be justified in drawing on the current database, and want to compare new information to prior to see if a graphic can be created that gets across what’s different.

    The very simple questions would include
    – left to right or right to left?
    – changing scales (geologic time, then “last 200 years”) on a chart — do people see it?

    Just saying. It’s the great question of the age, can people comprehend science. It’s got to be done with pictures. You’re it.

    Comment by Hank Roberts — 7 Feb 2008 @ 2:25 PM

  93. Ray,

    You wrote ‘As Mark Twain said, “If you tell the truth, you’ll eventually be found out.”’

    No doubt you will agree that the non-sceptic scientists are telling the truth, but when they are found out it is going to be too late!

    Comment by Alastair McDonald — 7 Feb 2008 @ 2:46 PM

  94. Re # 64

    Traditionally, Arctic sea ice was insulated from the warmer, saltier water below it by several meters of very cold, fresh water that floated on the surface of the salt water below. (see http://www.acia.uaf.edu/PDFs/ACIA_Science_Chapters_Final/ACIA_Ch09_Final.pdf)
    As the multiyear ice melts, this insulating layer is subject to storm mixing, i.e., the chaotic wave systems of polar cyclones driven by a temperature differential between open water and extant sea ice.

    The open water absorbs heat, and cyclones transfer this heat to the sea ice as rain. Note: last summer there were periods when research activities on the Arctic sea ice were suspended because of rain. Rain in the high Arctic? That was a 6 sigma event. It changes the rules of the game. That should have made headlines everywhere. It did not. The researchers were so busy with their “science” that they did not make a big deal of the really interesting event going on all around them. (After ice is rained on, it absorbs more sunlight and melts faster.)

    Loss of the freshwater insulating layer from the surface of the Arctic Ocean means that the maximum refreeze temperature for new sea ice is -1.8C, rather than the 0C that applies when the surface fresh water is retained. Thus, it must get colder to freeze new sea ice. It also means that the first year ice is saltier and has a lower melting point. Thus, first year ice frozen from saltier water will melt much faster than first year ice frozen from less salty water.

    Since 1998 (at least), warm, salty water has been flowing into the Arctic basin from the North Atlantic, and more recently from the North Pacific. This additional heat facilitates Arctic sea ice melt. This additional salt facilitates sea ice melt. According to the Navy guys in Monterey, some of the multiyear Arctic Sea ice has been melting twice as fast from the bottom up as from the top down.

    Measures of ice area do not reflect the ice volume that has melted. Much of the extant “multiyear” ice is thinner than it ever was before, and if it gets a bit of rain water on it, it will melt fast. If chaotic wave systems “pump” the fresh water out from under this ice, so that it is floating in warm salt water, it will melt fast. Moreover, from overhead, ice floating in salt water will appear thicker than ice floating in fresh water. We may not have a good measure of the volume of the remaining ice.

    Last, but not least, we have accumulating soot on the sea ice reducing its albedo.

    Once we lose a significant amount of our Arctic sea ice, the system will tend toward the loss of all summer sea ice in a highly non-linear fashion. Many of the effects listed above are not in the standard sea ice models, ocean current models, or atmospheric circulation models. (At least I do not see them in there, but then, I do not work with these models and do not know their details.) They are small scale, local effects that occur on boundaries, and they never made it into the GCM. (Again, I do not spend my days with these models.) However, I think that cumulatively they have a significant effect. With less sea ice and without its insulating freshwater, the Arctic sea is a new system, and statistical trends based on the old system are not likely to be valid.

    A new day dawns in the Arctic, and a lot of ice is about to melt. Unless there are ice flows from the Greenland Ice Sheet, 2×10^6 km^2 is my estimate for 2008 Arctic Sea Ice minimum with general loss of Arctic (summer) sea ice by the end of the 2010 melt season.

    Comment by Aaron Lewis — 7 Feb 2008 @ 3:40 PM

  95. Ref 89 Pekka writes “Does the “Northern Hemisphere sea ice” statistic include areas outside of the Arctic Sea proper and the Northern Atlantic? Such as the Bering sea, coastal seas off the Asian eastern shores and the Baltic sea.” So far as I am aware, ALL sea ice in the northern hemisphere is included. But, for example, I don’t think it includes ice on the Great Lakes in the USA and Canada. As to what will happen next September, we are all very much “staying tuned”.

    Comment by Jim Cripwell — 7 Feb 2008 @ 3:56 PM

  96. ” As to what will happen next September, we are all very much “staying tuned”. …” – Jim Cripwell

    Tell me why I’m wrong in believing the reality of what will happen in September is going to be either a new record, significant, or a nonevent, insignificant?

    Comment by JCH — 7 Feb 2008 @ 5:35 PM

  97. #85 Julian, Old ice is like a house, many micro and larger beings live in or just by it, I am not sure if new ice is more fertile than old, it may be a combination of both which is ideal. The coming fog extent will tell.

    #94 Thanks Aaron, really good summary of some of the complexities involved,
    winds play an important role as well. There also remains the question of whether multi-year ice can rebuild to its former expanses, especially when the top of old ice is itself a source of fresh insulating water, and also makes excellent tea….

    Comment by wayne davidson — 7 Feb 2008 @ 9:45 PM

  98. Aaron Lewis comment #76 6 February 2008 5:51 PM wrote:
    “I see models (i.e. University of Maine) that suggest that much of the base of WAIS is at 0C.”

    please may i have a link, or a citation for these studies ?

    Comment by sidd — 8 Feb 2008 @ 12:14 AM

  99. #89 & #95

    Indeed the sea ice situation here by the Baltic Sea is unusual this winter. I have lived my whole life on the south coast of Finland, and I can’t remember a year with this little ice. The situation varies from year to year, but what I find remarkable is that even the Gulf of Riga, the Gulf of Finland and most parts of the Gulf of Bothnia are still open. Winter lasts another month or so, but according to the weather forecast no further freezing is expected in the coming days; it remains to be seen if there will be any at all.

    Finnish Institute of Maritime Research updates ice situation on their website http://www.fimr.fi/en/palvelut/jaapalvelu/jaatilanne.html

    There’s also interesting comparison to average (or “normal” as they call it) winter.

    Comment by scipio — 8 Feb 2008 @ 6:53 AM

  100. In 96 JCH writes “Tell me why I’m wrong in believing the reality of what will happen in September is going to be either a new record, significant, or a nonevent, insignificant?” You are absolutely right that in the great scheme of things, the amount of ice in the Arctic this September is a complete non-event. However, it is important because the proponents of AGW have said it is important; something about a “tipping point”. I am afraid I don’t understand tipping points.

    Comment by Jim Cripwell — 8 Feb 2008 @ 7:12 AM

  101. If this coming September doesn’t set another record, it most likely will in 2009. Having a run of consecutive record melts is very rare, so I think we are due for an adjustment year where the melt mightn’t be quite as severe as the past few years, but the unmistakable trend is taking on an increasingly exponential appearance. Loss of sea ice begets further loss as the sun warms the growing expanse of ‘dark’ sea. Multi-year ice is also heavily fractured in composition, with each season’s layering differing from the last, so it is anything but a homogeneous mass but rather rarefied and crystalline, with heaps of surface area exposed to the warming rays of spring and summer. I am not a scientist but have a great interest in all things of a scientific bent, and what I can picture is that the Arctic territory will collapse like a pack of cards, very soon. Thanks Alastair and Ray for giving me further insight. So public education is the best way to change governmental views? The media-watching public, I’ve found, also tends to become increasingly cynical and war-weary about climate change stuff, especially in Australia; the echoes of Ho-Hum are getting louder. What does the media do in that regard? Maybe really include everyone in the street in this fight, which is absolutely true. Make everyone feel important for doing their contributing bit; maybe competitions as to which suburb changes over the most lights to CFLs, etc. Ideas like that?

    Comment by Lawrence Coleman — 8 Feb 2008 @ 7:44 AM

  102. re 98
    start at http://www.climatechange.umaine.edu/Research/Contrib/html/15.html

    Comment by Aaron Lewis — 8 Feb 2008 @ 10:11 AM

  103. re 101
    see http://www.uctv.tv/search-details.asp?showID=13459

    The American Denial of Global Warming
    (#13459; 58 minutes; 12/12/2007)
    Polls show that between one-third and one-half of Americans still believe that there is “no solid” evidence of global warming, or that if warming is happening it can be attributed to natural variability. Others believe that scientists are still debating the point. Join scientist and renowned historian Naomi Oreskes as she describes her investigation into the reasons for such widespread mistrust and misunderstanding of scientific consensus and probes the history of organized campaigns designed to create public doubt and confusion about science.

    Comment by Aaron Lewis — 8 Feb 2008 @ 10:15 AM

  104. Jim Cripwell, let me introduce you to the concept of the future tense. In English, the use of the word “will” prior to the clause that contains the verb generally conveys a sense that the events referred to take place in the future. For example, JCH says, “Tell me why I’m wrong in believing the reality of what will happen in September is going to be either a new record, significant, or a nonevent, insignificant?” The “will happen” would imply he is talking about next September, not last September. Since last September represented a record low in sea ice, it can hardly be called a nonevent or insignificant.

    Comment by Ray Ladbury — 8 Feb 2008 @ 11:44 AM

  105. Jim Cripwell, perhaps you should read up on tipping points so as to save your bacon; since saving it would entail retracing a decades-long march in the loss of perennial ice, you may need one.

    Comment by JCH — 8 Feb 2008 @ 11:57 AM

  106. “World supply of oxygen: Over every square foot there is a column of oxygen that weighs about 440 lbs. There is additional oxygen dissolved in water.”

    Large, but finite. As I pointed out.

    “We will never ever run out of oxygen,”

    I agree… we’d be very dead very long before that. But ignoring that little detail, and assuming fossil fuel runs out “never ever”, it would take only 11 doublings (at 30 years each) of the current CO2 concentration anomaly of 0.01% to completely replace atmospheric oxygen (assumed at 20%; including ocean-dissolved oxygen is left as an exercise for the reader), i.e., we would be there by the year 2338.
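
    Just to make that arithmetic concrete, here is a minimal back-of-the-envelope sketch (the 20% oxygen fraction, the 0.01% anomaly and the 30-year doubling time are the same assumed round numbers as above, not measurements):

        import math

        o2_fraction = 0.20       # atmospheric O2 fraction, assumed round number
        co2_anomaly = 0.0001     # current CO2 anomaly, about 0.01% by volume
        doubling_time = 30       # assumed years per doubling of the anomaly

        # Roughly one O2 molecule is consumed per fossil CO2 molecule produced,
        # so the CO2 anomaly tracks the oxygen already removed.
        doublings = math.log2(o2_fraction / co2_anomaly)   # about 11
        years = round(doublings) * doubling_time           # about 330
        print(f"{doublings:.1f} doublings -> roughly the year {2008 + years}")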

    Mean temperature by that time — or a few decades later at equilibrium — would be 48 degs C [suppressing vivid imagery of going to sauna with an oxygen mask on :-) ].

    That’s for exponential growth. Linear growth gives us more time, but we’ll still run out in the end. The only way to prevent that is a sufficiently steep exponential decrease, which amounts to stopping the use of fossil fuels in finite time.

    “which by the way is a renewable resource.”

    Not on the relevant time scale, a few centuries. It took millions of years to form the current oxygen atmosphere by photosynthesis and deposit the produced organic matter underground, and would take a similar time today to happen again. That’s not remotely an interesting time scale for us.

    (BTW are you the same Harold Pierce Jr that can be observed foaming at the mouth on CA and some other forums? Google remembers. How does it feel to have to appear minimally civilized on RC?)

    Comment by Martin Vermeer — 8 Feb 2008 @ 2:01 PM

  107. 88. Gavin: “New ice is always thinner than multi-year ice”. Quite an assertion, I must say. If there is a thin layer of old ice left after a warm summer, and a new freezing season sets in rapidly, surely the new ice may become thicker than the old.

    89 Pekka: The Baltic sea is so small that its ice cover changes are insignificant in global and hemispheric ice area calculations. Whatever extra CO2 is emitted from Finnish soil this winter is surely compensated by the CO2 not emitted from China and other Asian locations covered with snow.

    Comment by Dodo — 8 Feb 2008 @ 3:30 PM

  108. re 98
    see http://www.sciencemag.org/cgi/content/abstract/315/5818/1544

    Liquid water with discharge implies a base temperature of 0C. The UM model only offers a visual impression of how large an area is that warm.

    In other work, Ted Scambos discusses the importance of melt ponds to ice shelf collapse. It is worth getting on Google Earth or looking at MODIS (http://modis.gsfc.nasa.gov/index.php) and other sites to see just how many melt ponds are forming on various ice structures. Greenland with Google Earth is a particularly target-rich environment.

    Comment by Aaron Lewis — 8 Feb 2008 @ 4:01 PM

  109. Re: Dodo @ 107: “New ice is always thinner than multi-year ice”. Quite an assertion, I must say. If there is a thin layer of old ice left after a warm summer, and a new freezing season sets in rapidly, surely the new ice may become thicker than the old.”

    Surely not.

    The existing “thin layer of old ice left after a warm summer” will indeed get thicker, but it is, by definition, multi-year ice, not “new” ice.

    “New” ice is, well, new ice, i.e. ice that did not exist last summer.

    Comment by Jim Eager — 8 Feb 2008 @ 4:08 PM

  110. Re # 74 Dodo: “Is there any reference for the assertion that “new ice” melts faster than old? One could also think that, as old ice is dirtier and more porous than new, it would melt faster.”

    As any general oceanography textbook will tell you, new sea ice has a high salt content, which keeps its melting temperature low (near the freezing point of seawater, roughly -1.8 degrees C). As the ice ages, the salt migrates out, leaving relatively pure frozen water with a melting point closer to that of pure water (0 degrees C). Hence, as the temperature warms in the late spring/early summer, the new ice should start melting first. The surface of old sea ice may well be dirtier, but why do you assume old ice is more porous?

    Comment by Chuck Booth — 8 Feb 2008 @ 11:06 PM

  111. #107 Dodo,

    High latitude warming is continuing: http://data.giss.nasa.gov/gistemp/2007/
    Whether some areas have experienced very cold weather recently is an issue of just that: weather.
    The underlying trend is that of marked warming.

    And permafrost is responding eg: http://gsc.nrcan.gc.ca/permafrost/climate_e.php
    “Not all permafrost in existence today is in equilibrium with the present climate. Offshore permafrost beneath the Beaufort Sea is several hundred metres thick and was formed when the shelf was exposed to cold air temperatures during the last glaciation. This permafrost is presently in disequilibrium with Beaufort Sea water temperatures and has been slowly degrading.”

    Canada is the other side of the world from Eurasia.

    It seems to me that you are trying to dismiss Pekka Kostamo’s observations by employing the technique of ‘directed attention’, as employed by conjurers.

    Comment by CobblyWorlds — 9 Feb 2008 @ 6:31 AM

  112. A time series of Arctic ice coverage is shown on page
    http://arctic.atmos.uiuc.edu/cryosphere/IMAGES/sea.ice.anomaly.timeseries.jpg

    Looking at the 2007 anomaly and scanning the history back to 1978, one might well believe that something quite extraordinary happened in 2007. The range of “normal” variability was exceeded very substantially indeed. A few future years will tell if we have passed one of the tipping points, causing a profound change in the processes that govern the melting and re-freezing over this ocean.

    By the way, I learned somewhere that water entering via the Bering strait has both lower salinity and higher temperature than the water in the Arctic ocean. I could not find a reason for the low salinity, but it was a measurement result from the buoys there. Obviously this helps the summer melting quite substantially (as seen in the satellite imagery) as the Pacific water stays on top. The shallowness of the Bering strait also helps, as the flow is taken from the warmer upper layers.

    How much Arctic sea ice melting there will be is of course also dependent on rather random meteorology, like winds and cloudiness.

    Comment by Pekka Kostamo — 9 Feb 2008 @ 12:44 PM

  113. How sure can you be that temperature measurements are accurate?

    Comment by RK Tandon — 9 Feb 2008 @ 1:45 PM

  114. #113 RK Tandon,

    Section 1.3.2 of Chapter 1 of IPCC Assessment 4 gives a general discussion of the history of this issue: http://ipcc-wg1.ucar.edu/wg1/Report/AR4WG1_Print_Ch01.pdf

    Comment by CobblyWorlds — 9 Feb 2008 @ 3:37 PM

  115. Sorry if the question is out of place on this thread, but I was hoping someone has worked on studying the relationship of earthquakes and CC.
    Having worked in construction and concrete I have come up with the idea of “crust expansion”.
    When concreting it is important to leave expansion joints within the slab to allow for heat expansion; I can’t see how this does not apply to the earth’s crust. Are fault lines essentially natural expansion joints? If so, the heating of the earth’s crust should indeed increase pressure on these faults and therefore have an effect on increasing quake magnitude and frequency.

    Can anyone verify this concept or point me in the direction of such research.

    Thanks

    Comment by Misanthropic — 9 Feb 2008 @ 7:52 PM

  116. Re # 115 Misanthrope

    I suggest you start by reading an introductory textbook on physical geology.

    Comment by Chuck Booth — 9 Feb 2008 @ 11:14 PM

  117. This is a question for Gavin, related to models but not this specific topic.

    In your post on water vapor (“Feedback or Forcing”), you compare the ModelE calculations to Ramanathan and Coakley (1978).

    In their paper (http://www-ramanathan.ucsd.edu/publications/Ramanathan%20and%20Coakley%20RevGSP%201978.pdf), I don’t see those specific numbers, although Table 6 gives “Relative Contribution of H2O, CO2, and O3 to Outgoing Long-Wave Flux F”

    How do you arrive at the figures in your post? Are they derived from Table 6 (page 17 in the PDF), or are calculated in some other way?

    [Response: They can be calculated from numbers in the paper (with the possible exception of the clouds number), but I first used the values given in a published table whose original source I can't put my finger on right now. It is widely available though (i.e. here). - gavin]

    Comment by cce — 10 Feb 2008 @ 1:42 AM

  118. Misanthropic #115, Climate change tends to affect mostly Earth’s atmosphere and oceans. Have you noticed that when you dig down, even on a hot summer day, the ground stays cool? The excursions from summer to winter and from day to night will be more than those we experience due to climate change.
    However, the problem with climate change is we’re moving the whole shebang hotter, and that has huge effects on the biosphere, agriculture, etc. One area where your intuition might have some validity would be with what will happen to permafrost. We’re looking at big changes there. Hope this helps.

    Comment by Ray Ladbury — 10 Feb 2008 @ 7:50 AM

  119. Thanks for all the advice about ice, old and new, although I did not get any references to actual scientific literature on ice. But I notice there is a lot of attitude that compensates for any original quotes. “Any textbook…”

    P.S. Cobbly – what do you make of this: http://data.giss.nasa.gov/gistemp/graphs/Fig.C.lrg.gif
    Where has all the warming gone – not so long ago, but around late 2001? Let evidence rule. (If warming returns, I admit I was wrong.)

    Comment by Dodo — 10 Feb 2008 @ 7:53 AM

  120. You can be increasingly sure that temperature measurements are accurate. For instance, for years there was a disagreement between the MSU satellite data and ground-based temperature measurements. It’s rather famous; an outside group found a math error; the MSU people confirmed it; the disagreement was due to the math error.

    Now we have two independent data sets in agreement.

    http://www.ipsl.jussieu.fr/GLACIO/hoffmann/generalscience/MSUandSurfData.jpg

    Comment by Hank Roberts — 10 Feb 2008 @ 5:41 PM

  121. Tamino explained that to you at length already, remember?

    Comment by Hank Roberts — 10 Feb 2008 @ 5:47 PM

  122. Totally off topic, but what has happened to the “recent comments” and “with inline responses” indices on the right navigation bar over the last week or so?

    Comment by Phil Scadden — 10 Feb 2008 @ 6:52 PM

  123. # 119 Dodo

    Try this: The Oceans: Their Physics, Chemistry, and General Biology, by H.U. Sverdrup, M.W. Johnson, and R.H. Fleming. It is old (1942), but a classic, and the basic physics of sea ice haven’t changed much, I’m sure.
    http://content.cdlib.org/xtf/view?docId=kt167nb66r&brand=eschol

    Refer to Chapter III, section on sea ice
    http://content.cdlib.org/xtf/view?docId=kt167nb66r&chunk.id=d3_6_ch03&toc.depth=1&toc.id=ch03&brand=eschol&query=0

    If you want something more recent, do a Google search using “sea ice” and you’ll come up with plenty of references from reliable sources, such as:

    Sea Ice – Wikipedia
    http://en.wikipedia.org/wiki/Sea_ice

    National Snow and Ice Data Center at the Univ. of Colorado at Boulder
    http://nsidc.org/sotc/sea_ice.html

    NASA’s Jet Propulsion Laboratory
    http://southport.jpl.nasa.gov/polar/iceinfo.html

    NOAA’s Arctic Theme Page
    http://www.arctic.noaa.gov/essay_wadhams.html

    Finding information really isn’t that much work.

    Comment by Chuck Booth — 10 Feb 2008 @ 7:24 PM

  124. #119 Dodo,

    1) I have already answered your previous raising of that graph here: http://www.realclimate.org/index.php/archives/2008/02/ipcc-archive/index.php?p=526#comment-80072

    2) I have already answered your previous raising of the claimed cessation of warming here: http://www.realclimate.org/?comments_popup=526

    If you are waiting until 2015 why are you raising the issue again?

    You do not appear to have any new information about this, what you are saying suggests to me that you need to avail yourself of Chuck Booth’s helpful list in post #123.

    Comment by Cobblyworlds — 11 Feb 2008 @ 6:47 AM

  125. #118 Thanks for your help.
    Ray said:
    “However, the problem with climate change is we’re moving the whole shebang hotter, and that has huge effects on the biosphere, agriculture, etc.”

    The whole shebang hotter must have a slow effect on expansion?
    If more heat is trapped then the final resting place for that energy must be the earth’s crust?
    The oceans and atmosphere must slowly transfer heat to the crust?
    Rocks at and near the surface must heat and expand?
    I understand that there are more pressing matters concerning CC that will affect us more and sooner, but there must be some effect.

    Comment by Misanthropic — 11 Feb 2008 @ 9:01 AM

  126. Misanthropic, the change due to anthropogenic CO2 will be less than the change from winter to summer or day to night. If we don’t see big geologic effects there, we probably won’t see them due to climate. A bigger effect might be due to isostatic rebound as glaciers melt.

    Comment by Ray Ladbury — 11 Feb 2008 @ 12:13 PM

  127. I have just found the NASA/GISS temperature anomaly for January 2008, and it is 0.31 C; the lowest monthly anomaly in the 21st century (assuming 2001 is the first year). Do you know Gavin, if what looks like a significant drop since December 2007, has anything to do with colder temperatures in the Arctic?

    Comment by Jim Cripwell — 11 Feb 2008 @ 12:26 PM

  128. Jim Cripwell #127:

    Do you know Gavin, if what looks like a significant drop since December 2007, has anything to do with colder temperatures in the Arctic?

    Well, I’m not Gavin, but a one month drop is not “significant” in any climatological sense. And I would suggest that the cool anomaly has a lot more to do with the current intense La Nina than with Arctic temps (see NOAA SSTs – the big blue bit is a very large cool anomaly in the Pacific).

    Comment by Gareth — 11 Feb 2008 @ 2:33 PM

  129. Temperature, at least just yet, is less of a concern than the on-going ocean acidification. IMO.

    Comment by David B. Benson — 11 Feb 2008 @ 2:46 PM

  130. Here is more on the current La Nina:

    http://biopact.com/2008/02/wmo-la-nia-conditions-strengthen.html

    Comment by David B. Benson — 11 Feb 2008 @ 4:41 PM

  131. Hello again. Back in July 2007 I posted on the subject of 3 papers I found interesting: Green & Armstrong, Lockwood & Frohlich, and Archibald. One respondent recorded several criticisms of the Archibald paper, so I decided to study it more closely. As a result, I have produced my own article on a temperature model which combines both CO2 and solar effects, which you can find here. I am now in a position to comment on the aforesaid criticisms, as follows.

    - Instead of using the world wide temperature, from say GISS or Hadley, he choses a total of five stations. Yes five, all in the within several hundred kilometers from each other in the South Eastern United States.

    Yes, it’s a good point – the only global series he uses is a satellite record for 28 years. If he wanted to use rural temperatures he should have been able to find a larger set.

    - The stations chosen buck the trend of increasing temperatures in the later half of the twentieth century. The stations chosen indicate lower temperatures in the later part of the twentieth century, which is not reflected in the vast majority of stations worldwide. Since so few met stations were chosen, one has to wonder if they were chosen so that they would fit Archibald’s argument.

    Agreed – though as a later comment points out, these are not actually used in the solar cycle analysis, so their relevance is limited regarding the main thrust of the paper.

    - In order to predict the temperature response to the changing solar cycle length, he uses a single temperature station, (De Bilt in the Netherlands). Not even 5 stations. A single station.

    That is incorrect (in the Lavoisier Society June 29 2007 version I am looking at). Armagh is also used, and indeed the Cycle 22 / Cycle 23 comparison is graphed against the Armagh data. But that is still only a single station, even if it is likely to be a proxy for a good chunk of Atlantic Ocean given its location. However, use of a single station is not, in this case, evidence of cherry-picking, because there are so few stations to choose from with venerable records.
    A more serious error, which I note in my paper, is that the graph Archibald quotes is based on a 1+2+2+2+1 year filter applied to the cycle lengths. This means that a new, long, cycle length can only have 1/8 of the effect which he claims (followed by a further 1/8 next cycle when it shifts under the ‘2’ coefficient). So instead of
    0.5*(12-9.6) = 1.2
    we would get 0.15, and properly the full 5 cycle filter should be used.
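
    To show the size of that filter effect numerically, here is a small illustrative sketch (the 9.6-year mean, the 12-year new cycle and the 0.5C-per-year sensitivity are the figures discussed above; the code itself is only an illustration, not taken from either paper):

        import numpy as np

        weights = np.array([1, 2, 2, 2, 1]) / 8.0   # the 1+2+2+2+1 filter, normalised
        sensitivity = 0.5                # deg C per year of cycle-length change
        new_len, mean_len = 12.0, 9.6    # new cycle length vs. recent mean, in years

        # Unfiltered: the full anomaly acts at once
        print(sensitivity * (new_len - mean_len))                 # 1.2
        # Filtered: the newest cycle enters only with weight 1/8
        print(sensitivity * weights[-1] * (new_len - mean_len))   # 0.15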

    - He then decides that the correlation between De Bilt and the solar cycle length is good (but fails to mention the R^2 value, because the correlation is poor). He also uses cherry picks data from the complete set of temperature records from the station. This is misleading and wrong. When the full data set is used, the R^2 value
    for the correlation is only 0.0177.

    I cannot comment on that, as I have not looked at the De Bilt data. It is possible that the best filter (I favour 1+1+1) has not been applied.

    - Archibald then predicts a reduction in temperature of 1.5C over the next solar cycle. He claims he can do this due to correlation with cycle amplitude, but then presents a graph of solar cycle length to illustrate the correlation. He offers no explaination. He also has used very strange predictions of the solar cycle.

    The figures I see in Archibald are 1.2C for a 12-year cycle and 1.6C for a 13-year cycle, definitely based on cycle length not amplitude. As before, because of the filter effect, they should be divided by about 8. Now that we have seen the first sunspot of the new cycle, about 2 years later than expected, Archibald’s estimates for the length of Cycle 23 are looking quite credible.

    To conclude, despite the criticisms here of Archibald’s choice of datasets, my article shows that the solar cycle length signal does shine through the global HadCRUT3 data too. But to make best sense of it a gradual trend, probably induced by CO2, needs to be included, and it corresponds to a CO2 doubling sensitivity of 1.4C, well below IPCC estimates. The prediction for mean HadCRUT3 for 2008.0-2019.0 is a modest fall, of 0.15C, from the 1995.0-2006.0 value, followed by further CO2-induced rises.

    Here is the summary of the paper, in case you prefer not to follow the link.

    This article presents new research in which a model for temperature linearly combines an effect from the lengths of the 3 preceding solar cycles and an effect from carbon dioxide atmospheric concentration. It is applied to about 150 years’ worth of “Armagh” data and of “HadCRUT3” data. In the latter case, two variations of the model (one with CO2 effects delayed by one solar cycle) perform similarly on cycles 10 to 22, and yet give quite different CO2 doubling sensitivities – 1.18C and 1.45C respectively. The CO2 effect is statistically confounded with time, so any Long Term Persistence or emergence from the Little Ice Age would imply overestimation of this parameter. The unbiased standard residual errors are about 0.070C, equating to impressive R^2 values of 0.87, and the estimate of solar cycle length sensitivity is 0.05C for each year of variability (with accumulation of this over 3 cycles). The flat period between Cycle 17 (midpoint 1937) and Cycle 20 (midpoint 1968), which can be a difficult feature for climate models to explain, is quite well modelled here. But Cycle 23 (using data 1995.0-2006.0) poses a problem for the model, suggesting that neither CO2 nor solar effects are principally responsible for the surface warming recorded in that period. Some speculations are made about this, endorsing the climateaudit.org drive for auditing of these records, whilst allowing for the possibility of an anthropogenic effect of unknown source. Used predictively, these models suggest some modest imminent cooling given that Cycle 23 is turning out to be significantly longer than average.
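
    For readers who want the shape of such a fit spelled out, here is a purely illustrative least-squares sketch; the numbers are made-up placeholders, not the Armagh or HadCRUT3 values used in the article:

        import numpy as np

        # Made-up placeholder inputs, one row per solar cycle: temperature anomaly,
        # mean length of the three preceding cycles (years), and log2(CO2 / 280 ppm).
        temp     = np.array([-0.30, -0.25, -0.15, -0.10, 0.00, 0.10, 0.25, 0.40])
        cyc_len  = np.array([11.5, 11.2, 10.9, 10.8, 10.6, 10.4, 10.2, 10.0])
        log2_co2 = np.log2(np.array([295., 300., 305., 311., 318., 330., 345., 365.]) / 280.0)

        X = np.column_stack([np.ones_like(temp), cyc_len, log2_co2])
        coeffs, _, _, _ = np.linalg.lstsq(X, temp, rcond=None)
        intercept, solar_sens, co2_doubling_sens = coeffs
        print(solar_sens, co2_doubling_sens)   # deg C per year of cycle length; deg C per CO2 doubling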

    Rich.

    Comment by See - owe to Rich — 12 Feb 2008 @ 11:29 AM

  132. Rich,
    I’m afraid I don’t understand some of the motivation for parameters in your model. For instance, why would you delay the contributions of CO2 by a solar cycle? Why do you still insist on a limited dataset? Why do you claim that climate science has trouble with cycles 17-20 or 23? I don’t think these claims can be substantiated if aerosols are included for solar cycle 20. Solar cycle 23 is well within IPCC predictions.

    Comment by Ray Ladbury — 12 Feb 2008 @ 12:25 PM

  133. I shall make a second attempt to reply to Ray’s comment, less controversially I hope.

    1. The delay of CO2 by 1 solar cycle is there because it fits the data better. It accords with the prevalent notion that “there’s some CO2-induced warming in the pipeline”.

    2. I don’t understand what you mean by a “limited dataset”. Isn’t 140-odd years of HadCRUT3 data enough?

    3. Perhaps I should learn more about aerosols in case I want to include them in my model. Could you possibly tell me where solid observational data on these is to be found?

    4. Cycle 23 may well be within IPCC predictions, but will Cycle 24 also be? What is the current IPCC prediction for Cycle 24 (say for the mean 2008.0-2019.0)?

    Rich.

    Comment by See - owe to Rich — 13 Feb 2008 @ 4:42 PM

  134. Rich

    Best summary of the current state of aerosols is in the IPCC summaries and references therein. So, your “1 solar cycle” is in fact an adjustable number with no independent support. You do of course know the quote by von Neumann: “Give me 4 adjustable parameters and I will fit an elephant; five and I will make him wiggle his trunk”.

    By limited data, I mean that you seem to place a lot of weight on one or a few stations.

    As to solar cycle 24, of course a lot depends on what happens. A change in solar output would of course affect temperature, and would not in any way affect the validity of the models. Volcanic eruptions, ENSO, etc. are other factors. There is no one prediction, but rather a range, and again this is summarized by the IPCC. All other things being equal, the prediction is for more warming.

    Comment by Ray Ladbury — 14 Feb 2008 @ 10:39 AM

  135. Ray,

    OK, I’ll look for the aerosol stuff in IPCC – but I’m away for a week so it must wait until then.

    I think your quote of von Neumann works to my advantage, as my model only has 3 parameters, whereas I feel sure the GCM models have many more – but any information on that would be welcome.

    On the data front, HadCRUT3, like GISS, is an average of thousands of stations worldwide, and though I analyzed Armagh, most of my comments and predictions refer to the global HadCRUT3.

    Regarding change in solar output and validity of models, I agree that the structure of the models would remain the same. But with more observations to assess the relative sensitivities of climate to solar and CO2 effects (and these do not have to be the same, because of albedo), the parameters in the models may have to change. And as long as sensitivity to CO2 is positive, as most scientists believe, then, yes, all other things being equal the correct prediction would be for more warming.

    But how much? And if Cycle 24 is very weak, then all other things are not equal, and will that outweigh the CO2 effect, and if so for how long and by how much? These are the big questions. The current evidence, based on Jan 07 to Jan 08, suggests a big effect of this slow solar minimum, but there’s the difficulty of correctly allowing for El Nino/La Nina.

    Rich.

    Comment by See - owe to Rich — 14 Feb 2008 @ 4:21 PM

  136. Rich, there is a huge difference between a parameter fixed by independent data and an adjustable parameter determined by optimizing the fit to the data you are assessing. GCMs mainly feature the former, so the agreement they exhibit with the data is highly significant. If solar cycle 24 is weak, then we have to look at how all the forcers change. CO2 forcing is among the best constrained, so it will likely change the least.

    Comment by Ray Ladbury — 14 Feb 2008 @ 7:11 PM

  137. #126 Thanks again for your help, Ray.
    I do understand the day/night range; we are talking about shifting the whole shebang hotter. The day/night and winter/summer ranges have had time to settle and form faults over thousands of years.
    My understanding of the interior of the earth is that it’s heated through radioactive decay and leftover heat from the accretion that formed the earth; this should give an even heat that is slowly cooling and shrinking.
    I found out that thermal stress was the process that formed the plates in the first place; this must still be at play today.
    I could not find an explanation for seismic activity, or for what produces the stress that builds up and is released in a quake. Can you point me in the direction of an explanation?
    I have always thought that these seismic waves were a result of thermal stress and a struggle between different layers of rock with different thermal expansion rates; is this wrong?
    I found a recent article that claims the probability of earthquakes is significantly lower in areas of higher crust temperature.
    Earth’s temperature linked to earthquakes
    http://www.physorg.com/news121524857.html

    Could a cooler crust contain more stress, as its temperature has shifted more from its original temperature?
    I had a look at the way the plates are moving, and it looks to me that the plates on land generally are expanding and pushing together, and plates in the sea or bordering the cooling oceans seem to be shrinking or moving apart.

    Comment by Misanthropic — 15 Feb 2008 @ 4:56 AM

  138. Misanthropic (137) — Any good textbook on geology will explain the situation. Your questions are rather remote from the purposes of a climatology web site.

    Thanks for refraining here.

    Comment by David B. Benson — 15 Feb 2008 @ 12:59 PM

  139. Mis, your postings are full of statements of belief, what “should” or “must” be true — you should, in fact you must, look these things up for yourself.

    When you come here, proclaim your belief, and ask for support, all we can do is help you learn how to check what you believe by looking at the published science.

    Start by questioning what you are sure must be true.
    Particularly when you have no source, things like:

    “it looks to me that the plates on land generally are expanding” — well, you can look this up. I’d recommend you stay away from the “Expanding Earth” sites (there’s a religious group that believes in that); stay with mapping and geology science sites, and you can check your belief.

    Google Scholar, or the reference desk at your public library if you are near one, will help.

    Understanding this isn’t going to help you understand climate change over the scale of decades to centuries.

    Comment by Hank Roberts — 15 Feb 2008 @ 3:39 PM

  140. The most significant climate events over the past few thousand years would seem to be the ice ages and “Minimums”. Would back-testing of the most prominent IPCC models simulate these events?

    Comment by DuWayne — 16 Feb 2008 @ 12:42 AM

  141. The reference to improved synchronization in the modeling studies brought to mind the work of Stephen Lansing of the University of Arizona and Santa Fe Institute. His research in the 1980s showed that rice farmers in Bali synchronized their planting and harvest cycles to maximize the benefits of water distribution while minimizing crop losses from pests, all coordinated through the religious system of water temples and ceremonies.

    While at the UN climate conference in Bali last December, I wrote a piece in the Jakarta Post that fortuitously ran on the dramatic final day of the meetings, summarizing Lansing’s research and suggesting that decentralized, local knowledge is crucial to climate response.

    But even with a long article space I couldn’t put in everything, and one part was Lansing’s more recent thinking about how these kinds of human cultural, technological or scientific cycles draw from natural synchrony. Below is an excerpt from his article on “Complex Adaptive Systems,” Annu. Rev. Anthropol. 2003. 32:183–204, where he ties this theme directly back to climate change. My feeling is that climate modeling synchrony and distributing climate data could well be a key driver in a broader adaptive management strategy as our society struggles to reduce our climate forcing and the disruption of natural synchrony that sustains biodiversity, resilience and ecosystem services:

    In the Balinese case, global control of terrace ecology emerges as
    local actors strike a balance between two opposing constraints:
    water stress from inadequate irrigation flow and damage from rice
    pests such as rats and insects. In our computer model, the
    solution involves finding the right scale of reproductive
    synchrony, a solution that emerges from innumerable local
    interactions. This system was deliberately disrupted by
    agricultural planners during the Green Revolution in the 1970s.
    For planners unfamiliar with the notion of self-organizing
    systems, the relationship between watershed-scale synchrony, pest
    control, and irrigation management was obscure. Our simulation
    models helped to clarify the functional role of water temples,
    and, partly as a consequence, the Asian Development Bank dropped
    its opposition to the bottom-up control methods of the subaks,
    noting that “the cost of the lack of appreciation of the merits of
    the traditional regime has been high” (Lansing 1991, pp. 124–25).

    An intriguing parallel to the Balinese example has recently been
    proposed by ecologist Lisa Curran (1999). Forty years ago Borneo
    was covered with the world’s oldest and most diverse tropical
    forests. Curran observes that during the El Niño Southern
    Oscillation (ENSO), the dominant canopy timber trees
    (Dipterocarpaceae) of the lowland forests synchronize seed
    production and seedling recruitment. As in the Balinese case,
    reproductive success involves countless local level trade-offs, in
    this case between competition among seedlings versus predator
    satiation. The outcome of these trade-offs is global-scale
    synchronized reproductive cycles. But forest management policies
    have failed to take into account this vast self-organizing system
    (Curran et al. 1999). As Curran explains, “With increasing forest
    conversion and fragmentation, ENSO, the great forest regenerator,
    has become a destructive regional phenomenon, triggering droughts
    and wildfires with increasing frequency and intensity, disrupting
    dipterocarp fruiting, wildlife and rural livelihoods” (p. 2188).
    As a consequence, the lowland tropical forests of Borneo are
    threatened with imminent ecological collapse (L.M. Curran,
    personal communication).

    http://www.ic.arizona.edu/~lansing/CompAdSys.pdf

    Comment by Fred Heutte — 16 Feb 2008 @ 11:13 PM

  142. Re #140 DuWayne: google for “palaeoclimate modelling”.

    Comment by Martin Vermeer — 18 Feb 2008 @ 7:39 AM

  143. Re: 87 Bill Nicholls Says:
    1. Nowhere was the actual size of data sets specified, except for ‘very large’ and ‘Terrabytes’. I’d like to know the size of an individual data set, group of same model sets, and whole ensemble of open access sets.

    The entire AR4 collection hosted at PCMDI is approximately 30 TB of data. Individual netCDF files are no larger than 2 GB, to respect those folks stuck with 32-bit netCDF-3 libraries. Given that “data set” is a rather fuzzy term (does it mean a single netCDF file, all the 20C3M atmospheric monthly-mean netCDF files from a single model, a single realization, and so on), and given the different geographic resolutions used by the different modeling groups, among other factors, I can’t really answer your questions. (A minimal sketch of reading just part of one of these files is included at the end of this comment.)

    2. The reason for question one is that a number of people, myself included, have > 1T local storage and non-trivial amounts of compute capacity, and are already involved in climateprediction.net, about 50,000 active hosts currently. What is the possibility that this cross-model analysis could be written to run under BOINC control and distributed to many individual systems for a substantial increase in run capacity?

    As mentioned earlier, there may be restrictions imposed by non-US modeling groups regarding distribution of their data by 3rd parties. Additionally, reasonably good metrics on data accesses are needed to report back to funding agencies, to gain more funding, and distribution via 3rd parties makes that more difficult.

    3. Have you considered asking Amazon or Google to host the open part of the datasets for no charge to assist in this process?

    That would be messy, IMHO.
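
    As a footnote to the first answer above, a minimal sketch of pulling just one slice of one variable from such a file with the netCDF4 Python library, rather than reading the whole thing (the file and variable names are placeholders, not actual archive paths):

        from netCDF4 import Dataset

        # Read only the first decade of one variable from a CMIP-style netCDF
        # file. The file name and variable name below are placeholders.
        with Dataset("tas_20c3m_run1.nc") as nc:
            tas = nc.variables["tas"]          # e.g. monthly-mean surface air temperature
            first_decade = tas[:120, :, :]     # first 120 months, all latitudes/longitudes
            print(first_decade.shape)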

    Comment by Gary Strand — 18 Feb 2008 @ 11:48 PM

  144. I have difficulty understanding some of the reasoning about water vapour being a positive feedback and yet there still being an equilibrium in the future.

    I understand that higher temperatures mean higher evaporation and higher specific humidity (without higher relative humidity due to the temperature increase). Because water vapour is a greenhouse gas (an important one), this leads to more greenhouse effect and higher temperatures and higher water vapour and on and on.

    However, I don’t understand the part where it is said “until a new equilibrium is found”. Why would it stop at all? What would cause, at some point, more humidity not to increase the greenhouse effect, or a stronger greenhouse effect not to increase the temperature, or a higher temperature not to increase the humidity? How does the cycle break? How is the new equilibrium found?

    Some people have written that at some point increased evaporation won’t mean increased humidity, thanks to increased rainfall. Then higher temperatures wouldn’t mean more water vapour in the atmosphere and the cycle would break. However, rain means clouds, and other people argue that increased humidity won’t lead to more cloud cover. Furthermore, it has not happened so far with the increases we have already experienced in both temperature and humidity: there has been no increase in cloudiness.

    So, what exactly could be the cause of the supposed new-found equilibrium? If it is because of rain, will it affect the clouds? How? When?

    Thanks,
    Nylo.

    [Response: You get to a new equilibrium because there are bounding effects - principally the long wave radiation out to space goes like T^4, which means that it eventually goes up faster than the increased impact of water vapour. See here for a little more explanation. - gavin]
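
    As a rough numerical illustration of why a positive feedback that is less than one-for-one still converges (a toy sketch only; the constant feedback fraction f below is an assumed illustrative number, standing in for the damping that the T^4 response provides):

        # Toy feedback series: an initial warming dT0 from the forcing alone is
        # amplified by a constant feedback fraction f per round. For f < 1 the
        # increments shrink and the sum converges to dT0 / (1 - f); for f >= 1
        # it would run away.
        dT0 = 1.0    # assumed initial warming from the forcing alone (K)
        f = 0.5      # assumed feedback fraction (illustrative only)

        total, increment = 0.0, dT0
        for _ in range(50):
            total += increment
            increment *= f
        print(total, dT0 / (1.0 - f))    # both are close to 2.0 K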

    Comment by Nylo — 21 Feb 2008 @ 1:40 PM

  145. Thanks a lot for your response Gavin, it was very illuminating and helped me understand much better. It is not a complete equilibrium, but the increases in T become too small to be noticeable, so it can be called an equilibrium.

    However, this has raised some other doubts for me. After realising that the emitted flux goes as T^4, I looked further into it and found that, in an earlier post of yours on this website,

    http://www.realclimate.org/index.php/archives/2007/04/learning-from-a-simple-model/

    when talking about Earth’s emissivity you wrote: “If you want to put some vaguely realistic numbers to it, then with S=240 W/m2 and \lambda=0.769, you get a ground temperature of 288 K – roughly corresponding to Earth. So far, so good”.

    You were commenting on the formula that can be seen in this link (sorry, I cannot post images):
    http://www.realclimate.org/latexrender/pictures/ee4d2bead06b402bfc7a5da07763b86f_6.19841pt.png

    The lambda value can, I guess, be measured experimentally, but to me it seems that the S=240 W/m2 value is simply a value chosen so that it gives the well-known Earth average temperature of 288 K. Please correct me if I am wrong. So S would not be measured data, but the result of the formula and the other data, which can be measured.

    Why do I think this?

    Because it is incorrect to treat the Earth as a blackbody at a single temperature equal to the Earth’s average temperature, since that average is the mean of a widely varying range of temperatures. Let’s put it this way: a blackbody that is at 268 K half of the time and 308 K the other half does not emit, on average, the same amount of watts as a blackbody at the 288 K average temperature. It emits quite a bit more; in fact, it emits like a blackbody at a constant 290 K. This is because of the T^4 dependence, which means that the higher temperatures carry much more weight in the average of the emitted flux.
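
    A quick numerical check of that example (a sketch; sigma is the Stefan-Boltzmann constant, and the temperatures are the illustrative ones above):

        # Flux from a body spending half its time at 268 K and half at 308 K,
        # versus a body held constant at the 288 K average of the two.
        sigma = 5.67e-8                                  # Stefan-Boltzmann constant, W/m2/K^4
        flux_varying = 0.5 * sigma * (268**4 + 308**4)   # about 401 W/m2
        flux_constant = sigma * 288**4                   # about 390 W/m2
        equivalent_T = (flux_varying / sigma) ** 0.25    # about 290 K
        print(flux_varying, flux_constant, equivalent_T)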

    What does this mean? It means that if, as I guess, the S=240 W/m2 value is derived from the blackbody formula using a constant temperature equal to the Earth’s average temperature, that is wrong, because the Earth’s temperature is far from constant and varies a lot in both time and space. Simply averaging it is wrong: you should integrate the blackbody formula over time and space, given the Earth’s temperature variability in both, to get the real value of the emitted flux.

    I did some rough calculations and would expect the result to be at least about 2-3% higher than those 240 W/m2. So it is very important for me to know whether the 240 W/m2 value has actually been measured, OR whether it has been obtained from the rest of the data (Tav and lambda) so that the whole thing stays consistent. Three per cent of 240 W/m2 is a lot of watts per square metre; bigger than what is attributed to the CO2 increase, for example. It would force a recalculation of much of the Earth’s energy balance.

    Thanks a lot.

    PS: Sorry if my English is not very good.

    Comment by Nylo — 25 Feb 2008 @ 11:11 AM

  146. My apologies. I got it all wrong by confusing S (solar irradiance) and G (the Earth’s emitted flux) in the formula. S can be measured, no doubt. Therefore either there is a mistake in the calculation of lambda, or the Earth emits only a fraction of what a true blackbody at its average temperature would. It is most likely the second.

    Thanks.

    Comment by Nylo — 25 Feb 2008 @ 12:30 PM

  147. Gavin wrote:
    “The other way to reduce download times is to make sure that you only download what is wanted. If you only want a time series of global mean temperatures, you shouldn’t need to download the two-dimensional field and create your own averages. Thus for many purposes, automatic global, zonal-mean or vertical averaging would have saved an enormous amount of time.”

    “A better model would be for the archive to host the analysis scripts as well so that they could be accessed as easily as the data. There are of course issues of citation with such an idea, but it needn’t be insuperable. In a similar way, how many times did different people calculate the NAO or Niño 3.4 indices in the models? Having some organised user-generated content could have saved a lot of time there.”

    A lot of this was discussed by folks at PCMDI and elsewhere when the archive was being planned. There are practical issues here:
    1. As you know, calculating even something as simple as a global average can get involved. Do you include land/ocean or any other masking (say, where observations are available)? It gets more complex if you get into anomalies: what period is your climatology based on, and so on. Indices such as the SOI or NAO (where there are multiple methods of computing the index) are more complicated still, with many “choices” to be made.
    My point is that it is not simply a matter of writing one script, letting it run, and serving up the data in a way that every analyst would be happy with.

    2. As for serving up analysis scripts, surely you know there are issues with distributing software: which language (Fortran/C/C++/R/S/Python/Ferret/GrADS/GMT)? Which version of your code? Do you support it? Can you port it to Windows Vista, AIX x.y, or Ubuntu? And, more fundamentally, can you guarantee that there is no security threat or virus/trojan horse if I download it and run it?

    3. Another fundamental cultural issue is that many scientists would rather have control over the way the analysis is done than download some “unpublished” (in the peer-review sense) analysis. Heaven forbid you had to retract a paper because you used someone else’s data or code and it had a flaw!

    An idea that does have legs (but the technology is not quite there yet!) is to let users browse and slice the data and perform some simple analyses themselves (on the server side) before downloading just the data they need. It has not yet been done because (among other reasons) it requires user authentication and a way of knowing, and controlling, what load such analyses would put on the data server. This is being addressed by many folks and does not seem very far away, but it will still suffer from point 3 above.

    [Response: Hi Krishna, Thanks for the comments. I think I can add a few lines. Firstly, there is an unlimited number of possible analyses. Just because you could do the global mean in a few different ways, that shouldn't prevent you from doing one specific way by default. All you are doing is adding something, but if someone wants to do it differently they can - nothing has been prevented. Storing scripts (in whatever language) is similarly additive - you don't need to use them. And ask yourself whether it is more likely that a problem will be found more quickly if a script is in an archive or not? People are using the data from many sources with the expectation that it is correct, but with the knowledge that it might be flawed (MSU temperatures, ARGO floats etc.) - that is not unique to model analysis and does not undermine anything. I think if such a system were set up, it would function much better than some may expect. - gavin]
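
    For concreteness, a minimal sketch of one such default, an area-weighted (cosine-of-latitude) global mean of a latitude/longitude field; the arrays are placeholders, and the masking and climatology choices discussed above are deliberately left out:

        import numpy as np

        # Area-weighted global mean of a 2-D field on a regular lat/lon grid.
        # "field" is a stand-in for a model field; no land/ocean masking or
        # anomaly baseline is applied.
        lats = np.linspace(-89.5, 89.5, 180)      # grid-cell centre latitudes
        field = np.random.rand(180, 360)          # placeholder data, shape (lat, lon)
        weights = np.cos(np.deg2rad(lats))
        zonal_mean = field.mean(axis=1)           # average over longitude first
        global_mean = np.average(zonal_mean, weights=weights)
        print(global_mean)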

    Comment by Krishna AchutaRao — 3 Mar 2008 @ 4:40 AM

  148. Concerning the issue of variance in the weather data used as input to climate models, and the problem of lack of correlation between them, I am wondering whether consideration has been given to constructing fixed weather-data reference models similar to what geographers use in constructing maps. Any map starts with a mathematical datum construct called an ellipsoid, a smoothed representation of the shape of the earth. All the coordinates of physical objects are projected onto this ellipsoid, and this is then used as the basis for the various projections (Mercator, conic, etc.) that result in a flat map. There are dozens of different standard ellipsoids (NAD27, NAD83…), perhaps hundreds. Each has different strengths and weaknesses, which are known. Some ellipsoids or datums are better depending on the final use or geographical scope of the final map. Once a final map projection is produced, the underlying datum can be mathematically switched and the different results compared. I am wondering whether some system of standard weather datums might not be produced that could be easily intercorrelated and referenced when running models?

    Thank you,
    Luis

    Comment by Luis Watts — 3 Mar 2008 @ 11:34 AM

  149. Re 147

    1. As you know, calculating even a simple thing such as global average can get involved.

    As a potential data consumer, I’d point out that some useful reductions are not problematic – regional subsets for example would be assumption-free.

    3. Another fundamental cultural issue is that many scientists would rather have control over the way the analysis is done than download some “unpublished” (in the peer-review sense) analysis.

    Don’t underestimate the appetite for this data among nonscientists. That will create its own set of cultural issues (e.g. emergence of the model equivalent of surfacestations.org), but on the whole should be positive.

    Tom

    Comment by Tom Fiddaman — 5 Mar 2008 @ 7:28 PM

  150. We have been asked to join a review panel consultation for the development of a specification for the assessment of the life cycle greenhouse gas emissions of goods and services in the UK. There are some points I wish to raise, which may be helped by Global Climate Models.

    Two examples:

    1. When 70 tonnes of CO2 are released into the atmosphere in building a new flat, there is a convention that this has a “similar” effect on the atmosphere as releasing one tonne per year for 70 years, the assumed lifetime of the flat. Is this sensible?

    2. When assessing the Global Warming Potential of beef, should the methane in the life-cycle assessment be measured over 20, 100 or 500 years?

    A more general question that might help in understanding these issues might be: “If one gigatonne of CO2 is released into the atmosphere now, how much CO2 must be extracted in 20 years time to counteract the effect of the initial release.”

    I suspect this question is not precise enough. Can anybody help with better ones?

    Comment by Geoff Beacon — 6 Mar 2008 @ 11:01 AM

  151. Geoff Beacon (150) — I am an amateur here, but enough of one to provide fairly good answers to your three questions. (I hope others will pitch in as well.)

    1. Yes. CO2 is slow and persistent.

    2. Methane has a short lifetime in the atmosphere; about 20 years should do it.

    3. All the fossil-fuel-derived CO2 released into the atmosphere needs to be removed, the sooner the better. That said, 20 years ought to be soon enough, although we don’t actually know all the damage done in that short time.

    Comment by David B. Benson — 6 Mar 2008 @ 5:41 PM

  152. Geoff, definitely don’t rely on us amateurs for final answers.

    Note, for example, that the limited lifetime of methane in the atmosphere doesn’t mean it simply disappears; it oxidizes, and the result is

    CH4 + 2O2 -> CO2 + 2H2O
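
    As simple mass bookkeeping for that reaction (a sketch using standard molar masses):

        # Each tonne of CH4 (molar mass ~16 g/mol) oxidizes to roughly 2.75
        # tonnes of CO2 (molar mass ~44 g/mol).
        M_CH4, M_CO2 = 16.04, 44.01
        print(round(M_CO2 / M_CH4, 2), "tonnes of CO2 per tonne of CH4")   # ~2.74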

    Comment by Hank Roberts — 6 Mar 2008 @ 6:54 PM

  153. The answer to my question (150) might be changed by the size of feedbacks in the climate system. There have been several reports of positive feedbacks: failing carbon sinks, loss of soil carbon, methane from wetlands/tundra, more forest fires, the drying of the Amazon, the sea ice albedo effect, and so on. There may also be anthropogenic feedbacks: turning up the air-conditioning in response to a warmer climate … but also turning down the heating.

    Are these feedbacks too small to change the answer? If positive feedbacks exceed negative feedbacks, what extra CO2 must be removed at the end of the chosen period to reverse the net effect of these feedbacks?

    Comment by Geoff Beacon — 7 Mar 2008 @ 2:55 AM

  154. Geoff Beacon (153) — The climate is most unlikely to be a reversible process. Surely there is a good deal of hysteresis:

    http://en.wikipedia.org/wiki/Hysteresis

    The best simple answer I can give is to permanently sequester, as soon as possible, at least 350 GtC, and to stop adding (or else immediately sequester) the roughly 8.5 GtC currently being added each year to the active carbon cycle. Then continual monitoring and scientific advances will discover what additional remediation is required.

    Comment by David B. Benson — 7 Mar 2008 @ 1:22 PM

  155. Re 150

    I’d argue that 70 tonnes today is significantly worse than 70 tonnes over 70 years. The atmospheric half-life of carbon is long, but not infinite – today it might be 120 years; with feedbacks and sink saturation that might double, but 70 years would still be a significant fraction of it.

    You also have to consider the economic effect – the 70 tons over 70 years effectively delays the damage, and even if you think the discount rate on welfare should be zero, there’s still significant benefit from that due to time value of money.

    Either way, the analysis has to extend to temperature effects (or better yet, impacts), not just atmospheric GHGs. “If one gigatonne of CO2 is released into the atmosphere now, how much CO2 must be extracted in 20 years time to counteract the effect of the initial release.” is a good question, as long as you mean achieving an equivalent welfare trajectory, not just an equivalent CO2 trajectory.

    I played around with a simple integrated model to see what would happen. The results aren’t quite what I expected, but I think they make sense. Essentially, I tried injecting a 100 Gt pulse of carbon into the atmosphere over 1 or 70 years, starting in 2000 and running the model to 2200, with and without a modest temperature feedback to the carbon cycle.

    Without temperature feedback to the carbon cycle:

    For CO2 concentration, the 70yr pulse actually results in a higher concentration any time after about 2060, simply because the earth hasn’t had time to squirrel away the emissions from later years. However, the difference is not great. Either way, 75% of the stuff is still around in 2200 (not what you’d expect from average lifetimes, but I think reasonable due to diminishing marginal uptake).

    For temperature, the picture is different. The 1yr pulse drives temperature about .16C above the no-pulse trajectory, with the effect peaking in 2040, and falling to about .06C by 2200. The 70yr pulse takes much longer to reach peak effect, a maximum of .11C in 2080, with about the same effect in 2200.

    In welfare terms (using fair discounting), the 1yr pulse is 57% worse (greater loss) than the 70yr pulse.

    With temperature feedback (modest – enough to raise BAU CO2 by 20% in 2100):

    The 1yr pulse triggers additional feedback emissions, such that in 2200 effectively 100% of the pulse is still around. The 70yr pulse still takes 60 yrs to reach the same level, but things end up in roughly a tie.

    The temperature increase from the 1yr pulse is somewhat greater, peaking a little later at .17C. The 70yr trajectory never quite catches up.

    The welfare effect is now 67% worse.

    As a wild guess, this suggests that you could use a discount rate of 1 to 2% per year to convert between emissions today vs. emissions distributed over the future, without worrying too much about the feedbacks. That’s barely better than a wild guess and all the numbers above should be taken with a huge grain of salt. It ought to be possible to narrow things down without too much trouble.
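
    A minimal sketch of the general shape of such a pulse experiment (not the model actually used above; the decay time, sensitivity and lag are illustrative placeholders, and there is no carbon-cycle feedback here):

        import numpy as np

        # Toy pulse experiment: excess atmospheric carbon decays with a single
        # e-folding time, and temperature relaxes toward an equilibrium set by a
        # crude sensitivity. All numbers are placeholders, not fitted values.
        YEARS = np.arange(2000, 2201)
        TAU_C = 120.0      # assumed e-folding time of excess carbon (yr)
        SENS = 0.002       # assumed equilibrium warming per GtC of excess carbon (K/GtC)
        TAU_T = 30.0       # assumed thermal adjustment time (yr)

        def run(pulse_gtc=100.0, pulse_years=1):
            carbon, temp, temps = 0.0, 0.0, []
            for yr in YEARS:
                emit = pulse_gtc / pulse_years if yr < 2000 + pulse_years else 0.0
                carbon += emit - carbon / TAU_C           # uptake by sinks
                temp += (SENS * carbon - temp) / TAU_T    # lagged temperature response
                temps.append(temp)
            return np.array(temps)

        t1, t70 = run(pulse_years=1), run(pulse_years=70)
        print("peak warming, 1-yr pulse :", round(t1.max(), 3), "K")
        print("peak warming, 70-yr pulse:", round(t70.max(), 3), "K")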

    Tom

    Comment by Tom Fiddaman — 7 Mar 2008 @ 1:31 PM

  156. Tom (155). That’s very helpful. One immediate question arises.

    I understand your point that the extended release delays the damage. I am happy, for the present, to assume a zero discount rate on welfare. But I would like to understand more about the “time value of money”. Is this a discount rate that has the rate of technological change as a component?

    Let me stick to the example of building homes. Advances in construction will allow future homes to be built that have less embodied CO2 at similar cost. Consider a house of brick construction that is set back for five years so that it can be made from hemp and lime. Let us assume that this reduces the embodied CO2 from 70 tonnes to -10 tonnes … I have seen such claims. This would mean a saving of 80 tonnes of CO2 by delaying five years.

    Discount rates can be a measure of the benefits from delaying expenditure. Would the “time value of money” account for the benefit from the delay, the “saved CO2”? If not, where can the “saved CO2” get into the discussion?

    Comment by Geoff Beacon — 7 Mar 2008 @ 6:01 PM

  157. Luis writes:

    [[Concerning the issue of variance in weather data that is used for input into climate model and the problem of lack of correlation between them I am wondering if consideration has been given to the construction of fixed weather data models similar to what geographers use in constructing maps.]]

    Weather data is not used for input into climate models.

    Comment by Barton Paul Levenson — 8 Mar 2008 @ 7:27 AM

  158. Here is an excerpt of a note I sent to my Euro MP last year:

    I am concerned that the European Union’s assessment of climate change may be compromised by an over-emphasis on “official science” and consequently on the current batch of computer models. Loss of Arctic sea ice is not the worst consequence of climate change, but it is an indicator of the amount by which “official science” underestimates reality. Predictions of the year in which the Arctic loses its summer sea ice, “Arctic Ice Zero”, are tests of the accuracy of current predictive models:

    BBC: 15 May, 2002 … Arctic Ice Zero in 2080
    http://news.bbc.co.uk/2/hi/science/nature/1989707.stm

    BBC: 28 September 2005 … Arctic Ice Zero in 2060
    http://news.bbc.co.uk/2/hi/science/nature/4290340.stm

    BBC: 12 December 2006, … Arctic Ice Zero in 2040
    http://news.bbc.co.uk/2/hi/science/nature/6171053.stm

    Now some are saying … Arctic Ice Zero in 2020
    http://www.reuters.com/article/environmentNews/idUSN0122477020070501

    But this year the IPCC predicted Arctic Ice Zero in 2050.
    Does this confirm that IPCC science is “two years out-of-date”?

    Does the European Union have mechanisms for updating “official science”?

    Yesterday, 12 December 2007, there was a report on the BBC website, “Arctic summers ice-free ‘by 2013’” (http://news.bbc.co.uk/2/hi/science/nature/7139797.stm).
    So does the European Union have proper mechanisms for updating “official science”? Does the UK Government? Does anybody else?

    Please tell me I’m panicking!

    Is this a meta-analysis?

    Comment by Geoff Beacon — 11 Mar 2008 @ 11:54 AM

  159. Physical climatologist Eric DeWeaver comments on climate models in

    http://www.sciencedaily.com/releases/2008/03/080311163631.htm

    entitled Arctic Climate Models Playing Key Role In Polar Bear Decision.

    Comment by David B. Benson — 12 Mar 2008 @ 1:22 PM

  160. David (159), Thank you.

    I have skimmed the papers associated with the item you report. The main one is “Uncertainty in Climate Model Projections of Arctic Sea Ice Decline: An Evaluation Relevant to Polar Bears”. It does consider feedbacks associated with the albedo effect of sea ice and with cloud cover in the Arctic, but not the other possible feedbacks. It is concerned with Arctic sea ice extent and seems to take the other aspects of the general models as given.

    I was wondering if the predicted extent of sea ice could act as a test of these models. Have the models missed important temperature-related feedbacks that are only just beginning to have their effects because the Earth’s temperature is only just beginning to rise?

    Have the models properly taken account of the possible positive feedbacks: failing carbon sinks, loss of soil carbon, methane from wetlands/tundra, more forest fires, the drying of the Amazon, the sea ice albedo effect, and so on?

    I was surprised to hear that the models used for the recent IPCC reports did not include feedbacks for methane from melting tundra. This feedback may or may not be important but are other feedbacks missing?

    If some models miss out some feedbacks, how is the whole ensemble changed?

    Are the “unknown unknowns” likely to be positive or negative ones?

    Comment by Geoff Beacon — 12 Mar 2008 @ 6:28 PM

  161. Geoff Beacon (160) — I don’t know the answers to your questions. The Wikipedia page

    http://en.wikipedia.org/wiki/Climate_model

    seems a good place for you to start, with many links to explore, including the one to the Wikipedia page on global climate models and then the links that page contains.

    Comment by David B. Benson — 13 Mar 2008 @ 12:38 PM

  162. David (161), thanks again.

    I can’t see myself becoming a real climate scientist with a credible message for politicians, press or public. But I would like to pass one on.

    I heard on BBC News 24 today that glaciers are shrinking twice as fast as previously thought (the 9th of 10 items and lasting 9 seconds). Is this increased rate of melting a surprise? Does it indicate that “we’ve underestimated sensitivity or what carbon cycle feedbacks could do”?

    The best we get from politicians is a grudging acceptance of IPCC AR4. Most journalists are worse, especially the BBC. Stephen Sackur interviewed Al Gore and Rajendra Pachauri a few months ago. Instead of challenging them with James Lovelock or James Hansen, he challenged them with Bjorn Lomborg, an economist who uses a mid-range prediction from IPCC AR4 of 2.6 degC global temperature rise by 2100 to argue that we can afford to wait.

    If climate scientists have underestimated climate change then please let the rest of us know. So we tell the economists, politicians, journalists and government officials the bad news.

    Comment by Geoff Beacon — 16 Mar 2008 @ 1:04 PM

  163. Geoff Beacon (162) — I’m but an amateur regarding the very difficult subject of climatology, though by now a moderately knowledgeable one, based upon a lifetime of amateur interest in paleogeology. That said, in my opinion, the IPCC has indeed underestimated climate change. Regarding the IPCC 2001 TAR, the commentary linked below points out that at that time they underestimated the temperature trend:

    http://www.skepticalscience.com/Comparing-IPCC-projections-to-observations.htm

    The IPCC AR4 is linked on the sidebar. They explicitly state that they have left out estimates for glacier and icecap melting, basically because so little is known. The projections of Arctic sea ice melting are off by decades (although I haven’t read that section of AR4, just commentary). Furthermore, AR4 states outright that the climate models do not do well in predicting precipitation patterns.

    However, climatology is a rapidly developing subject: AR4 was obsolescent even before it was finished. Some climatologists are estimating larger climate changes than the (necessarily conservative) consensus position of the IPCC.

    I agree with James Hansen that not only must we stop adding carbon to the active carbon cycle, but lots of carbon needs to be put back underground, securely and permanently. He suggests about 300-350 ppm, I believe. I suggest 315 ppm, largely because this was the concentration in the 1950s, not that I think it anything more than an important interim goal.

    Finally, I’ll direct you to

    http://climateprogress.org/

    also linked on the sidebar, as a sensible blog to follow regarding the ‘bad news’ and ‘what to do about it’.

    Comment by David B. Benson — 18 Mar 2008 @ 2:05 PM

  164. David

    Thank you. That has led me to find web pages I didn’t know about, particularly James Hansen’s page at Columbia. http://www.columbia.edu/~jeh1/

    I don’t know how I missed it before. It answers several questions – but perhaps not all of them yet.

    Comment by Geoff Beacon — 19 Mar 2008 @ 10:26 AM

  165. A 101 question from a non-scientist.

    From the AR4SPM

    Eleven of the last twelve years (1995 -2006) rank among the 12 warmest years in the instrumental record of global surface temperature (since 1850). The updated 100-year linear trend (1906–2005) of 0.74 [0.56 to 0.92]°C is therefore larger than the corresponding trend for 1901-2000 given in the TAR of 0.6 [0.4 to 0.8]°C.

    (Chapter) 3.2 is referenced, and while various temperature records are referred to there, these figures appear to be from the HadCRU series.

    The warming trend for 2001 – 2006 seems to be less pronounced than the cooling period 1901 – 1905.

    As the cooling period 1901 – 1905 was omitted in the AR4 series, I wonder about the use of the word “therefore”. It has been suggested to me that the increased trend could be an artefact of the 5 year shift between reports.

    That is, if the 1901 – 1905 cooling is removed, might that not “therefore” account (in part or whole) for the increased trend, rather than the “the 12 warmest years in the instrumental record”? (I’m no statistician and am unable to resolve this myself)

    My questions are;

    1) Are the figures in the SPM derived from HadCRU alone, or from a combination of that and other series (NCDC, GISS)?

    2) Is the increased trend in any way an artefact of changing the period by 5 years (and if so, is the use of “therefore” in the SPM statement misleading)?

    Though I may not be able to follow it myself, my co-interlocutor may understand a purely statistical response.

    Comment by barry — 26 Jun 2008 @ 3:08 AM

  166. Barry, as the warming from 1995-2006 was part of a 30-year warming trend, I rather doubt that the start and end points affect the qualitative conclusion. Also, although the different temperature records use slightly different algorithms and so have slightly different values for a given year, the trends they show are consistent. Therefore, while magnitudes may vary slightly, it is doubtful that the important conclusion–we’re getting warmer–would be altered.
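
    For anyone who wants to check the window question directly, a sketch using any annual global-mean anomaly series (the file name is a placeholder; the point is just to compare least-squares slopes over the two report periods):

        import numpy as np

        # Two columns expected: year, annual global-mean anomaly (degC).
        # "annual_anomalies.txt" is a placeholder file name.
        years, anom = np.loadtxt("annual_anomalies.txt", unpack=True)

        def trend_per_century(y0, y1):
            mask = (years >= y0) & (years <= y1)
            slope = np.polyfit(years[mask], anom[mask], 1)[0]   # degC per year
            return 100.0 * slope

        print("1901-2000 trend:", round(trend_per_century(1901, 2000), 2), "degC/century")
        print("1906-2005 trend:", round(trend_per_century(1906, 2005), 2), "degC/century")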

    Comment by Ray Ladbury — 26 Jun 2008 @ 9:00 AM

  167. something that the oil companies don’t want you to know…. ask the moderator for the website address. This is not a sales pitch, but an opportunity & reality.

    Comment by Alvin Claiborne — 29 Mar 2009 @ 4:31 PM

  168. Could someone please help me? I am trying to download fairly high-resolution versions of GCMs showing future precipitation in South America.

    I’d like to contrast several different models showing expected changes in precip for the latter part of this century, under the IPCC high-GHG scenario.

    Please let me know where I can find this (laurancew@si.edu). Many thanks.

    [Response: PCMDI IPCC AR4 archive. There is an automatic registration process, but everything you need should be there. - gavin]

    Comment by William Laurance — 17 May 2009 @ 4:49 AM

  169. Re #140: yes, they would be useful.

    And they are being done.

    And the results provide the impetus for appropriate action informed by the climate models.

    Comment by Mark — 17 May 2009 @ 8:29 AM

Sorry, the comment form is closed at this time.
