
RealClimate

Climate science from climate scientists...


The CRU hack: Context

23 Nov 2009 by Gavin

This is a continuation of the last thread, which is getting a little unwieldy. The emails cover a 13-year period in which many things happened, and very few people are up to speed on some of the long-buried issues. So to save some time, I’ve pulled a few bits out of the comment thread that shed light on the context missing from much of the discussion of the various emails.

  • Trenberth: You need to read his recent paper on quantifying the current changes in the Earth’s energy budget to realise why he is concerned about our inability currently to track small year-to-year variations in the radiative fluxes.
  • Wigley: The concern with sea surface temperatures in the 1940s stems from the paper by Thompson et al (2007) which identified a spurious discontinuity in ocean temperatures. The impact of this has not yet been fully corrected for in the HadSST data set, but people still want to assess what impact it might have on any work that used the original data.
  • Climate Research and peer-review: You should read about the issues from the editors (Claire Goodess, Hans von Storch) who resigned because of a breakdown of the peer review process at that journal, that came to light with the particularly egregious (and well-publicised) paper by Soon and Baliunas (2003). The publisher’s assessment is here.

Update: Pulling out some of the common points being raised in the comments.

  • HARRY_read_me.txt. This is a four-year work log of Ian (Harry) Harris, who was working to upgrade the documentation, metadata and databases associated with the legacy CRU TS 2.1 product, which is not the same as the HadCRUT data (see Mitchell and Jones, 2003 for details). The CRU TS 3.0 is available now (via ClimateExplorer for instance), and so presumably the database problems got fixed. Anyone who has ever worked on constructing a database from dozens of individual, sometimes contradictory and inconsistently formatted datasets will share his evident frustration with how tedious that can be.
  • “Redefine the peer-reviewed literature!”. Nobody actually gets to do that, and the two papers discussed in that comment – McKitrick and Michaels (2004) and Kalnay and Cai (2003) – were both cited and discussed in Chapter 3 of the IPCC AR4 report. As an aside, neither has stood the test of time.
  • “Declines” in the MXD record. This decline was written up in Nature in 1998, where the authors suggested not using the post-1960 data. Their actual programs (in IDL script), unsurprisingly, warn against using the post-1960 data. Added: Note that the ‘hide the decline’ comment was made in 1999 – 10 years ago – and has no connection whatsoever to more recent instrumental records.
  • CRU data accessibility. From the date of the first FOI request to CRU (in 2007), it has been made abundantly clear that the main impediment to releasing the whole CRU archive is the small percentage of it that was given to CRU on the understanding it wouldn’t be passed on to third parties. Those restrictions are in place because of the originating organisations (the various National Met. Services around the world) and are not CRU’s to break. As of Nov 13, the umpteenth FOI request for the same data met with exactly the same response. This is an unfortunate situation, and pressure should be brought to bear on the National Met Services to release CRU from that obligation. It is not, however, the fault of CRU. The vast majority of the data in the HadCRU records is publicly available from GHCN (v2.mean.Z); a minimal sketch of parsing that format appears just after this list.
  • Suggestions that FOI-related material be deleted … are ill-advised even if not carried out. What is and is not responsive and deliverable to an FOI request is however a subject that it is very appropriate to discuss.
  • Fudge factors (update). IDL code in some of the attached files calculates and applies an artificial ‘fudge factor’ to the MXD proxies to eliminate the ‘divergence pattern’. This was done for a set of experiments reported in this submitted 2004 draft by Osborn and colleagues, which was never published. Section 4.3 explains the rationale very clearly: the aim was to test the sensitivity of the calibration of the MXD proxies should the divergence end up being anthropogenic. It has nothing to do with any temperature record, has not been used in any published reconstruction and is not the source of any hockey stick blade anywhere.
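
For concreteness, a minimal sketch of reading that format is below. It assumes the fixed-width v2.mean layout as commonly documented (11-character station ID, a duplicate digit, a 4-digit year, then twelve 5-character monthly values in tenths of °C, with -9999 for missing); check the README distributed with the file before relying on it. The record shown is invented for illustration.

    # Minimal sketch of parsing a GHCN v2.mean-style fixed-width record (Python).
    # Assumed layout: 11-char station id, 1 duplicate digit, 4-digit year,
    # then twelve 5-char monthly means in tenths of a degree C (-9999 = missing).
    def parse_v2_mean_line(line):
        station = line[0:11]
        duplicate = line[11]
        year = int(line[12:16])
        temps = []
        for i in range(12):
            value = int(line[16 + 5 * i : 21 + 5 * i])
            temps.append(None if value == -9999 else value / 10.0)
        return station, duplicate, year, temps

    # Invented record, for illustration only (not real GHCN data):
    demo = "10160355000" + "0" + "1990" + "".join(
        "%5d" % t for t in [112, 121, 143, 168, 205, 244, 271, 273, 243, 201, 157, 121])
    print(parse_v2_mean_line(demo))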

Further update: This comment from Halldór Björnsson of the Icelandic Met. Service goes right to the heart of the accessibility issue:

Re: CRU data accessibility.

National Meteorological Services (NMSs) have different rules on data exchange. The World Meteorological Organization (WMO) organizes the exchange of “basic data”, i.e. data that are needed for weather forecasts. For details on these see WMO resolution number 40 (see http://bit.ly/8jOjX1).

This document acknowledges that WMO member states can place restrictions on the dissemination of data to third parties “for reasons such as national laws or costs of production”. These restrictions are only supposed to apply to commercial use; the research and education community is supposed to have free access to all the data.

Now, for researchers this sounds open and fine. In practice it hasn’t proved to be so.

Most NMSs can also distribute all sorts of data that are classified as “additional data and products”, on which restrictions can be placed. These special data and products can range from regular weather data from a specific station to maps of rain intensity based on satellite and radar data. Many nations do place restrictions on such data (see the link for additional data on the above WMO-40 webpage for details).

The reasons for restricting access are often commercial: NMSs are often required by law to have substantial income from commercial sources. In other cases it can be for national security reasons, but in many cases (in my experience) the reason simply seems to be “because we can”.

What has this got to do with CRU? The data that CRU needs for their database comes from entities that restrict access to much of their data. And even better, since the UK has submitted an exception for additional data, some nations that otherwise would provide data without question will not provide data to the UK. I know this from experience, since my nation (Iceland) did send in such conditions, and for years I had problems getting certain data from the US.

The ideal that all data should be free and open is unfortunately not adhered to by a large portion of the meteorological community. Probably only a small portion of the CRU data is “locked”, but the end effect is that all their data becomes closed. It is not their fault, and I am sure that they dislike these restrictions as much as any other researcher who has tried to get access to all data from stations in region X in country Y.

These restrictions end up wasting resources and hurting everyone. The research community (CRU included) and the public are the victims. If you don’t like it, write to your NMSs and urge them to open all their data.

I can update this further if there is demand. Please let me know in the comments, which, as always, should be substantive, non-insulting and on topic.

Comments continue here.

Filed Under: Climate Science

1074 Responses to "The CRU hack: Context"

  1. Gnrnr says

    26 Nov 2009 at 5:08 PM

    #659

    “In your field, long-term data series don’t confront instrument calibration issues as technology changes over century timescales?

    I don’t believe this, either. On the other hand, I will believe you if you tell me that in your field, field conditions (which gavin mentions) aren’t a problem, because maybe you’re just sticking a probe up some person’s ass, or are instrumenting some laboratory, rather than working in the field.”

    You mention “which gavin mentions” above. When did he reply to me? I also note that you resort to attacking me and deriding me for asking a question.

    I have those issues where a certain type of instrument is no longer available and data needs to be compared from one time period to another and corrections or adjustments are made, but guess what, we DO NOT alter the original readings in any way, shape or form. We account for it in the analysis and comment on it clearly. The raw data is the raw data.

    Either the instrument is calibrated at the time of recording or it isn’t. Provided I’m using a calibrated probe, it wouldn’t matter if I stuck it up someone’s arse this year or 20 years ago. If the body temp is the same, the probe reading should be the same.

    No point in changing the data after the fact if you didn’t have it calibrated at the time of reading it. If I come back along in 10 years’ time and then try to calibrate, how can it be possible to know how much each reading should be precisely adjusted by?

    I would expect that data being used for science on this scale is being taken from calibrated equipment. Hence the reading is the reading and does not require readjustment later. Also I would expect that data being released as “RAW data” would not have any corrections added. Hence my original query.

    I understand that with atmospheric temperature readings the sites of some gauges have been changed over the years, and some adjustment of the readings may be considered on that basis. I presume that you therefore also remove the effects of local heatsinks such as roads and buildings from the data (hey, we are calibrating, aren’t we?).

  2. Hank Roberts says

    26 Nov 2009 at 5:16 PM

    John, don’t assume everything about climate change relies on “the hockey stick” — whichever one of many different climate charts you’re looking at.

    Lots of things get called “hockey stick” (it’s always helpful if you provide a link or name of the source you’re using so others can look at it).

    Have a look at the various different charts here, for example — this speaks to the question you’re asking about uncertainty:

    http://www.image.ucar.edu/~nychka/manuscripts/hockey.pdf

    If you read each page — a big image and a few lines of text — and get up to the page that says:

    “Separate lines of evidence
    Reducing uncertainty depends on the convergence from distinct sources of information”

    you’ll have a much clearer idea what the gray area around the black line means in those various charts, and I think will understand why this is useful information.

    If you haven’t taken Statistics 101, it’s important to realize that there’s a good semester’s classwork, at least, assumed on those few pages. If you haven’t done it, you won’t believe a lot of it. Ask more questions and people can point you to more information appropriate to where you’re starting from, if you want to say.

    Also, if you’re coming here after reading something somewhere else — it always helps to say where you came from and what you read there and if you’re considering it reliable.

  3. Bob says

    26 Nov 2009 at 5:36 PM

    Your response to Simon Abingdon (742). Waiting ten years might be preferable to re-ordering the world economy on less than ideal science.

    Bob

  4. Timothy Chase says

    26 Nov 2009 at 6:04 PM

    Simon Abingdon wrote in 728:

    When I was a computer programmer (of commercial – not scientific – applications) in the 1960s it was gradually realised that use of the “GO TO” instruction led to “spaghetti” code that by dint of its impenetrability was virtually impossible to debug thoroughly. The correct working of such programs could not be assured other than in response to necessarily limited sets of specimen data. See “Go To Statement Considered Harmful” by Edsger W. Dijkstra (Communications of the ACM, Vol. 11, No. 3, March 1968, pp. 147-148).

    A bit on the dated side, don’t you think?

    C# includes the goto statement:

    19. Jump Statements

    Not many surprises here, apart from perhaps the infamous goto. However, it’s really a distant relative of the evil goto statement we remember from basic 20 years ago. A goto statement must point to a label or one of the options in a switch statement. The first usage of pointing to a label is similar to the usage of a continue : label statement in Java, except with a little more freedom. Such a goto statement may point anywhere within its scope, which restricts it to the same method, or finally block if it is declared within one. It may not jump into a loop statement which it is not within, and it cannot leave a try block before the enclosing finally block(s) are executed. The continue statement in C# is equivalent to the continue statement in Java, except it can’t point to a label.

    C# Compared to Java and C++
    Ben Albahari
    http://randysolutions.com/csharp_comparative.html

    The use of the goto statement is not automatic proof that code is spaghetti or that it can’t be debugged thoroughly. It really depends upon how the goto statement is used. And you can invoke rules, even written into the DNA of a language, which prevent goto statements from being misused.

    No, what makes it practically impossible to thoroughly debug a program is complexity, particularly where there is the potential for such interaction between different parts of the program that the number of possible sequences of events grows super-exponentially. And of course beyond a certain level of complexity (arithmetic) one is running up against Gödel’s theorem.

    So one tests a program that encodes a climate model against simpler models. One tests it against the empirical evidence. One compares it against other models. And if there are important errors, they should stand out. But as in so much of life, there are no iron-clad guarantees. Fortunately, most people are able to move beyond the stage of staring unbelievingly at their own hand. Even Descartes — although arguably for the wrong reasons.

  5. Hank Roberts says

    26 Nov 2009 at 6:04 PM

    PS, John, I’m sure someone will come along with a better answer; I just picked that link out of a bunch of likely-looking pages found by doing a Google Image search for “hockey stick” uncertainty — then looking at the sources given to find one at a real science site rather than an opinion blog.

    There are a lot more opinion blog sites in the result than science sites. You’ve got to watch carefully. You can also do a Google Scholar search and get useful information.

    Also, use the “Start Here” link at the top of the page, and use the Search box at top right, along with the first link under Science in the right hand sidebar.

    Uncertainty is difficult; statistics takes some study to grasp.

  6. dhogaza says

    26 Nov 2009 at 6:16 PM

    Your response to Simon Abingdon (742). Waiting ten years might be preferable to re-ordering the world economy on less than ideal science.

    GOTO statements don’t affect the science one way or the other.

    And, of course, the science will never be ideal, so essentially you’re saying – we should never do anything about anything.

  7. Richard Mike says

    26 Nov 2009 at 6:23 PM

    In my opinion the whole issue was that MM should not get access to the data. They did not fear that the data were flawed, just that MM would find the tiniest problem and then use that as proof that the whole dataset is problematic, and boast about it in the news channels and blogosphere.

    And yes, I totally agree with hiding this information from people like MM, since they are not interested in truth; they have an agenda and that is what they care about.

  8. david says

    26 Nov 2009 at 6:24 PM

    Regarding spaghetti code: the vast majority of software of any kind is badly written, and yet it works (more or less). Furthermore, just because code is well written does not mean that it does not hide bugs. There are a few ways to check that code actually works:
    (1) Test it against known input/output.
    (2) Test it against other code that has been written to do the same thing by someone else.
    (3) Test that certain conditions that should never happen don’t actually happen (either using formal proof methods or by test cases).

    These are the methods used by companies such as Intel to produce (usually) bug free microprocessors consisting of hundreds of millions of transistors.

    Not sure about (3), but (1) and (2) have certainly been done for climate models.
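
    For illustration, a minimal sketch of check (1) in Python, using an invented toy routine (gridbox_mean is hypothetical, not anyone’s actual code):

    import math

    # Hypothetical routine under test: weighted mean of gridbox temperatures.
    def gridbox_mean(temps, weights):
        return sum(t * w for t, w in zip(temps, weights)) / sum(weights)

    # Check (1): compare against a known input/output pair worked out by hand.
    assert math.isclose(gridbox_mean([10.0, 20.0], [1.0, 3.0]), 17.5)
    # A check of kind (3): an invariant that should always hold - a constant
    # field must average to that constant, whatever the weights.
    assert math.isclose(gridbox_mean([5.0, 5.0, 5.0], [0.2, 0.3, 0.5]), 5.0)
    print("all checks passed")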

  9. Alasdair Green says

    26 Nov 2009 at 6:25 PM

    Gavin, just to add a touch of context before my point: 25 years ago I was planting trees and tree seeds on land that was not mine and where I had no permission to be. Anywhere I thought they would grow. I was alone and no pioneer of anything. Just profoundly concerned by the destruction of our world. I am now probably in the same group as Monbiot. Appalled. Onto my questions, if you would be kind enough to take a moment to answer them please.
    1. A comment please: Dendroclimatology is the weakest of the climate proxies, isn’t it? (I’m being kind there). I know a bit about trees, and, as every plant scientist can tell you, tree growth in non-tropical latitudes is regulated by (in descending order) soil moisture content via precipitation or otherwise, temperature in spring (although they have a remarkable ability to catch up for a late spring), followed by random factors such as fire, felling (reduction of competition) etc. There is absolutely no linear relationship between temperature and tree growth. This was simply a supposition by a climatologist who didn’t consult the plant scientists. [edit]

    [Response: No. Tree rings studies do show climate changes and have been replicated many times against independent data. The issue with MXD that seems to be the focus here, is a real issue, and is being studied by many groups. – gavin]

    2. Climate science is in its infancy, is it not? Why is it impossible for all to agree that we simply don’t know exactly why the temperature rose until about 10 years ago, and why it hasn’t risen since? But we should all agree that, as the factors are unknown or not understood, it is sensible to apply the precautionary principle as espoused in the Rio Treaty: namely, that if it’s possible that we are a malignant element in the climate, we should change our ways. Just as if you come home and smell gas, you don’t start by analysing the cause; you first of all switch it off at the mains, just in case.

    [Response: That’s fine as argument if you don’t know anything. But we do know stuff and so we can do better than that. – gavin]

    I could ask many other things, but those two will do for now. Thanks Gavin.

  10. Timothy Chase says

    26 Nov 2009 at 6:37 PM

    SecularAnimist wrote in 741:

    What we are seeing here is the fruit of a generation-long, multi-multi-million dollar media campaign to create an audience of “Ditto-Heads” who have been systematically brainwashed to unquestioningly believe whatever they are told by the so-called “right wing” media and to regard all other sources of information as not only suspect, biased and untrustworthy, but as promulgators of deliberate, malicious, un-American “liberal” lies….

    As much as I might like to argue with you on this point (as someone who still considers himself pro-capitalism but no longer what might be termed “libertarian”), the evidence would appear to fit. The only point that I might argue is that some of those who finance the mentality that you describe may very well fall for their own propaganda.

    Along these lines I would like to call attention to the following:

    Competitive Enterprise Institute to sue RealClimate blogger over moderation policy, comment 19
    November 26, 2009 at 1:49 pm
    http://climateprogress.org/2009/11/25/competitive-enterprise-institute-to-sue-realclimate-blogger-over-moderation-policy/#comment-212305

    … where I outline a possible response by the pro-science community to the personal attacks upon individual climatologists including the hacking of email belonging to climatologists at Hadley CRU for the purpose of a disinformation campaign and the declaration of the intent to launch a lawsuit against Gavin Schmidt of Real Climate. I also give figures for the funding of the Competitive Enterprise Institute by Exxon, Scaife, Bradley, Koch and Coors foundations and, more broadly, funding received by denialist organizations from the Scaife, Bradley, Koch and Coors foundations where the denialist organizations are also part of the Exxon-funded network for attacking climatology.

    In the comments I identify funding sources and touch on the ideology that such funding buys — although I, and most certainly others, have gone into considerably more detail on the latter. And obviously a great deal more is possible, e.g., identifying various other oil and coal interests that are funding similar organizations, and scientists and organizations involved in climate denialism that were also involved in denialism with respect to the medical consequences of tobacco use. And given the connections to the Religious Right, some of these foundations were undoubtedly involved in the funding of attacks upon evolutionary biology and even the funding of religious extremism.

    In any case, there are actual names, dollar figures and years to support your statements. Oh, and Media Matters “Transparency” is back, at a new web address:

    http://mediamattersaction.org/transparency/

    It is where I got most of my dollar amounts.

  11. Hank Roberts says

    26 Nov 2009 at 6:39 PM

    > Waiting ten years might be preferable to re-ordering
    > the world economy on less than ideal science.

    Unfortunately that’s already happened, and the current order isn’t sustainable.

    Are you recommending adoption of the Precautionary Principle for anything that changes the world economy? That would be radical.

  12. VickyS says

    26 Nov 2009 at 7:03 PM

    Further to the comments on data accessibility: many governments dealing with deficits, especially during the 1990s, tried to make their various departments self-funding or even sources of revenue, so the decision to make data a commodity goes above even the National Meteorological Services, directly to national governments. Budget cuts also led to the closure of many weather stations, which led to many of the problems discussed in trying to compile global temperature records, such as calibrating series, sorting out changing WMO station identifiers, and trying to account for changing spatial coverage over time. There were more observing stations in the 1960s and 70s than there are now. This ought to be a scandal; we should be at least maintaining existing weather stations, instead of having fewer as time goes on. If you’re going to write letters, write also to your local politicians about the closing down of the very weather and climate stations which are telling us what is going on!

  13. James C. Wilson says

    26 Nov 2009 at 7:06 PM

    You say: “McKitrick and Michaels (2004) and Kalnay and Cai (2003) were both cited and discussed in Chapter 2 of the IPCC AR4 report.”

    In my version, these papers are discussed in Chapter 3, p 244.

  14. PHG says

    26 Nov 2009 at 7:08 PM

    Bob, (751)

    No, the longer the wait, the higher the potential for significant economic dislocation once we are 99.9% sure of the science and are forced into draconian cuts in emissions over a short time frame.

    A lot of energy companies are actively investigating wind, geothermal and other projects in anticipation of transitioning to a carbon-constrained world; the last thing that is needed is more uncertainty on the regulatory front.

  15. andre says

    26 Nov 2009 at 7:54 PM

    #742 and others. The programming language is nowhere near as important as the programmer in building structured code, though the language can make things easier. Fortran 90+ is probably the best language for building numerical codes IMHO, due to its array handling, and it is fairly straightforward to evolve a code from Fortran 77 to Fortran 90+ over time. It is also fairly easy to intermix Fortran, C and C++. What is more difficult is replacing working legacy code. Whenever I get the urge to rewrite some old piece of physics code, I think twice. There is a lot of “wisdom” hidden in legacy code, and it requires careful study to make sure it is preserved in any new code.

    Gavin, I am amazed at your patience and endurance. I enjoy reading about climate science on this web site and am thankful for all the work you put into it.

  16. Peter Pitchford says

    26 Nov 2009 at 8:08 PM

    This is like listening to Colin Powell’s tape recordings of private conversations that were intercepted and used as evidence of weapons of mass destruction. Some guy was on a walkie-talkie telling some worker that the inspectors were coming, so they should get rid of a pile of scrap metal because the Asshole Americans might act like it was an atomic or chemical bomb. The Americans proved that they were indeed that stupid, even stupider, when, without even looking at that pile or having any evidence whatsoever of what was in it, they took Powell’s recording of that conversation as a smoking gun of an atomic or chemical bomb.

  17. arnold says

    26 Nov 2009 at 8:30 PM

    Gavin,

    You work at NASA, right? I hope you can explain something for me. I was wondering about the paleo datasets that are available for download from NASA. One of them is a dataset comprised of grape-picking data from France. I came across an article recently where the author put forward a peer-reviewed article to argue against the authors of the first article. Can you explain to me why the data is still available, or have I been fooled into thinking that he had a point?

    Thanks

  18. ChrisC says

    26 Nov 2009 at 8:39 PM

    Response to RaymondT

    The equations solved in GCMs are discretized using a technique called “Finite Volumes”. This technique uses a different form of the equations based on fluxes and conservation laws. The advantage of Finite Volumes is that mass and what-not are conserved absolutely (in finite arithmetic). The equations, despite being cast in a different form, are identical. An analogy is the Maxwell equations in electrodynamics: these are often written in integral form (which is like the flux form used in Finite Volumes) or differential form (which is more commonly used when solving equations using finite difference or spectral methods).

    Finite Volumes is the technique du jour in many commercial Computational Fluid Dynamics programs (such as Ansys, Star-CD, etc.). I’ve written micro-scale meteorological models myself using this technique. The solution is physical.

    For an introduction to Finite Volumes, I recommend the book by Ferziger and Peric.
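
    As an illustration of the conservation point, here is a minimal one-dimensional sketch in Python (first-order upwind fluxes on a periodic domain; an invented toy, not any GCM’s actual scheme): whatever the truncation error in the shape of the solution, the flux form conserves the domain total to machine precision.

    import numpy as np

    nx, u, dx, dt = 100, 1.0, 1.0, 0.5    # cells, wind speed, spacing, time step (CFL = 0.5)
    q = np.exp(-0.5 * ((np.arange(nx) - 50) / 5.0) ** 2)   # initial cell averages

    def step(q):
        flux = u * np.roll(q, 1)          # upwind flux through each cell's left face (u > 0)
        return q + (dt / dx) * (flux - np.roll(flux, -1))  # flux in minus flux out

    total0 = q.sum()
    for _ in range(200):
        q = step(q)
    print(abs(q.sum() - total0))          # ~1e-13: the total is conserved to roundoff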

  19. J. Bob says

    26 Nov 2009 at 8:44 PM

    #758, Hank, your comment “Unfortunately that’s already happened, and the current order isn’t sustainable” is interesting. If you’re referring to man causing “global warming”, how do you explain the flattening of global temperatures and the recent rise in sea ice? And why spend trillions of dollars on an AGW theory that looks very questionable?

    By the way, you never did respond to the comparison between the EMD (Empirical Mode Decomposition) filter and the Fourier filter (#1071, previous thread). Remember that post showed a peaking of the HadCET and E. England data.
    http://www.imagenerd.com/uploads/cru-fig-6-NMyC0.gif
    http://www.imagenerd.com/uploads/climate4u-lt-temps-Ljbug.gif
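
    For readers unfamiliar with the kind of Fourier filter being referred to, a minimal sketch in Python (the series is synthetic and invented for illustration; this is not the filter from that post, and a hard spectral cutoff like this assumes periodicity and rings near sharp features):

    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(1850, 2010)
    series = 0.005 * (years - 1850) + 0.3 * rng.standard_normal(years.size)  # toy 'temperature'

    def fourier_lowpass(x, cutoff_period=20.0):
        # Zero all Fourier components faster than cutoff_period (in samples), then invert.
        spec = np.fft.rfft(x)
        freqs = np.fft.rfftfreq(x.size)          # cycles per sample
        spec[freqs > 1.0 / cutoff_period] = 0.0
        return np.fft.irfft(spec, n=x.size)

    smooth = fourier_lowpass(series)
    print(series[:3], smooth[:3])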

  20. Bob says

    26 Nov 2009 at 8:48 PM

    To PHG (761): So you do not even think it is a possibility that the science is just plain wrong. Do you realize how many hundreds of millions of lives would be dislocated for wrong science? Why don’t we plan now for a direct meteor hit? You can’t be serious.

  21. J. Bob says

    26 Nov 2009 at 8:50 PM

    #762, One good way to check computer code for structure and readability is to go back after, say, 5 years and see if you can understand it and make corrections. I liked FORTRAN, but when all is said and done, I still like BASIC the best. It may not be the fastest (due mainly to the poor compilers), but it still seems to me the most readable for maintenance.

  22. BJ_Chippindale says

    26 Nov 2009 at 9:19 PM

    Anyone trying to dismiss as nonsense the claim that CRU is tied up by agreements with the various Met offices needs to rethink. One of the difficulties here is EXACTLY that. Everything is user-pays. The National-led government likes it that way, but the result, that there is little or no transparency for the science, is “disgraceful”… even though THEY THEMSELVES SUPPORT IT. They just don’t connect the dots. Since NZ came up and I am here in NZ, I did go about the exercise of trying to work out what was going on, and wound up jammed into a “put the money in the slot” page with almost all approaches (I wanted the SST around NZ).

    Fortunately someone HAD done a paper which evaluated the trends. Note that the NZClimateScience Coalition paper was falsified by NIWA with the publication of data regarding station changes in Wellington, from sea level to 350 metres up. It wasn’t done that well; nobody calibrated the change properly, so a new site had to be set up to do the calibration. It is not likely perfect, but it IS as good as it can be.

    NZClimateScience had simply asserted that there wasn’t any reason to correct the data… at all… oops.

    The fraud is plainly there; it’s just all on the other side.

    Which still doesn’t let Jones off from having to go head-to-head on this. He has to grab the Bull by the Tail and face the situation.

    respectfully
    BJ

  23. Joe V. says

    26 Nov 2009 at 9:31 PM

    This recent paper seems to dispute all theories on stratospheric cooling (positive feedback?). What are your thoughts, Gavin?

    Stratospheric warming in Southern Hemisphere high latitudes since 1979
    Y. Hu (Department of Atmospheric Sciences, Peking University, Beijing, China) and Q. Fu (Department of Atmospheric Sciences, University of Washington, Seattle, USA)
    Received: 26 November 2008 – Published in Atmos. Chem. Phys. Discuss.: 16 January 2009
    Revised: 17 June 2009 – Accepted: 22 June 2009 – Published: 3 July 2009

    http://www.atmos-chem-phys.net/9/4329/2009/acp-9-4329-2009.pdf

    [Response: The pattern of stratospheric cooling associated with increasing CO2 is a radiative effect, peaking at around 1 mb, and strongest at the equator. Near the poles, dynamics can play a much stronger role than elsewhere. Having a dynamical influence much lower down and at the pole is therefore interesting, but not really contradictory. In fact our group published a paper years ago postulating a dynamic impact on polar vortex temperatures (Shindell et al, 1998) – but I think it might have gone the other way. I’ll have to go check. – gavin]

  24. Joe V. says

    26 Nov 2009 at 9:57 PM

    Also, new research on factors driving stratospheric temp. changes.

    Sudden stratospheric warmings seen in MINOS deep underground muon data. Osprey, S.M. et al., Geophys. Res. Lett., doi:10.1029/2008GL036359

    http://www.agu.org/pubs/crossref/2008/2008GL033875.shtml

    [Response: The first paper is demonstrating the opposite causality – stratospheric changes affect muon counts (not the other way around), while the second discusses the thermosphere – way, way above the stratosphere, and nothing to do with climate at all. – gavin]

  25. BJ_Chippindale says

    26 Nov 2009 at 9:58 PM

    #762

    My experience with scientists and development and coding is that they should do the algorithms but probably SHOULD leave the rest to software engineers. Two reasons. One is that it would be more efficient for them NOT to worry about mundane details like naming conventions and data organization, and the other is that those same engineers would be the ones who would put together responses to any request for data and code once the algorithms had been checked and validated. All the scientist has to say is “do it”. Much easier to delegate this stuff… and much easier to make the

    The problems Harry had included endian-ness, which indicates he was hitting new hardware as well as a new OS, with legacy code from a decade prior, in different dialects… and pretty clearly the original authors, who knew what went where and did which to those other things, weren’t available to help him. Legacy code and data is one of the most nit-picky jobs there is. I’m pretty damned impressed at what he did with it, and I have no real doubts about the result.

    respectfully
    BJ
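
    The endian-ness trap mentioned above is easy to demonstrate. A minimal sketch in Python (values invented): the same four bytes decode differently depending on the byte order the reader assumes.

    import struct

    raw = struct.pack(">i", 1999)        # a 32-bit int written big-endian, as older hardware might
    print(struct.unpack(">i", raw)[0])   # 1999: read back with the byte order it was written in
    print(struct.unpack("<i", raw)[0])   # a large nonsense value: read with the wrong byte order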

  26. Ray Ladbury says

    26 Nov 2009 at 10:08 PM

    Anand Rajan, Dude, you really need to get help. Your lack of empathy borders on the pathological, and may well cross that border. Scientists are not machines. We cannot be expected to be dispassionate when it comes to questions of the continued existence or viability of civilization. And the thing is that the scientific method does not require us to be. Perhaps if you understood more about science, you might understand that it works without giving up our humanity.

  27. Ray Ladbury says

    26 Nov 2009 at 10:17 PM

    Gnrnr,
    Whether adjustments are appropriate in the “data” (which by the time it is climate data isn’t raw data in any case) depends on the purpose of the data. If absolute values are important, then you don’t want to make adjustments. However, if it’s long-term trends, mere entries in log books ain’t gonna cut it.

  28. Joe V. says

    26 Nov 2009 at 10:23 PM

    In the spirit of today’s importance I believe a grateful thank-you is in order to you, Gavin. The time you have taken out of your busy schedule to not only respond but to educate is definitely above and beyond the call. I am one who would never question anyone with your qualifications as to whether or not you are on the “company’s dime” responding to personal blog posts. I don’t have access, nor do I wish to have access, to your time card. So thank you for helping me understand these issues. You are a prime example of how scientific discussion should work.

    Newly released emails seem to be in conflict with your previous post:
    [Response: They ‘hid’ the decline in a paper in Nature. How clever of them. – gavin]

    http://www.nature.com/nature/journal/v391/n6668/abs/391678a0.html

    From: Tim Osborn
    To: mann@xxxxxxxxx.xxx,imacadam@xxxxxxxxx.xxx
    Subject: Briffa et al. series for IPCC figure
    Date: Tue, 05 Oct 1999 16:18:29 +0100
    Cc: k.briffa@uea,p.jones@uea

    Dear Mike and Ian

    Keith has asked me to send you a timeseries for the IPCC multi-proxy
    reconstruction figure, to replace the one you currently have. The data are
    attached to this e-mail. They go from 1402 to 1995, although we usually
    stop the series in 1960 because of the recent non-temperature signal that
    is superimposed on the tree-ring data that we use. I haven’t put a 40-yr
    smoothing through them – I thought it best if you were to do this to ensure
    the same filter was used for all curves.

    The raw data are the same as used in Briffa et al. (1998), the Nature paper
    that I think you have the reference for already. They are analysed in a
    different way, to retain the low-frequency variations. In this sense, it
    is one-step removed from Briffa et al. (1998). It is not two-steps removed
    from Briffa et al. (1998), since the new series is simply a *replacement*
    for the one that you have been using, rather than being one-step further.

    A new manuscript is in preparation describing this alternative analysis
    method, the calibration of the resulting series, and their comparison with
    other reconstructions. We are consdering submitting this manuscript to J.
    Geophys. Res. when it is ready, but for now it is best cited as:
    Briffa KR, Osborn TJ, Schweingruber FH, Harris IC and Jones PD (1999)
    Extracting low-frequency temperature variations from a northern tree-ring
    density network. In preparation.
    Keith will be sending you a copy of the manuscript when it is nearer to
    completion.

    I have also attached a PS file showing the original Briffa et al. (1998)
    curve, with annotation of cold years associated with known volcanic
    eruptions. Overlain on this, you will see a green curve. This is the new
    series with a 40-yr filter through it. This is just so that you can see
    what it should look like (***ignore the temperature scale on this
    figure***, since the baseline is non-standard).

    With regard to the baseline, the data I’ve sent are calibrated over the
    period 1881-1960 against the instrumental Apr-Sep tempratures averaged over
    all land grid boxes with observed data that are north of 20N. As such, the
    mean of our reconstruction over 1881-1960 matches the mean of the observed
    target series over the same period. Since the observed series consists of
    degrees C anomalies wrt to 1961-90, we say that the reconstructed series
    also represents degrees C anomalies wrt to 1961-90. One could, of course,
    shift the mean of our reconstruction so that it matched the observed series
    over a different period – say 1931-60 – but I don’t see that this improves
    things. Indeed, if the non-temperature signal that causes the decline in
    tree-ring density begins before 1960, then a short 1931-60 period might
    yield a more biased result than using a longer 1881-1960 period.

    If you have any queries regarding this replacement data, then please e-mail
    me and/or Keith.

    Best regards

    Tim

    Calibrated against observed Apr-Sep temperature over 1881-1960
    averaged over all land grid boxes north of 20N

    Year Reconstructed temperature anomaly (degrees C wrt 1961-90)
    1402 -0.283
    1403 -0.334
    1404 -0.286
    1405 -0.350
    1406 -0.152
    1407 -0.124
    1408 -0.220
    1409 -0.175
    1410 -0.100
    1411 -0.129
    1412 -0.226
    1413 -0.115
    1414 -0.386
    1415 -0.319
    1416 -0.277
    1417 -0.136
    1418 -0.172
    1419 -0.294
    1420 -0.280
    1421 -0.335
    1422 -0.406
    1423 -0.312
    1424 -0.207
    1425 -0.136
    1426 -0.354
    1427 -0.222
    1428 -0.305
    1429 -0.322
    1430 -0.282
    1431 -0.143
    1432 -0.212
    1433 -0.234
    1434 -0.076
    1435 -0.309
    1436 -0.411
    1437 -0.122
    1438 -0.272
    1439 -0.159
    1440 -0.330
    1441 -0.160
    1442 -0.105
    1443 -0.080
    1444 -0.308
    1445 -0.138
    1446 -0.317
    1447 -0.270
    1448 -0.301
    1449 -0.357
    1450 -0.137
    1451 -0.183
    1452 -0.207
    1453 -0.485
    1454 -0.265
    1455 -0.358
    1456 -0.241
    1457 -0.199
    1458 -0.366
    1459 -0.397
    1460 -0.252
    1461 -0.230
    1462 -0.252
    1463 -0.209
    1464 -0.174
    1465 -0.174
    1466 -0.280
    1467 -0.256
    1468 -0.256
    1469 -0.222
    1470 -0.237
    1471 -0.094
    1472 -0.122
    1473 -0.056
    1474 -0.320
    1475 -0.376
    1476 -0.133
    1477 -0.075
    1478 0.037
    1479 -0.161
    1480 -0.379
    1481 -0.513
    1482 -0.286
    1483 -0.354
    1484 -0.327
    1485 -0.208
    1486 -0.125
    1487 -0.380
    1488 -0.193
    1489 -0.245
    1490 -0.466
    1491 -0.244
    1492 -0.146
    1493 -0.278
    1494 -0.394
    1495 -0.526
    1496 -0.275
    1497 -0.264
    1498 -0.233
    1499 -0.169
    1500 -0.128
    1501 -0.415
    1502 -0.306
    1503 0.011
    1504 -0.013
    1505 -0.378
    1506 -0.226
    1507 -0.428
    1508 -0.192
    1509 -0.312
    1510 -0.157
    1511 -0.162
    1512 -0.188
    1513 -0.135
    1514 -0.418
    1515 -0.258
    1516 -0.381
    1517 -0.134
    1518 -0.180
    1519 -0.166
    1520 -0.035
    1521 -0.384
    1522 -0.302
    1523 -0.541
    1524 -0.371
    1525 -0.183
    1526 -0.289
    1527 -0.224
    1528 -0.247
    1529 -0.432
    1530 -0.291
    1531 -0.467
    1532 -0.343
    1533 -0.586
    1534 -0.183
    1535 -0.417
    1536 -0.350
    1537 -0.257
    1538 -0.451
    1539 -0.398
    1540 -0.497
    1541 -0.406
    1542 -0.584
    1543 -0.448
    1544 -0.317
    1545 -0.312
    1546 -0.289
    1547 -0.114
    1548 -0.459
    1549 -0.335
    1550 -0.009
    1551 -0.074
    1552 -0.047
    1553 -0.207
    1554 -0.285
    1555 -0.116
    1556 -0.141
    1557 -0.419
    1558 -0.174
    1559 -0.465
    1560 -0.287
    1561 -0.169
    1562 -0.231
    1563 -0.270
    1564 -0.347
    1565 -0.116
    1566 -0.202
    1567 -0.278
    1568 -0.445
    1569 -0.488
    1570 -0.465
    1571 -0.434
    1572 -0.674
    1573 -0.324
    1574 -0.493
    1575 -0.273
    1576 -0.623
    1577 -0.483
    1578 -0.521
    1579 -0.551
    1580 -0.473
    1581 -0.436
    1582 -0.382
    1583 -0.345
    1584 -0.280
    1585 -0.565
    1586 -0.409
    1587 -0.580
    1588 -0.530
    1589 -0.534
    1590 -0.354
    1591 -0.377
    1592 -0.407
    1593 -0.337
    1594 -0.591
    1595 -0.459
    1596 -0.436
    1597 -0.475
    1598 -0.152
    1599 -0.134
    1600 -0.381
    1601 -1.169
    1602 -0.403
    1603 -0.414
    1604 -0.472
    1605 -0.393
    1606 -0.564
    1607 -0.529
    1608 -0.822
    1609 -0.789
    1610 -0.617
    1611 -0.681
    1612 -0.670
    1613 -0.364
    1614 -0.733
    1615 -0.428
    1616 -0.698
    1617 -0.479
    1618 -0.485
    1619 -0.524
    1620 -0.706
    1621 -0.671
    1622 -0.714
    1623 -0.662
    1624 -0.387
    1625 -0.566
    1626 -0.671
    1627 -0.665
    1628 -0.759
    1629 -0.654
    1630 -0.379
    1631 -0.466
    1632 -0.330
    1633 -0.377
    1634 -0.521
    1635 -0.222
    1636 -0.265
    1637 -0.252
    1638 -0.396
    1639 -0.382
    1640 -0.400
    1641 -1.152
    1642 -1.067
    1643 -1.092
    1644 -0.649
    1645 -0.588
    1646 -0.632
    1647 -0.554
    1648 -0.368
    1649 -0.572
    1650 -0.215
    1651 -0.317
    1652 -0.529
    1653 -0.268
    1654 -0.343
    1655 -0.400
    1656 -0.372
    1657 -0.332
    1658 -0.359
    1659 -0.182
    1660 -0.260
    1661 -0.258
    1662 -0.433
    1663 -0.433
    1664 -0.353
    1665 -0.440
    1666 -0.837
    1667 -0.857
    1668 -0.816
    1669 -0.779
    1670 -0.871
    1671 -0.463
    1672 -0.434
    1673 -0.631
    1674 -0.663
    1675 -0.870
    1676 -0.523
    1677 -0.670
    1678 -0.794
    1679 -0.768
    1680 -0.701
    1681 -0.380
    1682 -0.518
    1683 -0.364
    1684 -0.369
    1685 -0.688
    1686 -0.178
    1687 -0.481
    1688 -0.351
    1689 -0.229
    1690 -0.254
    1691 -0.221
    1692 -0.545
    1693 -0.263
    1694 -0.316
    1695 -0.955
    1696 -0.816
    1697 -0.687
    1698 -1.054
    1699 -1.005
    1700 -0.630
    1701 -0.818
    1702 -0.510
    1703 -0.377
    1704 -0.420
    1705 -0.527
    1706 -0.328
    1707 -0.257
    1708 -0.465
    1709 -0.493
    1710 -0.288
    1711 -0.344
    1712 -0.345
    1713 -0.242
    1714 -0.390
    1715 -0.305
    1716 -0.390
    1717 -0.309
    1718 -0.270
    1719 -0.194
    1720 -0.110
    1721 -0.427
    1722 0.005
    1723 -0.193
    1724 -0.249
    1725 -0.497
    1726 -0.381
    1727 -0.241
    1728 -0.133
    1729 -0.261
    1730 -0.633
    1731 -0.723
    1732 -0.426
    1733 -0.371
    1734 -0.104
    1735 -0.373
    1736 -0.330
    1737 -0.206
    1738 -0.557
    1739 -0.291
    1740 -0.734
    1741 -0.594
    1742 -0.808
    1743 -0.378
    1744 -0.372
    1745 -0.418
    1746 -0.501
    1747 -0.150
    1748 -0.389
    1749 -0.328
    1750 -0.168
    1751 -0.343
    1752 -0.227
    1753 -0.218
    1754 -0.377
    1755 -0.328
    1756 -0.221
    1757 -0.259
    1758 -0.431
    1759 -0.340
    1760 -0.335
    1761 -0.261
    1762 -0.466
    1763 -0.291
    1764 -0.473
    1765 -0.378
    1766 -0.212
    1767 -0.429
    1768 -0.544
    1769 -0.343
    1770 -0.341
    1771 -0.265
    1772 -0.547
    1773 -0.421
    1774 -0.048
    1775 -0.289
    1776 -0.186
    1777 -0.288
    1778 -0.178
    1779 -0.550
    1780 -0.339
    1781 -0.251
    1782 -0.164
    1783 -0.757
    1784 -0.142
    1785 -0.141
    1786 -0.179
    1787 -0.432
    1788 -0.207
    1789 -0.235
    1790 -0.612
    1791 -0.163
    1792 -0.086
    1793 -0.023
    1794 -0.030
    1795 -0.243
    1796 -0.028
    1797 -0.565
    1798 -0.049
    1799 -0.228
    1800 -0.287
    1801 -0.413
    1802 -0.117
    1803 0.020
    1804 0.036
    1805 -0.094
    1806 -0.251
    1807 -0.089
    1808 -0.241
    1809 -0.460
    1810 -0.582
    1811 -0.353
    1812 -0.459
    1813 -0.545
    1814 -0.458
    1815 -0.588
    1816 -0.855
    1817 -0.861
    1818 -0.629
    1819 -0.680
    1820 -0.289
    1821 -0.351
    1822 -0.159
    1823 -0.246
    1824 -0.276
    1825 -0.263
    1826 -0.140
    1827 -0.293
    1828 -0.033
    1829 -0.087
    1830 -0.173
    1831 -0.045
    1832 -0.621
    1833 -0.660
    1834 -0.141
    1835 -0.647
    1836 -0.775
    1837 -0.771
    1838 -0.359
    1839 -0.267
    1840 -0.144
    1841 -0.077
    1842 -0.337
    1843 -0.435
    1844 -0.101
    1845 -0.412
    1846 0.106
    1847 -0.079
    1848 -0.346
    1849 -0.393
    1850 -0.261
    1851 -0.165
    1852 -0.100
    1853 -0.174
    1854 -0.138
    1855 -0.418
    1856 -0.250
    1857 -0.538
    1858 -0.126
    1859 -0.195
    1860 -0.231
    1861 -0.029
    1862 -0.555
    1863 -0.303
    1864 -0.407
    1865 -0.256
    1866 -0.437
    1867 -0.413
    1868 -0.119
    1869 -0.321
    1870 -0.213
    1871 -0.352
    1872 -0.163
    1873 -0.183
    1874 -0.372
    1875 -0.247
    1876 -0.487
    1877 -0.192
    1878 0.120
    1879 -0.152
    1880 -0.346
    1881 -0.184
    1882 -0.200
    1883 -0.183
    1884 -0.717
    1885 -0.534
    1886 -0.485
    1887 -0.281
    1888 -0.261
    1889 -0.153
    1890 -0.341
    1891 -0.313
    1892 -0.138
    1893 -0.301
    1894 -0.134
    1895 -0.128
    1896 -0.241
    1897 -0.016
    1898 0.065
    1899 -0.574
    1900 -0.218
    1901 -0.049
    1902 -0.287
    1903 -0.142
    1904 -0.205
    1905 -0.308
    1906 -0.034
    1907 -0.412
    1908 -0.048
    1909 -0.214
    1910 -0.147
    1911 -0.194
    1912 -0.631
    1913 -0.161
    1914 -0.294
    1915 -0.074
    1916 -0.277
    1917 -0.297
    1918 -0.460
    1919 -0.013
    1920 -0.272
    1921 -0.114
    1922 -0.036
    1923 -0.305
    1924 -0.141
    1925 -0.258
    1926 -0.115
    1927 -0.198
    1928 -0.018
    1929 -0.161
    1930 0.086
    1931 0.104
    1932 0.081
    1933 -0.057
    1934 0.007
    1935 -0.037
    1936 -0.019
    1937 0.060
    1938 0.163
    1939 -0.075
    1940 0.113
    1941 -0.200
    1942 0.128
    1943 0.053
    1944 -0.080
    1945 0.059
    1946 -0.016
    1947 -0.188
    1948 -0.038
    1949 -0.107
    1950 -0.269
    1951 -0.100
    1952 -0.118
    1953 0.161
    1954 -0.235
    1955 -0.127
    1956 -0.308
    1957 -0.194
    1958 -0.308
    1959 -0.224
    1960 0.076
    1961 -0.104
    1962 -0.289
    1963 -0.173
    1964 -0.479
    1965 -0.474
    1966 -0.171
    1967 -0.200
    1968 -0.599
    1969 -0.355
    1970 -0.353
    1971 -0.328
    1972 -0.563
    1973 -0.262
    1974 -0.336
    1975 -0.507
    1976 -0.558
    1977 -0.363
    1978 -0.698
    1979 -0.289
    1980 -0.612
    1981 -0.195
    1982 -0.522
    1983 -0.234
    1984 -0.335
    1985 -0.423
    1986 -0.430
    1987 -0.424
    1988 -0.161
    1989 -0.286
    1990 -0.275
    1991 -0.169
    1992 -0.175
    1993 -0.341
    1994 -0.320

    Attachment Converted: “c:\eudora\attach\Briffa et al.ps”

    Dr Timothy J Osborn | phone: +44 1603 592089
    Senior Research Associate | fax: +44 1603 507784
    Climatic Research Unit | e-mail: t.osborn@xxxxxxxxx.xxx
    School of Environmental Sciences | web-site:
    University of East Anglia __________| http://www.cru.uea.ac.uk/~timo/
    Norwich NR4 7TJ | sunclock:
    UK | http://www.cru.uea.ac.uk/~timo/sunclock.htm

  29. Joe V. says

    26 Nov 2009 at 10:33 PM

    Sorry for not attaching the code.

    BRIFFA VERSION

    url<-"ftp://ftp.ncdc.noaa.gov/pub/data/paleo/treering/reconstructions/n_hem_temp/briffa2001jgr3.txt&quot;
    #readLines(url)[1:50]
    Briffa<-read.table(url,skip=24,fill=TRUE)
    Briffa[Briffa< -900]=NA
    dimnames(Briffa)[[2]]<-c("year","Jones98","MBH99","Briffa01","Briffa00","Overpeck97","Crowley00","CRU99")
    sapply(Briffa, function(x) range( Briffa$year[!is.na(x)]) )
    # year Jones98 MBH99 Briffa01 Briffa00 Overpeck97 Crowley00 CRU99
    #[1,] 1000 1000 1000 1402 1000 1600 1000 1871
    #[2,] 1999 1991 1980 1960 1993 1990 1987 1997
    Briffa= ts(Briffa,start=1000)

    Hacked file version

    loc="http://www.eastangliaemails.com/emails.php?eid=146&filename=939154709.txt&quot;
    working=readLines(loc,n=1994-1401+104)
    working=working[105:length(working)]
    x=substr(working,1,14)
    writeLines(x,"temp.dat")
    gate=read.table("temp.dat")
    gate=ts(gate[,2],start=gate[1,1])

    #Comparison
    briffa=ts.union(archive= Briffa[,"Briffa01"],gate )
    briffa=window(briffa,start=1402,end=1994) #
    plot.ts(briffa)

    X=briffa
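    col.ipcc <- 1  # assumed value: defined elsewhere in the original script but missing from this excerpt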

    plot( c(time(X)),X[,1],col=col.ipcc,lwd=2,ylim=c(-1.2,.5),yaxs="i",type="n",axes=FALSE,xlab="",ylab="")
    for( i in 2:1) lines( c(time(X)),X[,i],col=i,lwd=1)
    axis(side=1,tck=.025)
    labels0=seq(-1,1,.1);labels0[is.na(match(seq(-1,1,.1),seq(-1,1,.5)))]=""
    axis(side=2,at=seq(-1,1,.1),labels=labels0,tck=.025,las=1)
    axis(side=4,at=seq(-1,1,.1),labels=labels0,tck=.025)
    box()
    abline(h=0)
    title("Hide the Decline")
    legend("topleft",fill=2:1,legend=c("Deleted","Archived"))

    [Response: The data ‘showing the decline’ is in the file: ftp://ftp.ncdc.noaa.gov/pub/data/paleo/treering/reconstructions/n_hem_temp/nhemtemp_data.txt (Briffa et al, 1998) (different scaling and baseline though). Nothing was ‘hidden’. – gavin]

  30. Jim Galasyn says

    26 Nov 2009 at 10:52 PM

    Simon Abingdon, speaking from the land of Faery: You have to structure your programs properly from the outset. You can’t incorporate good practice retrospectively. Every “spaghetti” program has untold errors waiting to rear their ugly heads (or remain forever unsuspected while erroneous results are acted upon with misplaced confidence).

    Code doesn’t have to be provably correct to be useful. Else we would have no commercial software industry to speak of. Or everything would be written in a declarative language; have fun with that.

    My experience has been that academic code tends to be less maintainable/well-factored/engineered than code in the private sector. I have it on good authority that little of Carnegie Mellon’s code has a CMMI maturity level above 1. This doesn’t mean those products are useless. Science and most of industry would grind to a halt if you were to require level-5 code.

  31. Dayahka says

    26 Nov 2009 at 11:08 PM

    Many important issues have been raised and discussed in this and related blogs, but I think we need to spend a great deal more time discussing what I think is the most important and debatable point made in this blog. I refer to Gavin’s response in #122 to the question of why we do not want the globe to warm up any further, where Gavin says:

    …[S]ociety is adapted to the climate we have (though not perfectly of course). Where we build cities, where we grow crops, how we live are all based on assumptions about the climate – the temperature, the rainfall, the tides, the expected storm surges, the variability etc. Changes to that background will likely require shifts to how we do all of those things, and some changes will be hugely expensive in terms of people and infrastructure to adapt to – even if that is possible. Coupled with the uncertainty of exactly what a planet that was warmer (say 3ºC) than at any time in the last 3 million years would look like (and note that sea levels were ~60 ft higher then!), I don’t want to take the risk.

    Let’s assume that the science of climate is mature, robust, and can support predictability.

    1. Whose society? The Western world? The developed nations? The Saharan? The Tibetan? Why should Greenland remain snow-bound? or Antarctica? At least in America, our infrastructure is collapsing and we’re going to have to either repair or, better, re-design and implement much better infrastructure anyway.

    2. Assuming that what you mean is the status quo, what we have now, do you mean to suggest that everyone would agree that we have the “best of all possible worlds” now and that no shift towards either cooling or warming could make any of it better? Why must the seashore remain where it is? Why would it not be better to have the sea level 2, 10, 60 meters higher or lower than now? Why must our cities remain where they are? Why must we freeze time and change? Why is it better to try to revert to an earlier age than to adapt to a new one?

    [Response: Who wants to revert to an earlier age? (How is new solar technology going back to the Middle Ages?) But to answer your question, there are 100s of millions of people living less than 1 m above sea level, billions of people within 6 m, and trillions and trillions of dollars of infrastructure. Sea level rise is a loss for everyone; there are no benefits, just adaptation costs that increase dramatically as the rise becomes more rapid. – gavin]

    3. Can we maintain the status quo with declining production of oil (Peak Oil)? Are our current cities, agriculture, transportation modes and so on sustainable when oil is unavailable?

    [Response: Don’t know. But there is a lot of fossil fuel in non-traditional sources (oil sands, tar shales, methane hydrates etc.). – gavin]

    4. If we were able to limit anthropogenic warming and CO2 to 350ppm, would we then maintain this status quo? Will there be any sort of time lag between 350ppm and the “ideal” climate you wish to retain? Might it get quite a bit hotter before it cools down, thus forcing us to change our cities, infrastructure and so on anyway?

    [Response: Depends on the trajectory, and sea level is very much a lagging indicator. – gavin]

    5. Suppose it started to get demonstrably cooler (after all, we are in an interglacial that should have ended some time ago), would increasing CO2 then be your recommendation? Is your climate model sufficiently accurate and precise to be able to support the idea that CO2 causes cooling or warming?

    [Response: You don’t need climate models to understand why CO2 causes warming – basic physics and a little bit of spectroscopy does the trick nicely. If we were cooling for some reason, increasing CO2 would still not be smart. Much more powerful GHGs are available that don’t interfere with ocean chemistry or the carbon cycle (various HFCs etc.). But this is not a subject of current concern. – gavin]

  32. John Magurn says

    26 Nov 2009 at 11:28 PM

    One group of scientists maintains they have data that proves there is a warming trend, while other scientists maintain there has been no warming since 2000 and cooling since 2005. Some cite Arctic ice retreat while others cite Antarctic ice growth. Since both sides have the same data with which to work, I can only conclude global warming/cooling trends are based on opinion and not the facts. Therefore, it would be foolhardy to take actions based on this inconclusive “science”.

    [Response: I suggest you look at the data yourself. – gavin]

  33. EL says

    26 Nov 2009 at 11:31 PM

    On GOTO statements:

    Just about every single computer program uses GOTO statements. Although most modern languages attempt to restrict the statement from being used by programmers, the compiler is still using GOTO statements behind the scenes. The GOTO statement is equivalent to the JMP statement in assembly language. There are certain problems where a GOTO statement is the best solution. For example, a GOTO statement may offer a quick way out of a very deep nest. In fact, many languages offer restricted GOTO commands for such purposes. For example, C++ offers the break command so that programmers can terminate a loop early. The break command is essentially a goto statement, and will be a jump statement in assembly. Also, some switch statement optimizations by compilers create a jump table in order to speed things up.

    “Furthermore, just because code is well written does not mean that it does not hide bugs”

    The proper way to code is never done by anyone. Quite a few computer scientists have been bitching about it. All bugs in software can be completely and totally eliminated by using mathematical rigor. Nobody proves their code; as a result, software is plagued with bugs and security flaws. The proper way to code is to prove the entire code mathematically. Comments are also nice for keeping up with direction and flow of the program.
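
    As an illustration of the “quick way out of a very deep nest” point above, a minimal sketch in Python (which deliberately omits goto; the helper function is invented for illustration): a plain return does the structured jump.

    def find_first_negative(grid):
        # The return is the structured escape from the nested loops.
        for i, row in enumerate(grid):
            for j, value in enumerate(row):
                if value < 0:
                    return i, j
        return None

    print(find_first_negative([[3, 1], [2, -7], [5, 4]]))  # (1, 1)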

  34. AC says

    26 Nov 2009 at 11:46 PM

    [Response: National Met Offices around the world (though not the US) often have a mandate to commercialize their data and their forecasts. Thus they often provide academics with data on the understanding that it won’t be passed on to third parties which might undercut their commercial services. I think this is mostly unnecessary, but it is hard to change. Write to your representatives and tell them. – gavin]

    I think it’s clear from the emails that at least some scientists at CRU expressed an interest in actively searching for reasons to deny FOI requests.

    Your claim would carry much more weight if you could provide a link to a statement from a decision maker at just one of these National Met offices to the effect that, yes, they explicitly precluded the further distribution of data they had provided, or that if the data they provided were redistributed, they would no longer provide such data. Without that, surely you realize that you run the risk of being seen in the same light as these emails are – as someone looking for a reason to say no.

  35. conversefive says

    27 Nov 2009 at 12:57 AM

    “Distribution for Endorsements —
    I am very strongly in favor of as wide and rapid a distribution as possible for endorsements. I think the only thing that counts is numbers. The media is going to say “1000 scientists signed” or “1500 signed”. No one is going to check if it is 600 with PhDs versus 2000 without. They will mention the prominent ones, but that is a different story.”

    Do you have a link to a document which lists either the 1000, 600, 1500 or 2000 endorsers referenced in the above section of email? I would like to check their credentials.

    [Response: This seems to be related to a possible statement prior to the Kyoto discussions in 1997. I have no idea whether this was set up or who signed it. Sorry – just too long ago. – gavin]

  36. Mark Sawusch says

    27 Nov 2009 at 1:22 AM

    correction to the first question asked in my post #662:

    1. why didn’t the papers reveal the “fudge factor”(s) – not my words, the programmer’s – used to produce the calibration of tree ring density? They happen to show an almost exponential rise post-1939 (granted, level at the highest “fudge factor” between 1979–1999). I find no justification or explanation of this.

    I made at least 3 errors interpreting the code in briffa_sep98_d.pro per my original post:

    errors 1 & 2: the program’s data sets actually appear to start in 1400, with the first containing all data 1400–1903; so here are the 19 remaining consecutive data subsets by first year, with the “fudge factors” and the “fudge factors” × 0.75, which is really what was used:

    Year Fudge Factor Fudge Factor*.75
    1400 0 0
    1904 0 0
    1909 0 0
    1914 0 0
    1919 0 0
    1924 -0.1 -0.075
    1929 -0.25 -0.1875
    1934 -0.3 -0.225
    1939 0 0
    1944 -0.1 -0.075
    1949 0.3 0.225
    1954 0.8 0.6
    1959 1.2 0.9
    1964 1.7 1.275
    1969 2.5 1.875
    1974 2.6 1.95
    1979 2.6 1.95
    1984 2.6 1.95
    1989 2.6 1.95
    1994 2.6 1.95

    3. the program does print only the “Northern Hemisphere MXD” without the “VERY ARTIFICIAL CORRECTION FOR THE DECLINE.” The command to plot the “fudged” data is commented out. But again, referring to the second paper’s admission that this is a “rather” “ad hoc” approach, I ask my question #3: although the 2 papers only mention “adjusting” the data for purposes of obtaining a “final calibration”, is it not true that when a “VERY ARTIFICIAL” adjustment is used to create a “VERY ARTIFICIAL” calibration, and this “VERY ARTIFICIAL” calibration is then applied to the raw data, what you end up with is “VERY ARTIFICIAL” data, i.e. GIGO? Just asking.

    [Response: I have it on good authority that this ‘correction’ was never used in any paper, and was simply an internal test to see what it looked like. Thus the strong language in the code warning against its use, the fact that application of this correction is commented out, and indeed, its absence in the literature. – gavin]

  37. Hank Roberts says

    27 Nov 2009 at 1:30 AM

    > your claim would carry much more weight if you could provide a
    > link to a statement from a decision maker at just one of these
    > National Met offices

    Are you perhaps suggesting a fishing expedition in their correspondence?
    You have seen CRU’s statement that they are trying to get permission.
    What more do you think anyone here at RC can provide you, and how?

  38. Duae Quartunciea says

    27 Nov 2009 at 1:38 AM

    In this comment (now #770), AC says:

    I think it’s clear from the emails that at least some scientists at CRU expressed an interest in actively searching for reasons to deny FOI requests.

    For certain vexatious FOI requests, yes, they did. As long as the reasons are legitimate and legal, what is wrong with that? Hint for those who need it spelled out: NOTHING.

    See “Vexatious and repeated requests”, a guide at the Ministry of Justice. You can’t just declare something to be vexatious, of course; it still takes time to evaluate everything properly. But checking for this and looking for ways to limit the scope of sweeping requests is perfectly ethical and legal, and the problem is recognized in the legislation.

    This is certainly not the same as seeking to deny all FOI requests; nor can that be presumed. The problem these scientists faced was a massive flood of poorly presented FOI requests from closely related applicants, for which there was good reason to assume bad faith on the part of the applicants and that the requests were part of a wider deliberate harassment campaign. Many demands were manifestly excessive, and with so many of them arriving all at once, looking for reasons to reject them is the right thing to do, as long as it is all done in proper compliance with the legislation and with the legal obligation to comply with a valid request.

    The CRU has been working to try to make the datasets more available; but they are constrained by their own legal obligations to other data providers. The issues with distributing all the data have been clear from the start, and have quite properly meant that many requests were denied; not that this has prevented ongoing repetitions, appeals and wider fishing expeditions for material of highly dubious relevance, such as sweeping requests for correspondence. There is no automatic presumption that all FOI requests must be actively encouraged; and there are proper procedures for managing them, which were followed.

    Many of the internet discussions appear to presume that any request under FOI must be legitimate and that any failure to give anyone precisely what they want at once brings with it a presumption of guilt or impropriety. That is ridiculous — like so much else about this whole stolen files affair.

    The elephant in the room, which is fundamental if anyone is serious about ethics, is that there has been a campaign of harassment, even before the harassment extended itself into the outright felonious theft of files.

    The University of East Anglia has been overwhelmed with a flood of improper requests which have been properly processed and mostly denied, but which still involve a lot of wasted work and effort in the procedures to evaluate everything. The University is legally and ethically obliged to evaluate and handle all of these properly; and we have no indication whatever that this has not occurred.

    But there is also an ethical issue with misusing the FOI legislation to harass or disrupt an organization… and that has undoubtedly been the case here.

  39. Robert R. Reynolds says

    27 Nov 2009 at 1:45 AM

    As a geologist, I have a much longer perspective of weather than climate science. Admittedly, my understanding is heavily dependent on proxy data. I became an AGW skeptic the moment I read the Kyoto Protocols, because I knew that CO2 content in our atmosphere had reached levels more than 10 times that of today. The Mesozoic was for the most part much warmer than today. Life on land thrived and evolved, and there is no fossil record of extinctions until the meteor strike 65 my ago. Then there is the worldwide temperature decline beginning 50 my ago (attributed to plate tectonics). Ice formed at the poles beginning around 14 my ago. The Pleistocene ice age began 1.75 my ago in the N. Hemisphere. The 6th glacial stage, the Wisconsin, melted back beginning only 13,000 years ago, a mere moment of geologic time. We are now in the 6th interglacial of the Pleistocene. Technically we’re still in a glacial climate. Until we can understand the Pleistocene and its intermittent stages, you are lost in the noise of a probable normal interglacial weather pattern in which the periods of warming appear to be decreasing in intensity. I see no reason to assume our ice age is over, and I believe you have wasted a huge chunk of the world’s money on the supposition that you can micromanage a geologic process.

    [Response: The huge release of fossil carbon at rates ~150 times faster than it was accumulated is not a ‘geologic’ process. Closest Cenozoic analog is possibly the PETM. That doesn’t give you even a moment’s pause? – gavin]

  40. information=data says

    27 Nov 2009 at 2:03 AM

    yep:

    http://www.futerra.co.uk/downloads/RulesOfTheGame.pdf

  41. Ron R. says

    27 Nov 2009 at 2:49 AM

    Anand, just so you know, I do agree with your basic premise that science needs to always strive to be objective and accurate. That its conclusions should not be driven by pre-conceived notions nor influenced by emotion.

    The thing is that climate science meets this criterion as best I can tell, and no one, despite enormous effort, has been able to show otherwise. And its science is such that it has, so far, stood the test of time.

    Does that mean, though, that climate scientists should (even if they could) be emotionless automatons? That they shouldn’t care about the wonderful planet that they have learned, and continue to learn, so much about? Of course not. Every breakthrough in science was, I’m sure, welcomed by its discoverers with emotion; all cared about their work, and that care did not lessen the science one bit.

    You say, “A common misconception is to view humankind to be the custodian of all life on earth.”

    Were we meant to be the custodian of all life on earth? I don’t think so. I do believe, though, that because we have caused and are causing so much destruction, we should, if we have any sense of decency, do what we can to rectify the situation. Not only that: if, for selfish reasons alone, we want to continue on this planet, it would be wise to change our destructive habits.

    I believe that we modern humans have no idea what we have already lost. As an example, here’s a comment from the French explorer Pierre Esprit Radisson, written around 1652, in his description of his journey across parts of the future United States: “The further we sojourned the delightfuller the land was to us. I can say that in my lifetime I never saw a more incomparable country….The country was so pleasant, so beautiful and fruitful that it grieved me to see the world could not discover such enticing countries to live in. This I say because the Europeans fight for a rock in the sea against each other, or for a sterile and horrid country. Contrariwise, these kingdoms are so delicious and under so temperate a climate, plentiful of all things, the earth bringing forth its fruit twice a year, the people live long and lusty and wise in their way.” How much has changed since then. If we go back even further, 10 or 12 thousand years ago, before the Holocene extinction, we’d find the earth a wondrous place. Anything but dull.

    Today though we are disconnected from nature and thus it’s no wonder that so many people seem to have no feeling for it. Check out this page, http://tinyurl.com/ybsfz6d.

    You don’t seem to think that life is worth living, and to me that’s sad. Yet instinctually most living things seem to want to live, and will fight to survive if it comes to that. Why? The triumph of life on earth, science shows, has been one of evolving in order to survive an ever-changing planet. Again, why?

    Could I recommend that you, if possible, at least temporarily toss your cynicism, find some local trails and start hiking? With time you might begin to find that there is more to nature than meets the eye. You might begin to understand why some people fight so hard to preserve it. That in addition to a brain we also have a heart.

  42. aceq says

    27 Nov 2009 at 3:01 AM

    J. Bob re: “Remember that post showed a peaking of the Hadcet and E. England data.”

    No, to me it hasn’t! I work with time series every day (market price data), and I see you are approaching the data simply wrongly: you can’t use a 14-year average (no matter what you call it) to claim anything about longer-term trends. And you can’t look at a graph of only a few years and claim anything.

    You don’t need any program or advanced calculation to see from the “raw” data that:

    1) we’re in the hottest period of all since long ago

    2) there’s no indication of any long-term tendency of the trend to change.

    If you look at any plot, it’s obvious that in the short term (14 years is very short term) there can be downward movements, but they really don’t mean anything for the longer term. The last hundred years have significantly raised the average temperature. Short term, it’s normal that there are oscillations, but that means nothing for the longer trend. You didn’t “show” anything.
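    To make the point concrete, here is an illustrative C++ sketch on purely synthetic data (all numbers invented for the example): a 14-point moving average still tracks the short oscillation, while a least-squares fit over the full record recovers the underlying trend:

    #include <cmath>
    #include <cstdio>
    #include <vector>

    int main() {
        const double pi = 3.14159265358979;
        const int n = 150;
        std::vector<double> t(n), y(n);
        for (int i = 0; i < n; ++i) {
            t[i] = i;
            y[i] = 0.01 * i + 0.3 * std::sin(2 * pi * i / 20.0); // trend + cycle
        }

        // A 14-point average of the latest window is dominated by the cycle.
        double ma = 0;
        for (int i = n - 14; i < n; ++i) ma += y[i] / 14.0;

        // The ordinary least-squares slope over the whole record is not.
        double st = 0, sy = 0, stt = 0, sty = 0;
        for (int i = 0; i < n; ++i) {
            st += t[i]; sy += y[i]; stt += t[i] * t[i]; sty += t[i] * y[i];
        }
        const double slope = (n * sty - st * sy) / (n * stt - st * st);

        std::printf("last 14-pt average: %.3f, full-record slope: %.4f/step\n",
                    ma, slope);               // slope comes out near 0.01
        return 0;
    }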

  43. ccpo says

    27 Nov 2009 at 3:41 AM

    Comment by Joe V. — 26 November 2009 @ 10:23 PM

    I’m no scientist, but what do you see there? I see nothing. Enlighten me.

  44. sHx says

    27 Nov 2009 at 5:07 AM

    Halldór Björnsson of the Icelandic Met. Service:
    “The ideal, that all data should be free and open is unfortunately not adhered to by a large portion of the meteorological community. Probably only a small portion of the CRU data is “locked” but the end effect is that all their data becomes closed. It is not their fault, and I am sure that they dislike them as much as any other researcher who has tried to get access to all data from stations in region X in country Y.

    These restrictions end up wasting resources and hurting everyone. The research community (CRU included) and the public are the victims. If you don’t like it, write to your NMSs and urge them to open all their data.”

    That the CRU cannot release all of its data due to confidentiality agreements with various national meteorological authorities has been mentioned many times by Gavin, as well as many other commenters elsewhere. Apologies if the following questions have been asked before, but I am very curious to know answers to these questions:

    1- Why should the whole data set be closed for examination by other researchers if only a small portion of it is “locked” by confidentiality agreements? As a lay person, I can see that there may be problems with incomplete data, but I cannot understand why the data not covered by commercial confidentiality agreements cannot be released.

    2- What percentage of the data is “locked”, and what percentage of all the national meteorological authorities prevent the CRU from releasing their data? These questions are relevant because – and I am speculating here – if only 10 percent of the data is covered by confidentiality agreements with 10 or 20 national meteorological offices, the CRU’s excuse for not making it available weakens considerably. I am wondering whether the cost of breaching such an agreement, following an FOI request, is really so much greater than its potential benefits to climate science.

    3- Which national meteorological offices prevent the CRU from releasing its data? Has the CRU so far mentioned the name of any particular national met office that makes its data available only on a commercial basis? All the CRU has to do is release whatever data is not covered by commercial confidentiality agreements AND provide the names of the other met offices with which it has such agreements. The missing part could then be purchased separately by other researchers who want to conduct a review of the science produced by the CRU.

    4- And now the question that I’m most curious about. How can it be that, despite more than 20 years of international cooperation on climate change – with a potentially binding agreement next month as a result – there has been neither international cooperation nor an agreement between national meteorological authorities to make their data freely available to all researchers?

    It is quite astonishing that Halldór Björnsson of the Icelandic Met. Service would advise us to write to our NMSs and “urge them to open all their data”. This is something the scientists who are now urging action on climate change should have done themselves decades ago. That there is no simple, international “open data” agreement already in place is a finding I did not expect. Climatologists could learn something from the genetic scientists who made the Human Genome Project such a shining example of international scientific cooperation.

  45. Mark says

    27 Nov 2009 at 5:31 AM

    Hi,

    Well, as a result of the other thread on the email hack I have a question about global warming (I could have posted there, but with 778 posts I suspect no one would notice my question).

    [Response: moved back to be on-topic. sorry – gavin]

    There is something I do not understand about this business with using tree-rings as temperature proxies.

    From what I can tell there is this “divergence problem” where the tree rings do not reproduce the measured temperature (with thermometers) after 1960. I found the 1998 Nature paper on this, which seems to say no one really understands why this is, and presents some speculation as to the cause.

    Now my question is, given all of this: why should I believe the tree rings are a good proxy from the year 1000–1850 (I think 1850 is when the temperature record starts) if we do not understand why they do not work from 1960–2009? How can we be sure the same problem is not present at earlier times, thus rendering the reconstructed temperature record invalid?

    Looking in Wikipedia I see a hockey stick graph with 10 different temperature reconstructions – so I guess many independent groups have made this graph and got compatible answers? Are a sizable fraction of these graphs *not* using the tree-ring proxy, but some other more reliable proxy?

    Does each graph use just one proxy or multiple proxies? If the latter, why include the tree-ring one at all if it is not reliable, and instead just use all the others if they are more robust proxies?

    Thanks,

    Mark

  46. simon abingdon says

    27 Nov 2009 at 5:40 AM

    #729 “Computer scientists and applied computer scientists (programmers) are two very different animals. Many programmers are autodidacts… Programmers tend to confuse mathematical models with their idea of programming.”
    #747 “Poorly-structured code is harder to read, harder to debug, harder to maintain but none of this makes it *impossible* for poorly-structured code to work properly.”
    #752 “what makes it practically impossible to thoroughly debug a program is complexity, particularly where there is the potential such interaction between different parts of the program that the possible sequences of events grows super-exponentially”
    #756 “Regarding spaghetti code, the vast majority of software of any kind is badly written. And yet it works (more or less). Furthermore, just because code is well written does not mean that it does not hide bugs.”
    #762 “The programming language is nowhere near as important as the programmer in building structured code”
    #772 “My experience with scientists and development and coding is that they should do the algorithms but probably SHOULD leave the rest to software engineers.”
    #776 “My experience has been that academic code tends to be less maintainable/well-factored/engineered than code in the private sector.”
    #777 “The proper way to code is never done by anyone.” “… software is plagued with bugs and security flaws.”

    All a bit worrying really.

  47. Jaydee says

    27 Nov 2009 at 5:48 AM

    “All bugs in software can be completely and totally eliminated by using mathematical rigor. ”

    Well, up to a point. If you prove your code, you know it is doing what the spec says it should be doing. It doesn’t mean that the spec was correct.

  48. berkeley_statistician says

    27 Nov 2009 at 5:56 AM

    Dear Gavin,

    I would like to understand your claim repeated above that the data wasn’t Jones’ to give. In the emails Jones is directly quoted as saying that he’d rather delete the data than release it, and that others should “hide behind” various things in order to not release data.

    1107454306.txt 1106338806.txt

    [Response: Not every FOI request is valid and other laws and various exemptions also need to be taken into account. Plus after ~100 FOI requests, most of which are asking for the same data that was justifiably rejected back in 2007, I think it is understandable that Jones was frustrated and defensive. This isn’t necessarily the ‘proper’ reaction but I see no evidence that this had any material effect. – gavin]

  49. Didactylos says

    27 Nov 2009 at 5:59 AM

    All bugs in software can be completely and totally eliminated by using mathematical rigor. Nobody proves their code; as a result, software is plagued with bugs and security flaws. The proper way to code is to prove the entire code mathematically.

    This made me laugh! Clearly you are unfamiliar with the halting problem, and any number of other undecidable problems. If all software was “provable”, then all our lives would be a good deal simpler.

    Technicalities aside, it is true that small parts of software can be rigorously analysed and tested. If NASA, Hadley, and others had unlimited resources, I’m sure they would invest more in regression testing. But, given that the model code is mostly public, sniping is pointless. If you have time or money to spare, put it where it will do most good.

    Personally, I differentiate between code used to solve a problem, and code that is intended for wide release and everyday use. It annoys me no end when academics venture into the popular software arena, and don’t see any problem in using the same practices and techniques that they would use in the lab. But this gripe has nothing to do with climate modelling. I just wish more people were able to make this distinction.

  50. antonyclark says

    27 Nov 2009 at 6:43 AM

    on GOTO (re EL, 777)
    I use a programming language that doesn’t have GOTO statements. It does have a BREAK statement, but it does not have a RETURN statement.
    A BREAK statement is NOT essentially the same as a GOTO or RETURN. A BREAK statement takes you to the end of the scope of a loop; it does not take you out of the scope of the loop. Consequently, a BREAK statement does not cause a change of scope. GOTOs and RETURNs can take you almost anywhere.
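    A small C++ illustration of the distinction (antonyclark’s language is unnamed, so this is only an analogy): in C++, break terminates the loop and control resumes at the first statement after it, while goto can jump to any label in the enclosing function:

    #include <cstdio>

    int main() {
        for (int i = 0; i < 10; ++i) {
            if (i == 3)
                break;              // leave the loop; control resumes just below it
            std::printf("i = %d\n", i);
        }
        // break lands here, immediately after the loop

        for (int j = 0; j < 10; ++j) {
            if (j == 3)
                goto cleanup;       // jump to an arbitrary label, possibly far away
            std::printf("j = %d\n", j);
        }

    cleanup:
        return 0;
    }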
