RealClimate

Comments


  1. My comment as an Engineer, not a scientist:

    Has any work been done to study the biological response to higher levels of CO2? I know as a woodworker that present-day wood grain is inferior to that of 100 years ago because of the trees' faster growth rates. The growth rings are so wide, the wood isn't as hard. I would think that since plants feed mainly on CO2, this would have a huge effect. I know just the volume of water that trees introduce into the atmosphere is astounding. It may be related to the increases in rain and storms. Could the amount of water cycling have been greatly increased by 'increasing the plants' metabolism'? Not to mention the plankton and algae.

    I also recall reading somewhere that records and ecological data from 1800-1900 show about a 0.8C warming trend, while 1900-2000 was about 0.4C. The stated reason was coming out of an ice age 10,000 years ago, plus a mini ice age in early medieval times. The kick-off of the Industrial Revolution was not really in gear until the 1870s and has been increasing ever since. Why would the possibly natural temperature increase slow during the increase in CO2, CFCs and NOx? Wouldn't the current theories call for an acceleration?

    Why isn't the pressure for change being placed where it is most needed: on growing third-world industrial nations like China, India, Korea, etc.? The US was a revolutionary force in automotive clean air acts, scrubbers, toxic cleanup, etc. Our pace has slowed as the costs climb along the bleeding edge of cost/results. I have worked in the auto industry all my life and remember the OEMs blocking off the EGR valves and eliminating the catalytic converters in cars sold to Canada and Europe all during the 1970s.

    Another thing that bothers me with news reporting is the thought that somehow electric cars are a solution. Well, let's see: first you generate electricity from coal, nuclear or gas (efficiency losses >30%). Pump it 30 miles to your home through resistive wires (efficiency losses >40%). Run a battery-charger circuit and charge a bank of 16 batteries (lead-acid or metal-hydride; ecological disasters) at efficiency losses >35%. Then discharge the batteries to run an electric motor (efficiency losses >15%). So basically you need to generate 3.5 times the electricity used to run the car to replace a gasoline engine with a >58% total efficiency and better emissions than the power plant? (Guesses on efficiency.)
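
    For what it's worth, here is how a chain of losses like the one guessed above compounds arithmetically. The stage figures below are the commenter's guesses, not measured values (real transmission losses, for instance, are far smaller), so this is a minimal sketch of the arithmetic only:

    ```python
    # Well-to-wheel sketch using the guessed losses above (illustrative, not real data).
    # Each stage passes on (1 - loss) of the energy it receives.
    stage_losses = {
        "generation": 0.30,       # coal/nuclear/gas plant (guess above)
        "transmission": 0.40,     # guess above; real grids lose far less (~5-8%)
        "charging": 0.35,         # charger circuit + battery (guess above)
        "discharge/motor": 0.15,  # battery to wheels (guess above)
    }

    efficiency = 1.0
    for loss in stage_losses.values():
        efficiency *= 1.0 - loss  # losses multiply, they don't add

    print(f"overall efficiency: {efficiency:.1%}")                           # about 23%
    print(f"generation needed per unit at the wheels: {1/efficiency:.1f}x")  # about 4.3x
    ```

    (Taken at face value, the guessed stages compound to a somewhat larger multiplier than the 3.5x stated, which is the point of multiplying the losses out.)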

    I'm sorry, but public transportation and bio-diesel make 100 times more sense. I hope a fuel cell that doesn't use 5 kg of platinum (at 5 times the cost of gold) and actually burns something other than gasoline or diesel can be developed! How about alcohol as a hydrogen base? I know, I know, too much bacterial methane gas. Why not run them on cow farts?

    Just a few thoughts,
    Jeff

    Comment by Jeff Simchak — 12 Aug 2005 @ 12:59 AM

  2. Thanks for providing some really interesting background information on the radiosondes! I’m also interested in the correction procedures you mention for the “heat island” effect on surface temps; if you could provide a pointer to an article/book/website providing more detail on those corrections, I’d greatly appreciate it.

    Comment by Armand MacMurray — 12 Aug 2005 @ 1:35 AM

  3. Armand,

    A good review article on this is Peterson et al., Int. J. Climatol., 1998, p 1493.
    There are more recent articles by Tom Peterson and by David Parker putting the homogenized surface record through some tests designed to expose problems (they didn’t find any). Roger Pielke has published several recent papers raising heat-island and land-use related worries as well as microclimate exposure issues and showing the impact that they can have in isolated cases.

    Comment by Steven Sherwood — 12 Aug 2005 @ 10:50 AM

  4. I'd be interested to hear your take on the accuracy of the radiometers produced by Radiometrics. I'm an engineer, not a meteorologist, but it does seem that, while tricky to operate, these instruments could overcome some of the radiosonde problems. I notice that they use the radiosondes for a reference too, although I think that may primarily be as a comparison basis to prove that their technology works.

    http://www.radiometrics.com/pubs.htm

    Comment by Russ — 12 Aug 2005 @ 11:31 AM

  5. Re: #2, if you go to the library and use ISI, you'll find a paper using the DMSP to estimate the extent of the UHI, and way back in 1989 you'll find a paper by Karl that discusses corrections. But an interesting new paper offers, to me, the best thinking: Jin, M.L., R.E. Dickinson, and D.L. Zhang, 2005: The footprint of urban areas on global climate as characterized by MODIS. J. Climate, 18(10), 1551-1565. An interesting way to go about solving the problem.

    Best,

    D

    Comment by Dano — 12 Aug 2005 @ 11:40 AM

  6. This is really interesting. I was going to respond “tongue-in-cheek” on another site to a contrarian who keeps claiming the antarctic has been cooling (ergo GW is not happening), with, “Well, maybe they used faulty thermometers, especially in the past when thermometer technology wasn’t so advanced.” But, of course, I understood that would also apply to all other places around the world, as well….

    Comment by Lynn Vincentnathan — 12 Aug 2005 @ 11:47 AM

  7. Wow, what a mess of difficulties. It will be very hard, for me at least, to ever have much faith in measurements that require so much correction and adjustment. In a case like this what will happen is these readings will be poked and prodded and nudged until they finally do come in line with model expectations at which point people will stop looking for new biases and new error mechanisms. But this will not be a good indicator that all of the problems have been found, only that everyone is comfortable with the results.

    Thank you for that illuminating article; I don't know what it will take for me to ever care again what the radiosondes say.

    I was wondering if it would not be possible, in a practical sense or even in just a theoretical one, to have enough stations situated in mountain ranges and island volcanoes to get a more reliable view of lapse rates and warming trends (starting now, of course) in the lower part of the troposphere. Are the weather dynamics too overwhelming to get any useful readings this way? Is that just too small a layer to be useful?

    Comment by Coby — 12 Aug 2005 @ 11:50 AM

  8. I’m wondering if this article is related to what I just read about the troposphere found to be actually warming & fitting the CC models, now that 2 teams of scientists have corrected for the satellite orbit.

    http://www.climateark.org/articles/reader.asp?linkid=45118

    Comment by Lynn Vincentnathan — 12 Aug 2005 @ 12:03 PM

  9. Re #7:

    Neither the sonde programs nor the MSU units were designed to detect long-term trends. They are enormously useful in other applications. The design and deployment of these instruments should not be criticized on the grounds that they are not especially useful for purposes for which they were not designed.

    The fact that the NCEP reanalysis is implicitly calibrated to a drifting (biased) instrumental record is something I had not heard discussed previously, though. It seems this should be a matter of some concern in studies of the long-term record.

    Comment by Michael Tobis — 12 Aug 2005 @ 12:55 PM

  10. Geez, it sucks that climatologists can't hang their hats on at least one line of evidence. I know it only relates to a certain part of the troposphere and there is probably contamination from surface effects, but does this emphasize to a greater extent the importance of tropical glaciers in understanding tropospheric trends over the past 100 or so years? Do the new calibrations change the understanding of the tropical glacier data? Here is a quote from another RealClimate post (see here http://www.realclimate.org/index.php?p=157#more-157):

    {Kaser et al also argue that surface and mid-tropospheric (Kilimanjaro-height) temperature trends have been weak in the tropics, in “recent decades.” One of the papers cited in support of this is the analysis of weather balloon data by [Gaffen et al, 2000], which covers the period 1960 to 1997. It is true that this study shows a weak (cooling) trend in mid-tropospheric temperatures over the short period from 1979-1997, but what is more important is that the study shows a pronounced mid-tropospheric warming trend of .2 degrees C per decade over the full 1960-1997 period. Moreover, few of the sondes are in the inner tropics, spatial coverage is spotty, and there are questions of instrumental and diurnal sampling errors that may have complicated detection of the trend in the past decade. Analysis of satellite data by [Fu et al, 2004] reveals a tropical mid-tropospheric temperature trend that continues into the post-1979 period, at a rate of about .16 degrees C per decade. When one recalls that tropical temperatures aloft are geographically uniform, this data provides powerful support for the notion that East African glaciers, in common with others, have been subjected to the influences of warming. Set against this is the surface temperature record from the East African Highlands, reported by [Hay et al 2002]. This dataset shows little trend in surface temperature over the location covered, during the 20th century. However, surface temperature is more geographically variable than mid-tropospheric temperature, and is strongly influenced by the diurnal cycle and by soil moisture. The large decadal and local variability of surface temperature may have interfered with the detection of an underlying temperature trend (more “noise” less “signal”). It is unclear whether this estimate of temperature trend is more relevant to Kilimanjaro summit conditions than the sonde and satellite estimate.}

    Comment by Steve Latham — 12 Aug 2005 @ 1:08 PM

  11. re #9

    Neither are surface stations. Homogenisation works nicely in densely monitored areas like Europe and the US, but breaks down in sparse areas. Have a look at the GHCN in the tropics…

    Comment by Hans Erren — 12 Aug 2005 @ 7:51 PM

  12. In response to #1: You should recognize that while the Clean Air Act, catalytic converters, scrubbers, and all that stuff are great for cleaning up conventional air pollutants, they do nothing (or at least very little) to reduce the emissions of greenhouse gases. Unfortunately, the emission of CO2 is an inevitable byproduct of the combustion of fossil fuels (or most any organic matter)…It is not just a product of incomplete combustion (or of contaminants in the fuel) like pollutants such as CO, SO2, and NOx. Thus, the only way to reduce the amount of CO2 going into the air is to burn less fossil fuel…or to learn how to sequester the CO2. And, while it is true that the developing nations tend to have the highest RATE of growth of greenhouse gas emissions, it is still the developed world…and the U.S. in particular…that has the highest greenhouse gas emissions per capita. We are also responsible for the lion's share of the rise in CO2 levels that has already occurred, and we have better technology. These are all natural reasons why the developing world would expect us to go first in trying to stabilize and reduce emissions.

    It is true that you have to look at the whole lifecycle of an electric car to be sure it is more efficient. However, it does afford some additional advantages because it allows for the possibility that the electricity could be produced without CO2 emissions (by renewable resources like solar or wind) or that emissions from the electricity production could be more easily sequestered. And, it also makes it easier to have really good anti-pollutant controls at the site of the electricity generation so there are less of the ordinary pollutants. [Another advantage of electric...or hybrid...cars is the ability to easily recapture some of the energy of motion when braking the car and convert it into electricity that you can then use to run it.]

    Note, by the way, that fuel cells are not a magic bullet either as the production of hydrogen also requires energy. So, the issues with fuel cell powered cars and electric cars are actually quite similar.

    Comment by Joel Shore — 12 Aug 2005 @ 10:55 PM

  13. Steve,

    May I disagree with the conclusion? There are and were problems with all kinds of temperature records: satellite, radiosonde and surface data alike. Thus even if the satellite data are now corrected and more in line with the expectations of the models, one needs to check whether the remaining discrepancy is not caused by problems with the surface data.
    To give you an idea, just look at any GISS surface data series around the equator (where the largest discrepancy was found):
    Look e.g. at the data for Salvador, a town of 1.5 million inhabitants. That should be compared with rural stations to correct for the urban heat island effect. But the nearest rural stations are 458-542 km away from Salvador (Caetite, Caravela, Remanso), and their data are so spurious that it is impossible to deduce any trend from them. Quixeramobin is the nearest rural station with more or less reliable data over a longer time span, and it shows very different trends than Salvador. Or look at Kinshasa (what a mess!) with 1.3 million inhabitants, or Brazzaville (across the Congo River), and anything rural in the neighbourhood (Mouyondzi – 173 km, M'Pouya – 215 km, Djambala – 219 km,…). East Africa is no better: compare the "trends" of Nairobi with those of Narok, Makindu, Kisumu, Garissa,…
    Rural data trends with some reliability over a longer time span are very rare in the whole tropics. Usually only fast-expanding towns have longer data sets, and those are hardly correctable. The unreliability of the data in the tropics is so obvious that one can wonder how a global surface temperature trend can be calculated to any accuracy…

    Comment by Ferdinand Engelbeen — 13 Aug 2005 @ 8:56 AM

  14. Jeff Simchak: I anticipated some of your objections in a piece just a few days ago. Click my name and the hotlink will take you straight to my brief analysis.

    Comment by Engineer-Poet — 13 Aug 2005 @ 2:32 PM

  15. This discussion is blowing smoke. This .org is doing a poor job of presenting science. Without data tables and figures no one can analyze the data – if in fact there are data. One comment makes sense – comment 13. The surface data for the tropics look pretty unreliable. Do you have data to prove otherwise?

    And back to smoke. There have been a number of articles about the role of smoke in heating the layer below the tropical inversion. Inversions complicate the analysis of the lapse rate, especially when smoke is added. It isn’t simple thermodynamics. The assertion that an undergrad would understand the problem is an attempt to intimidate the non-scientist and bully those who have different perspectives.

    There are a number of strong lines of evidence of “global warming” that I don’t dispute. For example, the sea ice data and ocean temperature data are looking more and more convincing with time. However, asserting that most climate scientists don’t think that there are problems with the surface data for the tropics is not a scientific argument. It’s an opinion. If you want to convince people, try using science.

    [Response: I would point out that if you look at the combined ocean and land data for the tropics (available at the GISS web site), the ocean (still part of the surface after all) shows significant and widespread warming. Since the ocean is actually the majority of the surface in this region, problems with the (admittedly less than perfect) land stations and continental aerosol effects are secondary issues. Aerosols, since they absorb as well as reflect, act to warm the atmosphere with respect to the surface, though, and should therefore push the system in the 'wrong' way for your argument. -gavin]

    Comment by George — 14 Aug 2005 @ 5:57 AM

  16. Reply to Ferdinand Engelbeen, #13

    You have written in #13:

    "There are and were problems with all kinds of temperature records: satellite, radiosonde and surface data alike. Thus even if the satellite data are now corrected and more in line with the expectations of the models, one needs to check whether the remaining discrepancy is not caused by problems with the surface data."

    To my understanding, the discussion about the surface temperature was settled a long time ago.

    It is therefore only a matter of reading RealClimate:

    http://www.brighton73.freeserve.co.uk/gw/temperature.htm#urbanheatislands

    or reading Tom Rees:

    http://www.brighton73.freeserve.co.uk/gw/temperature.htm#urbanheatislands

    When you have a very large dataset, it is possible by cherry-picking to find outliers that indicate "there is something rotten". Using this method you are, to my understanding, suggesting that there has been an undetected "gross error" in the methods used to calculate the surface temperature and that the statisticians have not done a good job.

    From a theoretical point of view it could be the case. However, it is unlikely, since so many resources have been devoted to analysing the temperature record and so much has been published on this subject.

    I am sure RealClimate will be able to provide a comprehensive list of references.

    Since this subject will come up again and again, there is a need for a presentation of it aimed at 1) journalists, 2) laymen, 3) scientists and 4) statisticians.

    Comment by Klaus Flemloese, Denmark — 14 Aug 2005 @ 6:54 AM

  17. re: #16
    Klaus, the main problem with sparse surface data is inhomogeneity. You can't solve that when you don't have neighbours to compare with. I know that UHI is not an issue in the US and Europe, because these are also the regions where sat and surf agree best. It's the rest of the world where the problems (oops, challenges) are.

    Comment by Hans Erren — 14 Aug 2005 @ 5:32 PM

  18. Klaus, re #16:

    I am sure that the UHI problem is largely resolved in developed countries, as there are a lot of rural stations which can be used to compensate for the UHI of large towns (there are some residual individual and regional problems, like irrigation in valleys, but that doesn't influence the general trend that much). The problems arise in less developed countries, especially in the tropics, where the largest discrepancy was seen. In nearly all of these countries there are simply very few reliable rural stations; mostly there are more or less reliable measurements in fast-expanding towns.
    No statistician is able to make something reliable from unreliable data.

    A little challenge for you: just count the number of rural stations in the vicinity of urban stations in the 20N-20S (or 30N-30S) band that produce something useful in the 1979-2005 period of interest…
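
    As a minimal sketch of how one could actually run that count, assuming a station table with latitude, longitude and an urban/rural flag (the field names here are invented for illustration):

    ```python
    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in km."""
        R = 6371.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        a = (math.sin((p2 - p1) / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(math.radians(lon2 - lon1) / 2) ** 2)
        return 2 * R * math.asin(math.sqrt(a))

    def rural_neighbour_counts(stations, max_km=250.0, band=20.0):
        """For each urban station within `band` degrees of the equator, count the
        rural stations lying within max_km. `stations` is a list of dicts with
        (hypothetical) keys 'name', 'lat', 'lon' and boolean 'rural'."""
        tropics = [s for s in stations if abs(s["lat"]) <= band]
        rural = [s for s in tropics if s["rural"]]
        return {
            u["name"]: sum(1 for r in rural
                           if haversine_km(u["lat"], u["lon"], r["lat"], r["lon"]) <= max_km)
            for u in tropics if not u["rural"]
        }
    ```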

    Comment by Ferdinand Engelbeen — 14 Aug 2005 @ 5:41 PM

  19. Ferdinand, the long-term trends from urban stations aren't used to create the gridded dataset (only annual-scale fluctuations). All the long-term trends are from rural stations. What is happening to the trends from stations identified as urban is irrelevant for this discussion. For more information on how the UHI effect is removed from the GISS analysis, see Hansen et al., 2001.

    Therefore, the only questions are: 'Are the rural stations correctly identified?' and 'Are there other, systematic errors in the rural trends?'. There is good evidence that the rural stations are correctly identified and that there are no such systematic errors: the insensitivity of the results to the methodology of selecting rural stations, the Parker et al. windy-days study, and the fact that data from satellite skin-surface measurements, sea surface temperatures, and deep ocean as well as tropospheric temperatures are all in good agreement.

    Comment by Tom Rees — 15 Aug 2005 @ 6:33 AM

  20. Tom re #19:

    As Hansen indeed only used rural stations for his global temperature trend outside the USA, I need to change the challenge: find out the station density of rural stations in the GISS database for the tropics (20N-20S or 30N-30S) where in the 1979-2005 period the data show some reliability… Good luck with that!

    Comment by Ferdinand Engelbeen — 15 Aug 2005 @ 11:11 AM

  21. Just to respond briefly to a few of the comments:

    As Gavin points out, the Tropics are mainly ocean so it is the ocean data, not the land surface data that mainly determine the trend we are talking about there. That said, the independent ocean and land data show roughly consistent warming rates. I did not say anywhere in this piece that the land data were free of problems, or that scientists thought they were, only that most have concluded they are in significantly better shape than the other observations. That conclusion is supported by tests applied to each dataset (see my earlier posting) and thus has a scientific basis.

    The role of smoke (#15) was an obvious candidate for explaining purported lapse-rate changes. However, published impacts are localized, and model calculations do not show a significant effect on lapse rates averaged over the whole tropics. I have a student looking at the possible indirect effects of aerosol on lapse rates. Believe me, I am quite prepared to believe the models are missing something and that is what got me into this area of research in the first place. We are also continuing to look more closely at the radiosonde data to see if we will be able to find evidence for interesting (though perhaps less dramatic than before) lapse-rate changes.

    My comment on thermodynamics was intended to counter what seems to be a prevailing notion that global climate models are so complicated we don’t understand anything they do. In many cases this is true, but some results (like lapse rate) derive from simple physics built into the models (this doesn’t mean it’s correct, but means the implications are greater if it is wrong).

    Comment by Steven Sherwood — 15 Aug 2005 @ 2:51 PM

  22. Reply to Gavin RE:response to comment 15. I agree that the surface in the tropics has warmed. As I wrote, sea surface temperature data show warming. You misunderstood my position and knocked down a straw man.

    My point about smoke concerns the lapse rate. The surface layer below the inversion is warming, in part, because smoke is trapped below (and in) the inversion. This warming below the inversion may be increasing the lapse rate.

    My criticism concerns the original post which states, “In the tropics, simple thermodynamics (as covered in many undergraduate meteorology courses) dictates that it should actually warm faster, up to about 1.8 times faster by the time you get to 12 km or so; at higher latitudes this ratio is affected by other factors and decreases, but does not fall very far below 1. These theoretical expectations are echoed by all numerical climate models regardless of whether the surface temperature changes as part of a natural fluctuation, increased solar heating, or increased opacity of greenhouse gases.”

    I disagree with aspects of this statement because it does not consider the effects of inversions and the complex processes involving water vapor. These processes affect the lapse rate. Moreover, I don’t think that the tone of that paragraph contributes to the purported educational mission of your group because it implies that those who disagree don’t understand elementary thermodynamics. Perhaps I misspeak. That paragraph is so convoluted that it’s easily misunderstood.

    Models can’t be improved if they aren’t critically assessed. It isn’t simple thermodynamics. Emanuel, who has literally written the book on convection, states “But the physics of the processes controlling water vapor in the atmosphere are poorly understood and even more poorly represented in climate models, and what actually happens in the atmosphere is largely unknown because of poor measurements. It is now widely recognized that improvements in understanding and predicting climate hinge largely on a better understanding of the processes controlling atmospheric water vapor. ” http://wind.mit.edu/~emanuel/anthro.html

    In conclusion, the research on radiosonde measurement problems looks promising but it is only a small part of a larger problem of poor measurements and poor models.

    Aloha,

    [Response: The predicted/theoretical lapse rate changes do include water vapour condensation processes (which is why it is different from the dry adiabat of course), and as shown in the Santer et al paper, all data and models agree that this works well for short term (monthly to interannual) variability. It is conceivable that aerosol effects (which includes 'smoke') could also affect the lapse rate, but the aerosols tend to warm where they are located and, depending on the composition, cool below - this gives an impact that - if it was a large factor in the tropical mean - would produce changes even larger than predicted from the moist adiabatic theory. This would make the S+C numbers even further off. Note too that the models do include representations of aerosol changes over this period - though imperfectly. Deciding on whether the models are 'poor', however, depends upon how much trust can be put in details of the data - and as we have seen, there are a number of still outstanding issues (RSS vs. S+C v5.2 for instance) that mean that these data do not lend support to the idea that the models are poor. - gavin]
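
    To make the moist-adiabat numbers concrete, here is a rough sketch: it integrates a saturated parcel upward from two slightly different surface temperatures and compares the warming at 12 km with the warming at the surface. It uses standard constants, Bolton's saturation vapour pressure formula, and a simplified pseudoadiabatic lapse rate (no virtual-temperature or liquid-water terms), so treat the output as indicative only:

    ```python
    import math

    G, RD, CPD, LV, EPS = 9.81, 287.04, 1005.7, 2.501e6, 0.622

    def e_sat(T):
        """Saturation vapour pressure (Pa), Bolton (1980)."""
        Tc = T - 273.15
        return 611.2 * math.exp(17.67 * Tc / (Tc + 243.5))

    def moist_lapse(T, p):
        """Approximate pseudoadiabatic lapse rate (K/m)."""
        rs = EPS * e_sat(T) / (p - e_sat(T))   # saturation mixing ratio
        return (G * (1.0 + LV * rs / (RD * T))
                / (CPD + LV ** 2 * rs * EPS / (RD * T ** 2)))

    def T_at(z_top, T_sfc, p0=1.0e5, dz=10.0):
        """Integrate a saturated parcel from the surface up to z_top metres."""
        T, p, z = T_sfc, p0, 0.0
        while z < z_top:
            T -= moist_lapse(T, p) * dz
            p -= p * G / (RD * T) * dz         # hydrostatic pressure decrease
            z += dz
        return T

    # Warm the surface by 1 K and see how much the 12 km level warms.
    amp = T_at(12000.0, 301.0) - T_at(12000.0, 300.0)
    print(f"warming at 12 km per 1 K of surface warming: {amp:.2f} K")  # near the quoted ~1.8
    ```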

    Comment by George — 15 Aug 2005 @ 6:47 PM

  23. #6 antarctic has been cooling (ergo GW is not happening) by Lynn Vincentnathan

    Peter Doran’s research (no relation) shows warming and cooling.

    I think I can explain. Oceans that are warmer are more conductive – about a percent more conductive per degF. The problem is that impedance is not just about resistance but also about induction. And capacitive couplings can occur better with the warmer, more conductive 'plates' that the oceans present. So you have on the one hand warming oceans, and on the other hand high pressures building around Antarctica preventing surface lows from bringing warmer conditions inside Antarctica. You have more intense capacitive couplings in some places impacting microphysics, and less intense in others, depending on the ocean currents and the induction they hold.

    Comment by Mike Doran — 16 Aug 2005 @ 2:17 AM

  24. Here is another vertical scale issue to ponder. Climate reconstructions based on boreholes and ocean sediments (Moberg) are lower over the past few hundred years than reconstructions based on surface proxies.

    Comment by Eli Rabett — 16 Aug 2005 @ 2:01 PM

  25. Three new articles I have seen today:

    http://www.nature.com/news/2005/050808/full/050808-13.html

    http://www.nytimes.com/2005/08/12/science/earth/12climate.long.html?ex=1281499200&en=2588a631b8c5cc5d&ei=5090&partner=rssuserland&emc=rss

    http://www.sciencedaily.com/releases/2005/08/050814164823.htm

    Comment by Stephen Berg — 16 Aug 2005 @ 11:21 PM

  26. I forgot to enclose the link to the Grist Magazine blurb on the articles:

    http://www.grist.org/news/daily/2005/08/12/3/index.html

    Comment by Stephen Berg — 16 Aug 2005 @ 11:25 PM

  27. RE #1 & #12 on electric vehicles. From what I’ve heard from people who convert them & read in books, even if the source of electricity is coal or oil, EVs are still 1/4 to 1/3 more efficient (even figuring in batteries & their manufacture) than ICE vehicles. And, of course, if your electricity is wind-generated (which is available in many states for a bit more, or even less, as in Texas), then GHGs for transportation go way way down. Maintenance for EVs is also much easier & cheaper, and less frequent.

    I'm just waiting for plug-in hybrids (with a range of 10 or more miles) to come out; then on 95% of my driving days I can run the car strictly on wind power. People can also convert ICE cars to electric, and I understand it's not too difficult; there are EV clubs around the nation that can help.

    Comment by Lynn Vincentnathan — 18 Aug 2005 @ 1:50 PM

  28. re: #19

    I have some bad experience with the automated way GISS corrects for urban trends and inhomogeneities.

    GISS doesn’t detect jumps, and adds warming trends to rural stations.
    http://home.casema.nl/errenwijlens/co2/homogen.htm
    in particular these two graphs:
    http://home.casema.nl/errenwijlens/co2/debilthomogenisations.gif
    http://home.casema.nl/errenwijlens/co2/ucclehomogenisations.gif

    Comment by Hans Erren — 19 Aug 2005 @ 5:15 PM

  29. Reply to Gavin RE:response to comment 22.

    A model that responds satisfactorily in the short term but fails in the long term is a classic case of a bad model. It is what neural network scientists call over-fitting. It implies that you have matched your model to fit the current circumstances, but because the logic is wrong, it does not work when the environment is changed. I have been looking at the GISS ModelE1 and it seems to me that the radiation emitted by each layer is being calculated using Planck's function for blackbody radiation. Perhaps you could correct me if I am wrong. However, if that is so, please note that in the real world the radiation emitted by each layer originates only from the greenhouse gases, and is not cavity radiation. I believe that this is a hangover from Schwarzschild's equation, and is a problem with all of the GCMs. In other words, the radiation scheme used in all computer models is wrong! Unbelievable but true.

    HTH,

    Cheers, Alastair.

    [Response: Unbelievable and untrue, actually. I suggest you read a relevant text (Houghton "Physics of Atmospheres" is good on this - chapter 4). The Planck function is used, but it is multiplied by a wavelength dependent function so that you only integrate over the approximation to the lines. -gavin]
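
    For reference, the standard form gavin is pointing to is Schwarzschild's equation, in which the Planck function enters weighted by a strongly wavelength-dependent absorption coefficient rather than as pure cavity radiation:

    ```latex
    \frac{dI_\nu}{ds} = \kappa_\nu \,\rho\, \bigl( B_\nu(T) - I_\nu \bigr)
    ```

    Here I_ν is the monochromatic intensity along the path s, ρ the absorber density, κ_ν the absorption coefficient and B_ν(T) the Planck function; at wavelengths where κ_ν is zero a layer neither absorbs nor emits, which is why the emitted spectrum is not blackbody radiation.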

    Comment by Alastair McDonald — 24 Aug 2005 @ 7:58 AM

  30. Has this study considered the effect of oceanic circulation cycles such as the 50-70 year Atlantic Multidecadal Oscillation? A century ago might have been a relatively cool point in the cycle compared to now. The AMO has been studied going back at least 500 years.

    It is a little dangerous to project a trend from two points on a sine wave, especially when the measurements of the two points are subject to “error correction.”

    Comment by Norm Stephens — 26 Aug 2005 @ 9:24 AM

  31. RE the back & forth about models, & that they are only best for short term predictions….it seems to me CC scientists are doing a remarkable job in taking on the whole earth-solar system with its kazillion variables (maybe even some we don’t know about yet) & making models out of it. The fact that the models have so many variables now that fit fairly nicely what real world data we have is a testament to very hard work over many years (also on the part of the folks who develop powerful computers). I’m simply amazed. My economics professor the 1st day of class threw a balsa model airplane, & it flew, then crashed. He said economics provides models for the real world, and they work pretty good, but they are not the real world.

    Don’t burn out. The smoke generated from burn out may be harmful to your health & the world’s health. (Or has this analogy/model already crashed?)

    Comment by Lynn Vincentnathan — 26 Aug 2005 @ 11:58 AM

  32. I think most scientists would now agree that the climate is truly warming; however, the cause is still in doubt. If in the end man is the cause, it will be too late by then to do anything about it. I feel we have no choice but to assume mankind is the fundamental cause and start taking the proper steps to control the problem. I think the first and foremost concern should be the world's growing population. Stop the increase in people and you will stop the warming of the planet to a large degree.

    Comment by extagen extenze — 26 Aug 2005 @ 4:08 PM

  33. Why is it that we worry about temperature in 2100?
    The effects in 2100 are caused by emissions in 2080.
    Everybody in this forum will be dead by then, and also their children.

    Reminds me of the worry for horse manure in the cities at the beginning of the 20th century.

    In 50 years people have other worries, and we don’t need to worry about them.

    Comment by Hans Erren — 27 Aug 2005 @ 5:35 AM

  34. Living for many years at the margin of old logging and more recent logging, on a ridgeline overlooking an alluvial valley where a small city is growing, with similar cities extending to the horizon in the inland coastal-range mountains of CA, I am often reminded of the coolness in the tall forest, contrasting with the heat generated on sunny days where forests are discontinuous because of logging or absent because of urban growth.
    Next study desertification.
    It would be interesting to describe the thermodynamics of treeshade.
    It is because I have lived in this place so long and recognize the patent changes and link to forest condition that I keep returning to this amateur hypothesis.
    Now for the research.

    Comment by JohnLopresti — 27 Aug 2005 @ 7:34 PM

  35. The faith that the UHI effect has been adequately compensated for does not take into account what Pielke Sr. refers to as land use change effects, which have also tended to raise rural readings. Long-term changes in the way sea surface temperatures are measured also tend to introduce warming. In fact, all long-term surface instrument changes are in a direction to introduce warming, as pointed out years ago by Daly. Global averaging of stations has also not been compensated for dropouts, which have reduced the total number of reporting stations dramatically since 1989. If all effects were accounted for, the surface temperature trend would undoubtedly prove to be affected by much more than 0.05 degrees.
    The url referenced in #16 makes several statements that are simply not supportable over the course of a century, and gives figures for the percentages of rural and urban stations as if their ratio were fixed, which it assuredly has not been. It is good that the sat. and sonde readings are being corrected. Now we need a real effort to also correct the surface instrument averages. Murray

    Comment by Murray duffin — 27 Aug 2005 @ 8:29 PM

  36. Re #33: Well, Hans, that explains a lot. Of course your point of view ignores the fact that the effects of excess atmospheric CO2 last for considerably more than 20 years, and assumes that climate “tipping points” (such as we may now be seeing in the Arctic) can’t possibly be a problem for us (“us” being the privileged residents of North America and Western Europe).

    Re #34: In case you don’t know about it, Google Scholar at http://scholar.google.com/ is an excellent on-line resource for this type of research. Many of the studies are subscriber-only, but even in that case at least the abstracts can be seen.

    Comment by Steve Bloom — 27 Aug 2005 @ 8:51 PM

  37. Hmm… it seems that Hans Erren has adopted the French king's strategy: après moi, le déluge. This tends to end badly, cf. 1790. OTOH, we do have the moral issue of leaving a place no worse than we found it, and since I intend to live forever, I guess that makes me even more interested in the issue.

    Comment by Eli Rabett — 27 Aug 2005 @ 9:16 PM

  38. Re #31 from Lynn, where she congratulates the climate modellers on a job well done. I thought her story of the model plane was most apposite. The sciences of economics and earth science share a feature which makes them stand out from most of the other physical sciences. In neither science is it possible to set up experiments to test theories. Thus one can propose hypotheses, and create models, but the only way to test them is to wait and see what happens.

    We know from the ice cores and other paleoclimatic evidence that the climate changed abruptly in the past, and in the not so distant past as well. Yet the current climate models cannot reproduce those events. It is all very well admiring the well-constructed model plane, and the complications of the climate models, but if after a short period of simulating the real thing they crash, one cannot really call them good models. Of course, in this case it is the climate system itself which will crash and not the model, which would still be predicting a smooth transition to a warmer world.

    The climate models use a technique for calculating the greenhouse effect that predates quantum mechanics, and proper peer review. These days science is thought to progress through small steps, each thoroughly checked. That is fine, except when a mistake has been made in the past and not noticed. That is what has happened here, and so the progress in the climate models has now halted. For 15 years the prediction of warming resulting from a doubling of CO2 has varied by 300% from 1.5 to 4.5 K. For 15 years the climate modellers have been claiming it will take them 15 years to get the clouds and aerosols right.

    Of course you may claim that the weather men do produce better forecasts, but that is because they have learnt where the models go wrong, and can adjust their forecast appropriately. The climate models cannot even predict the height of the cloud base correctly, but the weather men know how much to add to the model value depending on the time of day.

    Don’t be fooled by the dulcet tones of the sirens used by the oil and coal industries to lure us onto the rocks. What we need is a return to a world where the hand that rocks the cradle rules the world!

    Comment by Alastair McDonald — 28 Aug 2005 @ 6:38 AM

  39. Re #36

    Indeed the Bern model assumes a saturation of the sinks, whereas observations show an ever-increasing sink. No wonder they calculate 1200 ppm for 2100!

    http://home.casema.nl/errenwijlens/co2/sink.htm

    The observed half life for CO2 in the atmosphere is 55 years. If you want to rely on the SRES models that assume absurd CO2 emissions (A2 and A1FI), fine, but don’t build a policy on it for the next fifty years.

    Comment by Hans Erren — 28 Aug 2005 @ 11:32 AM

  40. Hi Hans (#39),

    I guess you disagree with what D Archer posted on RealClimate a while ago (http://www.realclimate.org/index.php?p=134):

    “When you release a slug of new CO2 into the atmosphere, dissolution in the ocean gets rid of about three quarters of it, more or less, depending on how much is released. The rest has to await neutralization by reaction with CaCO3 or igneous rocks on land and in the ocean [2-6]. These rock reactions also restore the pH of the ocean from the CO2 acid spike. My model indicates that about 7% of carbon released today will still be in the atmosphere in 100,000 years [7]. I calculate a mean lifetime, from the sum of all the processes, of about 30,000 years. That’s a deceptive number, because it is so strongly influenced by the immense longevity of that long tail. If one is forced to simplify reality into a single number for popular discussion, several hundred years is a sensible number to choose, because it tells three-quarters of the story, and the part of the story which applies to our own lifetimes.

    However, the long tail is a lot of baby to throw out in the name of bath-time simplicity. Major ice sheets, in particular in Greenland [8], ocean methane clathrate deposits [9], and future evolution of glacial/interglacial cycles [10] might be affected by that long tail. A better shorthand for public discussion might be that CO2 sticks around for hundreds of years, plus 25% that sticks around forever.”

    Questions: 1) From where do you get 55 years and do you believe that estimate to be more defensible? 2) Why do you not agree with theory that saturation should increase in the future? 3) What’s going on in 1998 for the link you posted (thanks for that) — is it that the warm ocean surface in that year would not absorb much CO2?

    And a note for Eli (#37): I also plan to live forever — so far so good!

    Comment by Steve Latham — 28 Aug 2005 @ 5:22 PM

  41. Re #39:

    So, how do you get from “The observed half life for CO2 in the atmosphere is 55 years” to “Why is it that we worry about temperature in 2100? The effects in 2100 are caused by emissions in 2080.” ? Even if you accept the 55yr time constant, this is clearly wrong, and gets worse when you consider the thermal lags in the system.

    Re #40:

    Evidently the source of the 55yr estimate is the loony Dietze model. There's an email dialog on Dietze's web site between him and some real carbon cycle modelers (Goudriaan and Joos, for example). It reads like the Monty Python dead parrot routine – Dietze is simply ineducable. It's hilarious how he makes complex assertions about problems with the representation of the vertical mixing structure etc. in other models, based on what appears to be a 1st-order box model.

    I didn’t succeed in finding a definitive set of Dietze’s actual dynamic equations for atmospheric CO2 on the Daly web site; two different models seem to be implied. But the origin of the 55yrs appears to be a single, static, linear calculation of the time constant via Little’s Law: The Lifetime of CO2. That’s a pretty cavalier attitude to fitting the data especially given that much richer information is available, e.g. bomb isotopes, which the model would fail to fit. Even if you accept the assertion of linearity, the maximum likelihood estimate of the time constant for a 1st order model is much longer; I’ve tried it (as did Nordhaus, who arrived at 120 years using OLS, which isn’t really right, but leaves 55yrs way out in the cold).

    Comment by Tom Fiddaman — 29 Aug 2005 @ 1:58 PM

  42. Re #40 and #41 Long input:
    From http://cdiac.esd.ornl.gov/pns/faq.html
    snip Q. How long does it take for the oceans and terrestrial biosphere to take up carbon after it is burned?
    A. For a single molecule of CO2 released from the burning of a pound of carbon, say from burning coal, the time required is 3-4 years. This estimate is based on the carbon mass in the atmosphere and uptake rates for the oceans and terrestrial biosphere. The atmospheric lifetime of a large pulse of CO2 has been estimated by models to be 50-200 years (i.e., the time required for a large injection to be completely damped out of the atmosphere). Snip
    This range seems to be an actual range depending on time frame, rather than the uncertainty among models. [See below].

    See http://cdiac.esd.ornl.gov/ftp/ndp030/global.1751_2002.ems
    For the 5 decades from 1953 through 2003 we have now had 4, 3, 2, 1, and 0 half-lives respectively, using an average half-life of 11 years (based on real 14C measurement). We get a total remaining injection in 2004 from the prior 5 decades of 139 Gt, which equates to an increase in atmospheric concentration of 66 ppm. The actual increase from 1954 to 2004 was very near 63 ppm. This result lends some credibility to the 50-year atmospheric residence time estimate above. A 200-year residence time gives an 81 ppm delta since 1954, which is much too high.
    Surprisingly, if we go all the way back to 1750 and compute the residence time using fuel emissions only, we get a value very close to 200 years (a 40-year half-life gives a ppm delta of 99 vs. an actual of 96, using 280 ppm as the correct value in 1750). If we assume that terrestrial uptake closely matches land use emissions (this is essentially the IPCC assumption), and we know that the airborne fraction from 1964 through 2003 had a weighted average of 58%, then to shift to a long-term 40-year half-life from a near-term 11-year half-life we would have to have prior 40-year-period weighted average airborne fractions like 80% for '24-'63, and near 90% before that. As the airborne fraction has been steadily dropping, this may be realistic. Since emissions in the last 40 years have been 3 times higher than in the period from 1924 to 1963, and 30 times higher than 1844 to 1883, it is not too hard to believe that the rapid growth in atmospheric partial pressure has forced such a change in airborne fraction.

    From Archer, I think chapter 9: snip
    We expect that added CO2 will partition itself between the atmosphere and the ocean in proportion to the sizes of the reservoirs, and in the ocean we expect that size to be the buffering capacity. The relative sizes of the preanthropogenic atmosphere and the atmosphere plus ocean buffer are proportioned 560:(560+2500), equals ~18%. This crudely predicted atmospheric fraction is comparable to the model atmospheric fraction after 1000 years, which ranges from 14-30%, depending on the size of the fossil fuel release. Snip

    And also: snip
    The bottom line is that about 15-30% of the CO2 released by burning fossil fuel will still be in the atmosphere in 1000 years. snip

    I have been trying to figure out what this meant, apart from the obvious errors. The errors are:
    a) the surface ocean buffer circa 1994 is given by the IPCC as 1020 Gt, which would give a preanthropogenic value of 900 Gt, not 2500;
    b) the value of the ratio is then 38%, not 18%;
    c) these values are inventories or stocks, not reservoirs. The reservoirs are vastly larger. I'll admit this last one is a quibble. I know what he meant — I think.

    Probably the partitioning he wanted is among the atmosphere, the terrestrial "reservoir" and the surface ocean buffer, which would be 560/(560+900+2190) = 15%, which is still just within his range of 14-30%.

    The question is, "What does this mean?" Consider that we emit a pulse of 230 Gt from 1954 through 2003, of which about 130 Gt is retained in the atmosphere. Then we stop C emissions. We will get back to our 1954 starting level of about 320 ppm in between 50 and 200 years, but probably closer to 50 years. The pulse has then disappeared.

    However, what if all of the molecules of our pulse are colored green, like tennis balls, while the original 320 ppm are colored white, and, like tennis balls, the color has no effect at all on their behaviour? Then, when they have partitioned themselves according to the original distribution, we will still have 15% of the green molecules in the atmosphere, and these will only disappear over the longer time that it takes for mixing with the deep ocean and permanent uptake in the terrestrial sink, possibly more than 1000 years. That for sure gives a long residence time for the green molecules, but it doesn't lengthen the residence time of the "pulse". I can't think of any other explanation.

    [Ad homs deleted - gavin]

    Murray

    Comment by Murray duffin — 29 Aug 2005 @ 2:43 PM

  43. This is a bit off-topic, but I just read about a new model for the end-Permian extinction at:

    http://www.ens-newswire.com/ens/aug2005/2005-08-29-01.asp

    I'm interested in this, even though I understand we are not very likely to reach a tipping point in this century which might lead to such a runaway GW scenario; still, it motivates me all the more to reduce my GHGs.

    And I think #39 is a bit flippant (if I understand him correctly) about not being concerned about the future. I’m even concerned about that CO2 tail 100,000 years from now. The idea that my GHG emissions of today might be harming people & other life forms even far into the future is as disturbing to me as harming people living today.

    Comment by Lynn Vincentnathan — 29 Aug 2005 @ 2:51 PM

  44. My take from the observations is that each year 1.6% of the excess CO2 over 280 ppm (the equilibrium) is absorbed, predominantly as straightforward diffusion. Increasing CO2 in the atmosphere therefore means an increasing sink. There is no distinction between "old" and "new" CO2, just atmospheric concentration. Compare it to a leaky bicycle tyre: the higher the pressure, the faster the flow.

    Yes, the flow out of the atmosphere is modulated by temperature, which is similar to pinching the leak, gives a beautiful graph btw:
    http://home.casema.nl/errenwijlens/co2/co2lt_en.gif
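
    For concreteness, here is that 1.6%-per-year claim written out as a toy first-order model — purely an illustration of the 'leaky tyre' arithmetic, not an endorsement of it. The emissions path is a made-up exponential, and 2.13 GtC per ppm is the standard mass-to-concentration conversion:

    ```python
    # Toy first-order carbon model: dC/dt = E/2.13 - k*(C - 280),
    # with C in ppm, E in GtC/yr and k = 0.016/yr (the 1.6%/yr claimed above).
    GTC_PER_PPM = 2.13
    K = 0.016

    def step(C, E, dt=1.0):
        """Advance atmospheric CO2 (ppm) by one year."""
        return C + dt * (E / GTC_PER_PPM - K * (C - 280.0))

    C = 315.0                              # roughly the late-1950s concentration
    for year in range(1958, 2006):
        E = 2.5 * 1.025 ** (year - 1958)   # assumed emissions path, GtC/yr
        C = step(C, E)
    print(f"modelled 2005 concentration: {C:.0f} ppm")  # compare: observed 2005 was ~380 ppm
    ```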

    Comment by Hans Erren — 29 Aug 2005 @ 5:40 PM

  45. I agree with #44 re old and new completely, but it seems Archer makes the distinction, as I tried to illustrate. Murray

    Comment by Murray duffin — 29 Aug 2005 @ 10:54 PM

  46. Re #42:

    The point I should have made in #41 is that back-of-the-envelope calculations that imply 1st-order models of the type dCO2/dt = a*E - (CO2 - CO2(0))/tau are not well constrained by the Mauna Loa atmospheric CO2 data. You can have any tau you want between at least 40 and 400 years and still get a good fit by varying a. It's even worse if you cherry-pick tau by making calculations with a favorable year's flux and stock imbalance (as Dietze does) or ignore all time series information by making calculations aggregated over a century.
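
    As a quick way to see that degeneracy, here is a synthetic-data sketch (all numbers are stylized assumptions, not fitted values): one first-order model plays the role of the 'observed' record, and models with very different tau are then fitted by grid-searching the airborne parameter a alone:

    ```python
    # First-order models dC/dt = a*E/2.13 - (C - 280)/tau, with emissions growing
    # exponentially from 1850, fitted to the same synthetic "observed" curve.

    def run(a, tau, years=156, E0=0.1, g=0.03):
        C, path = 280.0, []
        for t in range(years):
            E = E0 * (1 + g) ** t              # stylized emissions, GtC/yr
            C += a * E / 2.13 - (C - 280.0) / tau
            path.append(C)
        return path[108:]                      # keep only the "Mauna Loa window"

    truth = run(a=1.0, tau=120.0)              # pretend this is the observed record

    def rmse(xs, ys):
        return (sum((x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)) ** 0.5

    for tau in (40.0, 120.0, 400.0):
        err, a = min((rmse(run(a100 / 100.0, tau), truth), a100 / 100.0)
                     for a100 in range(50, 201))
        print(f"tau = {tau:5.0f} yr -> best-fit a = {a:.2f}, RMSE = {err:.2f} ppm")
    # Every tau fits the "observations" closely; tau is effectively unidentifiable.
    ```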

    A real carbon cycle model needs to conform to physical laws (e.g. carbon chemistry in the ocean, conservation of carbon), be robust in extreme conditions, and fit all available data (e.g. isotopes & ice cores). Linear box models, while conceptually useful, fail several of those tests right away. The skeptic models above don’t even meet the minimal requirement of writing down all your equations in one place in a standard format and demonstrating how they fit to data with appropriate statistics.

    What the skeptics have really done so far is use simple models to observe the “missing sink,” which other carbon cycle modelers discovered and named years ago (and more or less attributed to NH terrestrial biosphere). The skeptics’ original contribution is to attribute the missing sink to ocean uptake, which unfortunately violates what’s known about the ocean carbon budget. If they want to be taken seriously, the burden is on them to get some data beyond the Mauna Loa CO2 record that supports their position, and address the data that refutes it. To refute Bern etc., they need to find an actual problem, either conceptual or fit to data; constructing an alternate hypothesis with arbitrary parameterization and limited data isn’t sufficient.

    Also, to pass the snicker test, skeptics (particularly Dietze) need to give up the pretense that linear impulse response is the be-all-end-all and stop making silly assertions about Bern and other models that confuse model structure with rough characterizations of behavior for purposes of discourse. A good start would be to actually replicate some of the classic models (e.g. Oeschger) in transparent simulation software, and then develop, share, and preferably publish credible alternatives meeting the tests above.

    Comment by Tom Fiddaman — 30 Aug 2005 @ 11:45 AM

  47. RE #44 & #45 on old and new CO2: I hope you're not making the contrarian argument that whatever GHGs humans emit are absorbed into nature, and it is only nature's GHGs that are up there in the atmosphere, or that somehow human emissions are absorbed first and nature's emissions last. So blame it on nature.

    If so, I think we have to look at the marginal effect, or what would be the concentration of GHGs in the atmosphere, if humans had not started emitting so much over the past 150 years, then compare that with the situation today. And then figure the overall effects on the world of the “before” situation & compare with what is & will happen “extra” with the human emissions. As I’ve mentioned before it is the last few inches of flood, or dryness of drought, or degree of heat, or intensity of storm that does much more damage than the first few increments. Of course, if everything is destroyed & all people killed in a community, then you get a flat line, while the storm, etc. might be raging more & more fiercely.

    Comment by Lynn Vincentnathan — 30 Aug 2005 @ 11:46 AM

  48. Re: #47,

    “I hope you’re not making the contrarian argument that whatever GHGs humans emit are aborbed into nature, and it is only nature’s GHGs that are up there in the atmosphere, or that somehow human emissions are absorbed first, and nature’s emissions last. So blame it on nature.”

    Here's an article which indicates that soon nature will not be a net absorber (if it even is one at the moment), but will be a net emitter of GHGs.

    “Warming hits ‘tipping point’”:

    http://www.guardian.co.uk/climatechange/story/0,12374,1546824,00.html

    Comment by Stephen Berg — 31 Aug 2005 @ 2:21 PM

  49. re: #48, now why didn't this runaway happen in the Holocene climate optimum, or the Eemian interglacial, when temperatures were significantly higher?

    Comment by Hans Erren — 31 Aug 2005 @ 4:47 PM

  50. re #48:
    No, absorption doesn't distinguish between sources: 1.6% of all excess CO2 in the atmosphere is absorbed annually.

    And I just don't like to extrapolate uncertainties, something I learned in exploration geophysics the hard way. IMHO all climate "forecasts" should limit themselves to 40 years as an absolute maximum.

    [Response:plug removed - WMC]

    Comment by Hans Erren — 31 Aug 2005 @ 4:56 PM

  51. RE #50, a 40 year forecast may be all right for scientists, but for laypersons living in the world and concerned about it (even into the far distant future, as it relates to today’s activities), we want much more & are willing to sacrifice the sacred cow of .05 significance. This makes for a creative tension between scientists (who are doing their best) and concerned laypersons (who want more information — and quite frankly re GW, we have more than enough info now on which to base our actions).

    There should not, however, be this additional tension between scientists and selfishly motivated contrarians, who care little for life on earth beyond their own lives (&, it seems, don't even want to reap savings & other benefits from efficiency and conservation). The .05 significance level is caution enough to protect scientific reputations and industrial profits. Protecting life (even into the distant future), I think, is a higher, more noble goal & should come before a life of extreme inefficiencies, excesses, and thoughtlessness that entails harm to others and oneself.

    From what I’ve seen, the “climateaudit.org” site is a contrarian site (not merely a science skeptic site) which falls short of bonafide science. So I would suggest you stick to this webpage & avoid that one.

    Comment by Lynn Vincentnathan — 1 Sep 2005 @ 11:53 AM

  52. Sydney, Australia, has just had its warmest winter on record, with the average maximum temperature 1.6 C above the long-term average and the average mean temperature about 1 C above. The last 5 years all rank among the top 6 warmest years (one place shared with 1988). What are the chances of that happening purely by chance or by some natural cycle (as opposed to GW)? Is there any way to put a figure on these? There may be some heat island effect, but the figures were similar at a number of quite different sites across a big city.
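
    One crude way to put a number on the last-5-in-the-top-6 question is to ask how likely it would be if every ordering of the record were equally probable. The ~147-year record length is an assumption for illustration, and the no-trend, no-persistence null is deliberately naive (real year-to-year persistence would make the event far less surprising), but it gives a baseline:

    ```python
    from math import comb  # Python 3.8+

    def p_recent_all_top(n_years, n_recent, top_k):
        """P(the n_recent most recent years all rank inside the top_k warmest)
        when every ordering of the record is equally likely."""
        return comb(top_k, n_recent) / comb(n_years, n_recent)

    # e.g. an assumed ~147-year record, last 5 years all inside the warmest 6:
    print(p_recent_all_top(147, 5, 6))   # about 1e-8 under this naive null
    ```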

    Also it has been very dry and the question is, is the drought a result of the rising temperatures or the temperatures the result (partially at least) of the drought?

    Comment by Bob Moore — 2 Sep 2005 @ 4:11 AM

  53. Re #46: I am not a modeller, so I can only comment on the available data. Again, assuming terrestrial uptake closely matches land use emissions, and knowing that as anthropogenic emissions rise the atmospheric fraction declines, the only place for the missing C to go is ocean uptake. The present ocean uptake is then 2x the rate presented by the IPCC in their illustration of C sources and sinks, and the ocean sink is increasing as noted by Hans Erren, regardless of what the ocean "budget" is assumed to be. I suspect there are major unknowns relative to ocean uptake of C. Murray

    Comment by Murray duffin — 2 Sep 2005 @ 7:44 PM

  54. Further on #46:
    I have tried for a day now to find consistently conclusive agreement on isotopic reconstructions, and failed, so I doubt that a model can be made to "fit all available data". I have also reviewed my past collection on ice cores. Law Dome takes about 68 years to close, Siple about 105 years, and Vostok from 4000 years in warm periods to 6000 years in cold periods. Therefore any record is effectively a 70-year moving average for Law Dome, 100+ years for Siple, and 4000-6000 years for Vostok.

    Siple shows CO2 concentrations of 390 ppm at about 140k yrs bp, at about Termination II. This is a very inconvenient number, and rather than being accepted as real it is explained as evidence that CO2 has been imported from somewhere. Vostok shows 290 ppm for the same period, but with 4000 years of smearing, a peak to 390 ppm lasting a few hundred years would disappear. Quite apart from probable losses of pressurized CO2 during coring, storage and preparation for analysis, this data shows us that ice cores are not a completely reliable proxy. Numerous studies show us that tree rings are not even a useful proxy for historic temperature, so the whole issue of building models that "fit all the data" is rather fraught.

    The data we do have clearly show that the models used for C modelling in the TAR are already very wrong. Have you tried comparing model projections from ca 1999/2000 of surface temperature for the early part of the 21st century with the published record of the last 7 years? The only ones I can find published on the web provide another "snicker test". Murray

    Comment by Murray duffin — 4 Sep 2005 @ 12:36 AM

  55. Re #53 & #54

    “The data we do have clearly show that the models used for C modelling in the TAR are already very wrong” I keep seeing this kind of statement with nothing to back it up. The logic seems to run, “IPCC carbon cycle models predict sink saturation, but a linear trend fit to Mauna Loa data doesn’t show it, therefore IPCC models are gravely wrong.” There are problems with this logic.

    First, no one actually demonstrates that the IPCC models don’t fit the Mauna Loa data; in fact, graphs on the Daly site show no divergence in model behavior through the present. I don’t know where definitive TAR Bern output is, but archived output from 1990 here fits the data pretty well. Certainly low-order box models with sink saturation and time constants >100 yrs fit the Mauna Loa data extremely well. That sink saturation is not evident in Mauna Loa data is not surprising, given that it doesn’t become important in the nonlinear models until later in the century.

    Second, the observations of the airborne fraction are conditioned on “assuming terrestrial uptake closely matches land use emissions,” which is not necessarily a good assumption given the large error bounds (>50%) on land use emissions data.

    Third, “knowing that as anthropogenic emissions rise the atmospheric fraction declines” is a statistically weak observation given the short Mauna Loa record and the high emissions uncertainty. More importantly, it neglects the stock-flow distinction between emissions and atmospheric concentration and thus has no causal meaning. Consider the reverse – emissions fall tomorrow to 1 ton/year; this statement would lead us to expect a rise in the atmospheric fraction, when in fact it would become a large negative number. Uptake is a function of atmospheric and sink concentrations, not emissions.
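
    To make the stock-flow point concrete, here is a minimal numeric sketch in Python (the flux numbers are purely illustrative, not taken from any dataset):

        # Airborne fraction = (change in atmospheric CO2 stock) / emissions.
        # Uptake is driven by the accumulated stock, not by current
        # emissions, so the ratio loses meaning if emissions collapse.
        emissions = 7.0                    # GtC/yr, illustrative
        ocean_uptake = 3.0                 # GtC/yr, set by the existing excess
        delta_atm = emissions - ocean_uptake
        print(delta_atm / emissions)       # ~0.57, a familiar-looking fraction

        emissions = 1e-9                   # emissions fall to ~1 ton/yr
        delta_atm = emissions - ocean_uptake   # uptake continues regardless
        print(delta_atm / emissions)       # a huge negative number, as argued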

    Fourth, the comments above are rather loose with terminology about sink flows and the airborne fraction. Sink saturation doesn’t imply decreasing sink flows. Increasing absolute sink flows, reflecting a greater gap between atmospheric and oceanic CO2, don’t refute the notion of saturation. A constant or diminishing airborne fraction also doesn’t necessarily reflect rising sink potential; more likely it reflects the deceleration in emissions since the 70s.

    My comment about isotope ratios pertained to use of bomb C14 and other recent stuff (CFCs) for measuring ocean vertical structure and C inventories, not the very long term records. It’s absurd to suggest that “the ocean sink is increasing as noted by Hans Erren, regardless of what the ocean ‘budget’ is assumed to be.” The “budget” refers to measurements with fairly tight confidence bounds compared to the biosphere, not some fanciful target (see The Oceanic Sink for Anthropogenic CO2). And nothing about Hans’ data says anything about where the carbon is going.

    “the whole issue of building models that ‘fit all the data’ is rather fraught” Instead let’s throw out all the inconvenient data and just work with things that have nice linear trends? Setting temperature arguments aside, since I was talking about carbon: we should be building models that fit all available data in a broad sense, including C12/C13 ratios, oxygen, spatial patterns, known features of ocean chemistry (the Revelle factor), etc. Bern and other models do this; skeptic models don’t even address the issues.

    “Have you tried comparing model projections from ca 1999/2000 of surface temperature for the early part of the 21st century with the published record of the last 7 years? The only ones I can find published on the web provide another ‘snicker test’” I’d love to see a link for this.

    Comment by Tom Fiddaman — 5 Sep 2005 @ 4:04 PM

  56. “A constant or diminishing airborne fraction also doesn’t necessarily reflect rising sink potential; more likely it reflects the deceleration in emissions since the 70s.”
    Decade                                        1        2        3        4        5
    Years                                     '54-'63  '64-'73  '74-'83  '84-'93  '94-'03
    Ave. annual fuel emissions (Gt/yr)            2.4      3.4      5.0      6.0      6.7
    Percent change decade to decade                --       42       47       20       12
    Ave. annual atmos. conc'n delta (ppm/yr)      0.8      1.1      1.4      1.5      1.8
    Conc'n delta per Gt emission (ppb)            333      324      280      250      270
    Implied atmospheric retention (Gt)            1.7      2.3      2.9      3.1      3.7
    Airborne fraction (%)                          71       68       58       52       55
    Ocean uptake from fuel (Gt)                   0.7      1.1      2.1      2.9      3.0
    Deforestation factor (guesstimate, x)        1.03     1.06     1.09     1.12     1.15
    Total emissions (Gt)                          2.5      3.6      5.5      6.7      7.7
    Airborne fraction of total (%)                 68       64       53       46       48
    Ocean uptake total (Gt)                       0.8      1.3      2.6      3.6      4.0
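
    The table’s derived rows can be reproduced in a few lines (a sketch; it assumes the standard conversion of roughly 2.13 GtC per ppm of atmospheric CO2, which is not stated in the comment itself):

        # Recompute the table's derived rows from its input rows.
        GTC_PER_PPM = 2.13   # assumed standard conversion, GtC per ppm CO2

        fuel = [2.4, 3.4, 5.0, 6.0, 6.7]          # Gt/yr, decade averages
        dconc = [0.8, 1.1, 1.4, 1.5, 1.8]         # ppm/yr, decade averages
        defor = [1.03, 1.06, 1.09, 1.12, 1.15]    # guesstimated multipliers

        for e, d, f in zip(fuel, dconc, defor):
            retained = d * GTC_PER_PPM            # GtC/yr staying in the air
            print(f"retained={retained:.1f}  "
                  f"AF(fuel)={100 * retained / e:.0f}%  "
                  f"AF(total)={100 * retained / (e * f):.0f}%")
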
    What do you mean by “a deceleration in emissions”? Is this a deceleration in the % rate of increase? What would that prove?
    http://www.aoml.noaa.gov/ocd/gcc/co2research
    The key quote from this URL is “The global oceanic CO2 uptake using different wind speed/gas transfer velocity parameterizations differs by a factor of three (Table 1)”, i.e. a 3:1 range depending on model assumptions.
    http://www.hamburger-bildungsserver.de/welcome.phtml?unten=/klima/klimawandel/treibhausgase/carbondioxid/surfaceocean.html
    Here we find a nice description of atmosphere/ocean interchange mechanisms, with a diagram and values like the IPCC equivalent, and with the major fault that it gives the impression that the exchange magnitudes are well known. While this was published sometime after 2001, the net ocean uptake from the atmosphere shown would be roughly correct for about the mid ’70s, and has since well more than doubled (see table above) despite surface warming. This would suggest that a near-surface increase in ocean carbon concentration considerably upsets the exchange between the surface and deeper ocean waters. It seems possible that carbon fertilization plus warming considerably accelerate growth of ocean biota. The IPCC downplay this possibility, but do not outright deny it, which suggests a fairly high degree of probability to me.

    “Second, the observations of the airborne fraction are conditioned on ‘assuming terrestrial uptake closely matches land use emissions,’ which is not necessarily a good assumption given the large error bounds (>50%) on land use emissions data”
    It’s the IPCC assumption.

    Third, “knowing that as anthropogenic emissions rise the atmospheric fraction declines” is a statistically weak observation given the short Mauna Loa record and the high emissions uncertainty.
    See the table above. First I’ve heard about “emissions uncertainty”. I thought the AGW folks were quite certain.

    “The ‘budget’ refers to measurements with fairly tight confidence bounds compared to the biosphere, not some fanciful target (see The Oceanic Sink for Anthropogenic CO2)”
    I don’t have a subscription for your URL. However, from the IPCC TAR we read: “In principle, there is sufficient uptake capacity (see Box 3.3) in the ocean to incorporate 70 to 80% of anthropogenic CO2 emissions to the atmosphere, even when total emissions of up to 4,500 PgC (4500 Gt) are considered” (Archer et al., 1997). That’s a 3400 Gt sink capacity, and we are talking about sinking less than another 1000 Gt at a rate of about 4 Gt/yr peak, for a very few years at peak rate. However, the 3400 Gt additional capacity, which would add less than 10% to the ocean inventory, seems like a very low value for 3 reasons. First, the equilibrium concentration is more than 3x the present concentration. Second, atmospheric concentrations were at least 5 times higher 100 million years ago, so seawater concentrations can be that much higher also. Third, experiments to test CO2 clathrate hydrate formation show formation at dissolved CO2 concentrations two orders of magnitude higher than the present concentration.
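
    As a quick check on that arithmetic (a sketch; the roughly 38,000 GtC ocean carbon inventory is a commonly cited figure supplied here, not part of the TAR quote):

        # Check the paragraph's arithmetic from the quoted TAR figures.
        total_emissions = 4500      # GtC, upper case considered by Archer et al.
        uptake_share = 0.75         # midpoint of the quoted 70-80%
        capacity = total_emissions * uptake_share
        print(capacity)             # ~3375 GtC, i.e. the "3400 Gt" above

        ocean_inventory = 38000     # GtC, approximate dissolved carbon stock
        print(capacity / ocean_inventory)   # ~0.09, the "less than 10%" claim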

    “Sink saturation doesn’t imply decreasing sink flows.” I don’t understand this assertion. What does saturation imply?

    “we should be building models that fit all available data in a broad sense, including C12/C13 ratios, oxygen, spatial patterns, known features of ocean chemistry (Revelle factor), etc”
    Just to provide one example of the uncertainties, consider the IPCC contention of slow mixing due to the thermocline and see Fig. 4 at: http://www.aip.org/pt/vol-55/iss-8/captions/p30cap4.html
    The first thing to note about Fig. 4 is that there is no evidence at all of a thermocline barrier at near 200 m depth. At 30 degrees S in the Pacific the 50 umol/kg concentration extends beyond 400 m, and at about 20 degrees N in the N Pacific the 40 umol/kg concentration gets to 400 m. The mid-latitude Pacific is relatively warm, has relatively low salinity, and can therefore be expected to have relatively low total CO2 concentration. Forty umol/kg would be about 2% anthropogenic CO2. The surface share of anthropogenic CO2 is about 2.5% in this region. Even though this is the zone that should have the strongest permanent thermocline, the anthropogenic concentration is well mixed way below the expected thermocline depth. In the colder and saltier N Atlantic, in the region which should at least have seasonal thermoclines (30 to 60 degrees N), we find the anthropogenic share at 1.7% (65% of the surface share) at a depth of 1200 m.
    We didn’t get to an ocean uptake equal to 10% of the last decade’s until about 1900, and yet we find the anthropogenic share equal to 10% of the surface share at a depth of >5000 m in the N. Atlantic.

    “I’d love to see a link for this.”
    Sorry, I can’t find the specific URL again. It was a nice curve of a model temperature projection with the x-axis time scale stretched all the way across the screen, so one could pick out temperature changes vs. time very well. However, eyeballing 3 different model projections we find very near 0.3 degrees C increase from 1997 to 2005, when the actual is about 0.03 degrees C. Snicker. Murray

    Comment by Murray duffin — 7 Sep 2005 @ 11:41 PM

  57. What do you mean by “a deceleration in emissions”? Is this a deceleration in the % rate of increase? What would that prove?
    Yes, a decline in the rate of increase, as in your table. If you drive a first-order system dCO2/dt = a*E - (CO2 - CO2(0))/tau with growing emissions, then slow or reverse the growth rate, the observed airborne fraction declines because you’ve changed the relationship between E and (CO2 - CO2(0)), which would otherwise be constant along a steady-state growth path. I think that’s a general behavior that remains true even if you properly model the system as higher-order and nonlinear.
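
    A minimal simulation of that first-order system (a sketch; the parameter values and emissions paths are invented solely to show the qualitative behavior, not calibrated to anything):

        # First-order sketch of dCO2/dt = a*E - (CO2 - CO2(0))/tau.
        # Parameter values are illustrative, not calibrated.
        a, tau, co2_0 = 0.5, 120.0, 280.0   # ppm/GtC, years, baseline ppm

        def airborne_fraction(growth_rate, years=100, e0=5.0, dt=1.0):
            """Run with exponentially growing emissions and report the
            apparent airborne fraction in the final year."""
            co2, e = co2_0, e0
            for _ in range(int(years / dt)):
                d_co2 = a * e - (co2 - co2_0) / tau
                co2 += d_co2 * dt
                e *= (1.0 + growth_rate) ** dt
            return d_co2 / (a * e)   # share of current emissions staying airborne

        print(airborne_fraction(0.04))  # fast emissions growth -> high fraction
        print(airborne_fraction(0.01))  # slower growth -> lower fraction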

    http://www.aoml.noaa.gov/ocd/gcc/co2research
    The key quote from this URL is “The global oceanic CO2 uptake using different wind speed/gas transfer velocity parameterizations differs by a factor of three …

    Based on a quick look, that level of uncertainty strikes me as a consequence of the openness of the model. As soon as you require closure of the global budget and constrain it with other measurements, the bounds have to close – either you get a good estimate of the flux, or you discover something else that has to covary negatively to offset the large uncertainty. The Sabine et al. 1800-1994 ocean budget measurement of 118 ± 19 GtC is not 3x uncertain.

    The IPCC downplay this possibility, but do not outright deny it, which suggests a fairly high degree of probability to me.
    Would outright denial suggest certainty? :)

    “Second, the observations of the airborne fraction are conditioned on ‘assuming terrestrial uptake closely matches land use emissions,’ which is not necessarily a good assumption given the large error bounds (>50%) on land use emissions data”
    It’s the IPCC assumption.

    Roughly constant airborne fraction may be an IPCC observation. The IPCC may also assume the terrestrial/land-use match in constructing emissions trajectories, but the models are used to test that idea, not to blindly instantiate it. Neither the IPCC models (Bern/HILDA/ISAM) nor their predecessors (Oeschger, Goudriaan & Kettner, Siegenthaler, etc.) assume anything about the airborne fraction, because it’s not a physical parameter of the models; it emerges from the interaction of other features. The skeptics on the Daly site, by contrast, treat it with the reverence usually reserved for Planck’s constant or pi.

    “The ‘budget’ refers to measurements with fairly tight confidence bounds compared to the biosphere, not some fanciful target … From the IPCC TAR we read: ‘In principle, there is sufficient uptake capacity (see Box 3.3) in the ocean to incorporate 70 to 80% of anthropogenic CO2 emissions to the atmosphere…’”
    True, but irrelevant, because the time constant is so long. “Over the long term (millennial time scales) the ocean has the potential to absorb as much as 85% of the anthropogenic CO2 that is released into the atmosphere” (Feely et al.)

    Second, atmospheric concentrations were at least 5 times higher 100 million years ago, so seawater concentrations can be that much higher also.
    I don’t think we want to go there!

    “Sink saturation doesn’t imply decreasing sink flows ” I don’t understand this assertion. what does saturation imply?
    Decreasing marginal uptake, i.e. as concentration grows, the effective time constant for storage lengthens, e.g. due to buffer chemistry (as in Feely, cited above) or to dissolution of biota at depth from acidification.
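
    To illustrate how saturation and growing absolute sink flows coexist, a sketch with an invented concave uptake curve (a stand-in only, not real carbonate chemistry):

        import math

        def uptake(excess_ppm):
            # Concave response: total flux keeps rising with the atmospheric
            # excess, but each equal step of excess buys less extra uptake.
            return 4.0 * math.log1p(excess_ppm / 50.0)   # GtC/yr, illustrative

        prev = 0.0
        for excess in (40, 80, 120, 160):
            flux = uptake(excess)
            print(f"excess={excess:3d} ppm: flux={flux:.2f} GtC/yr, "
                  f"gain over previous 40 ppm step: {flux - prev:+.2f}")
            prev = flux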

    Just to provide one example of the uncertainties, consider the IPCC contention of slow mixing due to the thermocline and see: http://www.aip.org/pt/vol-55/iss-8/captions/p30cap4.html See fig. 4
    The first thing to note about Fig. 4 is that there is no evidence at all of a thermocline barrier at near 200 m depth. … We didn’t get to an ocean uptake equal to 10% of the last decade until about 1900, and yet we find the anthropogenic share equal to 10% of the surface share at a depth of >5000 m in the N. Atlantic.

    Again, though, a critique of the IPCC wording is not as good as a critique of the models. The wording often abstracts to broad features that aren’t necessarily reflected in model structure, so identifying sub-thermocline C doesn’t overturn the models. For a good discussion see Cao & Jain.

    It was a nice curve of a model temperature projection with the x axis time scale stretched all the way across the screen so one could pick out temp. changes vs time very well. However eyeballing 3 different model projections we find very near 0.3 degrees C increase from 1997 to 2005, when the actual is about 0.03 degrees C. Snicker.
    A short time horizon like that seems like weather, not climate, to me. Over that time scale the correct question would be whether temperature remained within the envelope of natural variability around the mean of an ensemble of GCM runs. Seems premature to snicker.
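
    For what the envelope comparison might look like in practice, a toy sketch (a synthetic “ensemble” of trend-plus-noise runs with invented variability; nothing here is GCM output):

        # Toy check: is an 8-year observed change inside the spread of an
        # ensemble of trend + interannual noise runs? Numbers are invented.
        import random

        random.seed(0)
        TREND = 0.02      # degC/yr, illustrative forced trend
        SIGMA = 0.12      # degC, illustrative interannual variability
        YEARS = 8         # 1997-2005 in the comment above

        changes = []
        for _ in range(1000):   # 1000 synthetic "runs"
            series = [TREND * t + random.gauss(0, SIGMA) for t in range(YEARS + 1)]
            changes.append(series[-1] - series[0])

        changes.sort()
        lo, hi = changes[25], changes[-26]   # central ~95% of runs
        print(f"ensemble 8-yr change spread: {lo:+.2f} to {hi:+.2f} degC")
        # An observed 0.03 degC change easily falls inside such a spread,
        # which is the point about weather vs. climate on short horizons.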

    Comment by Tom Fiddaman — 8 Sep 2005 @ 1:49 PM

  58. “It seems possible that carbon fertilization plus warming considerably accelerate growth of ocean biota.” My short time in the scientific community was in marine biology, so I will attempt to respond to this.

    Anthropogenic climate change is very unlikely to have a positive effect on ocean ecosystems.

    First, carbon fertilization
    The majority of photosynthesis occurring in the oceans is done by phytoplankton, so they would be the organisms fertilized. Additional CO2 would not have a fertilizing effect, because CO2 is not a limiting factor in phytoplankton growth and reproduction. In most areas of the oceans there is an excess of CO2 to support phytoplankton but a lack of other nutrients. Iron is often the key one, thus leading to iron fertilization proposals.
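
    The limiting-nutrient logic here is essentially Liebig’s law of the minimum; a schematic sketch (the supply and requirement numbers are invented, purely for illustration):

        # Liebig's law of the minimum, schematically: growth tracks the
        # scarcest input, so adding CO2 does nothing while iron is the
        # bottleneck. All numbers below are invented.
        def growth_limit(supplies, requirements):
            """Relative growth rate set by the most limiting resource."""
            return min(supplies[k] / requirements[k] for k in requirements)

        requirements = {"CO2": 1.0, "nitrate": 1.0, "iron": 1.0}
        ocean = {"CO2": 50.0, "nitrate": 3.0, "iron": 0.2}

        print(growth_limit(ocean, requirements))   # 0.2 -- iron-limited

        ocean["CO2"] *= 2                          # "carbon fertilization"
        print(growth_limit(ocean, requirements))   # still 0.2: no effect

        ocean["iron"] *= 2                         # iron fertilization
        print(growth_limit(ocean, requirements))   # 0.4 -- now it responds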

    Even if CO2 did have a fertilizing effect, increasing primary productivity (plants, in this case mostly algae) could have a negative effect on ecosystems. Anthropogenic phosphate and nitrogen have allowed algae to overgrow corals and have harmed coral reefs. In enclosed estuaries, anthropogenic urban and agricultural runoff has caused explosive algae growth followed by a population crash (eutrophication) that has destroyed local ecosystems. In addition, carbon fertilization will not affect all species equally, and this could lead to cascading effects through an ecosystem that are difficult to predict and could be harmful.

    The increasing amounts of CO2 in the oceans are changing the chemistry of the oceans and acting as a pollutant. There has been a drop in the pH of the oceans caused by anthropogenic CO2. The effects of this are only beginning to be understood, but the preliminary data are not good. See the Acid Ocean post here on RC.

    Second, Warming Accelerating Biota Growth
    There have been short-term local cyclic temperature changes in the oceans (NAO and ENSO), and ecosystems have not reacted to warming by individual species growing faster. Instead there have been changes in the relative abundances of individual species, due primarily to geographical shifts. Ecosystems and species vary greatly in their ability to move and have at times been unable to. This has resulted in widespread destruction of Indo-Pacific coral reefs during recent ENSO events.

    The predicted AGW changes differ from these natural changes because of AGW’s predicted greater geographical range, duration, and amount of warming. The scope of these changes could, and in the case of some ecosystems like coral reefs probably will, overwhelm their ability to respond by shifting their geographic ranges. There is a very real possibility of species being driven to local or global extinction by AGW. There is a large body of fossil data of species that have become extinct due primarily to climate changes.

    Some of the warmest areas of the oceans are the biologically least productive, the ocean’s equivalent of deserts. As with fertilization, warming temperatures will not affect all species equally, and this could lead to cascading effects through an ecosystem that are difficult to predict and could be harmful.

    Good sources on this are the scientific literature summaries from Pew
    http://www.pewclimate.org/global-warming-in-depth/environmental_impacts/reports/

    Comment by Joseph O'Sullivan — 8 Sep 2005 @ 6:13 PM

  59. Regarding 56 and 57: Wow, that looks like a good discussion. I don’t feel that I have time to understand it. Can (at least) one of you summarize your positions? I think 56 is saying that sinks are not approaching saturation, so, despite the continually increasing CO2 concentrations, AGW won’t be so bad for very long. I think 57 says that the time before that extra anthropogenic CO2 gets taken up by the ocean will be very long, so AGW won’t be ameliorated very much on a meaningful time scale even if sinks are essentially infinite.

    Just to support #58 (IMO a good summary, or at least I understand it) — carbon fertilization certainly doesn’t seem to be moving up any food web and helping fisheries thus far!

    And finally, apparently there’s a new Nature paper coming out or already out suggesting that a lot of CO2 has been released from soils (the article doesn’t say how much has been taken up by terrestrial plants…)
    http://www.enn.com/today.html?id=8740
    I’ll be interested to read about how this study meshes with this earlier post by Corinne Le Quere on RC http://www.realclimate.org/index.php?p=160

    Comment by Steve Latham — 8 Sep 2005 @ 7:38 PM

  60. Re 58: How does the following factor in?

    http://earthobservatory.nasa.gov/Newsroom/MediaAlerts/2005/2005081820000.html And now we find the most abundant microbe in the ocean “plays a huge role in the cycling of carbon”, and is “a major consumer of the organic carbon in the ocean”. It is almost certain that the full role of SAR11 is not well understood, and is not factored into GCMs, nor into the IPCC estimates of ocean uptake. It might be a major mechanism for the accelerated C uptake by the ocean already noted. Murray

    Comment by Murray duffin — 8 Sep 2005 @ 9:47 PM

  61. Re #60
    Why do you think SAR 11 might be “a major mechanism for the accelerated C uptake by the ocean”?

    The role of bacteria, especially very small ones, in ocean ecosystems is something that really started to be examined in the late eighties, so most of this science is fairly new.

    SAR 11 (Pelagibacter ubique, strain HTCC1062) is a heterotroph: it consumes organic carbon and is not able to use the inorganic carbon from CO2 that dissolves in ocean waters. These bacteria are still dependent on photosynthetic organisms to convert the inorganic carbon from CO2 into organic carbon.

    Since SAR 11 cannot take up the CO2 itself, can you posit a mechanism or pathway from photosynthetic organisms to SAR 11 that would account for the increased uptake?

    Comment by Joseph O'Sullivan — 8 Sep 2005 @ 11:36 PM

  62. Re #59:

    You mention a new Nature paper on CO2 release from soils. A summary can be found at
    http://www.silsoe.cranfield.ac.uk/nsri/research/carbonloss.htm

    The summary includes such unsupported statements as “The rate of loss increased linearly with soil carbon content, and this relationship held across all forms of land use as shown in the next figure, suggesting a link to climate change.” No such suggestion is actually shown. Maybe the full study is more revealing.

    Also, the author states that “with the present data, we cannot say where the carbon has gone, whether to air or drainage.” So in other words, this paper may or may not be relevant to climate change; we simply do not have enough information.

    Another consequence not mentioned in the summary is how this extra natural CO2 over the last 25 years (if it went to the atmosphere as he seems to suppose) means that anthropogenic CO2 had that much less of an impact. The rate of loss is estimated to be about 8.6% of industrial emissions.

    Comment by J. Sperry — 9 Sep 2005 @ 10:47 AM

  63. Thanks J. Sperry for the link and discussion. I hope to get to a library and see the full study sometime this month.

    Comment by Steve Latham — 9 Sep 2005 @ 4:26 PM

  64. “Retreating glaciers, melting permafrost threaten Arctic lifestyle”:

    http://us.cnn.com/2005/TECH/science/09/12/greenland.arctic.thaw.ap/index.html

    Comment by Stephen Berg — 13 Sep 2005 @ 8:43 AM
