RealClimate

Comments

  1. good job
    Thank you

    Comment by Pepe Larios — 1 Jun 2008 @ 11:27 AM

  2. Nice!

    Comment by Figen Mekik — 1 Jun 2008 @ 12:22 PM

  3. I would count Roger Pielke Jr. out. Take a look at his latest post.

    Thompson et al. do not provide a time series estimate on the effects of the bias on the global temperature record, but Steve McIntyre, who is building an impressive track record of analyses outside the peer-review system, discussed this topic on his weblog

    As you can see, Pielke Jr. is not afraid to bring in the heavy thinkers when it’s required.

    Comment by Thom — 1 Jun 2008 @ 12:45 PM

  4. Gavin,

    I would be careful if I were you. Citing Roger Pielke Jr. only gives him respectability, creates doubt, and results in complacency.

    The Arctic sea ice is rapidly disappearing on a time scale of years rather than decades, and it is becoming increasingly likely that the Greenland ice sheet will also collapse rapidly. Droughts are contributing to food shortages, and destructive cyclones and hurricanes seem to be increasing. Wildfires, a symptom of desertification, are being reported from Alaska to California, and from the north Mediterranean coast of Europe to the main populated regions of Australia. We should be concentrating on the dangers happening now and in the future, not allowing ourselves to be distracted by minor inconsistencies in the past with no relevance to the future.

    Arguing about a 0.1C difference in SSTs during 1945 is totally irrelevant and only leads to a false optimism that the main problem of cutting fossil fuel use by 90% can be left for the next generation. The Arctic sea ice will have vanished long before then :-(

    Cheers, Alastair.

    Comment by Alastair McDonald — 1 Jun 2008 @ 2:33 PM

  5. Re #4. Although you are right, Alastair, you cannot keep optimists down, and the world abounds with people in positions of influence who are optimistic about everything. Realism ain’t their thing, but optimism is: if the world ain’t as they want it to be, then change it and make it that way.

    Comment by pete best — 1 Jun 2008 @ 3:45 PM

  6. Sorry for being off-topic. I read a number of alternative financial sites. Along with being contrarian about economic issues, most posters reject AGW. One in particular is TickerForum. The site is owned by Karl Denninger, who posts as Genesis. In a thread from a couple weeks ago, he has posted what seem like softball questions. If my math were better I would debate him myself. If anyone has the energy to get into it with another denier I think it would be beneficial if you could change his mind as he is fairly influential with the people who read his site.

    A link to the thread and his questions is below.
    You will need to register with the site to post.

    http://www.tickerforum.org/cgi-ticker/akcs-www?post=44720&page=3

    —————————————–
    1. The percentage of contribution for each greenhouse gas out of the whole.

    2. The measurement for man’s contribution to CO2, expressed as a scientific measurement (pick your units but they must match the first answer), and that measurement must include an uncertainty.

    We’ll ask some tougher questions (which I suspect you won’t like) once you produce the first two answers.

    ——————————————
    Again – how much of the CO2 generated in a given year is man-made?

    In scientific form please.

    Same for methane.
    —————————————–

    [Response: It's a set-up, since the questions are ill-posed, but he thinks he knows the answer (and he will be wrong). For the current situation CO2 provides about 20% of the greenhouse effect (water vapour is about 50% and clouds about 25%; ozone and other minor gases make up the rest) (defined as the net reduction in the difference between longwave emitted from the surface and the longwave emitted to space; uncertainties are a few percent maybe). For CO2, the anthropogenic component is about 27%, for methane it's 60%, for N2O it's about 13%, for CFCs it's 100% (uncertainties of a couple of percent). His second set of questions are either irrelevant or make no sense. Last year, we emitted about 9 GtC, and increased concentrations by ~2 ppm CO2. Your chap will most likely respond with a reference to a website run by mhieb (see here for more debunking). - gavin]

    Comment by Geoffrey — 1 Jun 2008 @ 4:06 PM
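
    [Editor's note: for readers who want to check the arithmetic in the response above, here is a minimal back-of-envelope sketch. The ~2.13 GtC-per-ppm conversion is the standard factor for the present atmosphere; the airborne-fraction framing and all names below are illustrative additions, not part of the original reply.]

    ```python
    # Back-of-envelope check of the numbers in the response to comment 6.
    # Assumes the standard conversion of ~2.13 GtC of atmospheric carbon per ppm CO2.

    GTC_PER_PPM = 2.13           # gigatonnes of carbon per ppm of atmospheric CO2

    emissions_gtc = 9.0          # approximate anthropogenic emissions last year, GtC
    observed_rise_ppm = 2.0      # observed annual CO2 rise, ppm

    # If all emitted carbon stayed airborne, the annual rise would be larger:
    potential_rise_ppm = emissions_gtc / GTC_PER_PPM            # ~4.2 ppm

    # The "airborne fraction" is the share that stays in the atmosphere;
    # the rest is taken up by the oceans and the land biosphere.
    airborne_fraction = observed_rise_ppm / potential_rise_ppm  # ~0.47

    # Anthropogenic share of the current CO2 burden, using the round numbers
    # quoted in comment 16 below (pre-industrial ~280 ppm, present ~385 ppm):
    anthro_share = (385 - 280) / 385             # ~0.27, i.e. ~27% of the total
    rise_over_preindustrial = (385 - 280) / 280  # ~0.375, i.e. ~37% above 280 ppm

    print(f"rise if all emissions stayed airborne: {potential_rise_ppm:.1f} ppm")
    print(f"implied airborne fraction:             {airborne_fraction:.2f}")
    print(f"anthropogenic share of current CO2:    {anthro_share:.0%}")
    print(f"rise over pre-industrial:              {rise_over_preindustrial:.1%}")
    ```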

  7. I respond to this post here:

    http://sciencepolicy.colorado.edu/prometheus/archives/climate_change/001446real_climate_on_mean.html

    [Response: Roger, when you've calmed down please do let us know when you: a) work out the difference between the ocean and the global temperature, and b) realise that there is difference between getting criticised for asking questions, and for getting the answer wrong (the latter being the central point here). Thanks! - gavin]

    Comment by Roger Pielke, Jr. — 1 Jun 2008 @ 7:04 PM

  8. Re #5 Hi Pete,

    I felt I was being rebuked when you wrote:

    … if the world aint as they [optimists] want it to be then change it and make it that way.

    I am trying to change it, but no one listens. Besides it was Professor Martin Parry who wrote “A curious optimism …” not me.

    But I am no longer optimistic about the future. Only tonight I heard a Russian scientist say about global warming: “People claim that there is not enough evidence to act. I say there is not enough time.”

    Optimism is fine when things are going well. Now is the time for realism :-(

    Cheers, Alastair.

    Comment by Abbe Mac — 1 Jun 2008 @ 7:06 PM

  9. Why is it that when I read this posting I think of an episode of “Yes Minister”? The minister asks Sir Humphrey what to do, since the figures used by the ministry have been shown to be clumsily and obviously incorrect. Sir Humphrey responds by noting that while the figures could be considered incorrect, they are incorrect in a very complicated and learned way; that while strictly speaking they are wrong, they are wrong in a very educated way.

    Comment by Tom Gray — 1 Jun 2008 @ 7:29 PM

  10. Re #7

    Just as I predicted. Roger Pielke is luring you into a slanging match with inviting remarks like “There is a lot of science and civil discussion there, with a healthy mix of assorted experts and a range of ordinary folks” but absolutely no science at all in his post at: http://sciencepolicy.colorado.edu/prometheus/archives/climate_change/001446real_climate_on_mean.html

    HTH,

    Cheers, Alastair.

    Comment by Abbe Mac — 1 Jun 2008 @ 7:29 PM

  11. >luring
    That’s like trolling, but with simulated rather than edible bait.

    Comment by Hank Roberts — 1 Jun 2008 @ 7:57 PM

  12. Hi Gavin-

    I’d be happy to work from a proposed adjustment directly from you, rather than rely on the one proposed by Steve McIntyre or the one you point to from The Independent.

    Thompson et al. write: “The new adjustments are likely to have a substantial impact on the historical record of global-mean surface temperatures through the middle part of the twentieth century.”

    It is hard to see how temperatures around 1950 can change “substantially” with no effect on trends since 1950, but maybe you have a different view. Let’s hear it. Give me some better numbers and I’ll use them.

    [Response: Nick Rayner, Liz Kent, Phil Jones etc. are perfectly capable of working it out and I'd suggest deferring to their experience in these matters. Whatever they come up with will be a considered and reasonable approach that will include the buoy and drifter issues as well as the post WW-II canvas bucket transition. Second guessing how that will work out in the absence of any actual knowledge would be foolish. - gavin]

    Comment by Roger Pielke, Jr. — 1 Jun 2008 @ 8:24 PM

  13. One of the things I like about RealClimate is how easy the articles are to read and understand, compared to the slanted ramblings on the denialist-delayer blogs. I went and looked at that post of Roger’s – what a mishmash! And of course he reaches into his bag of tricks and …

    A. Starts his plot halfway through the data record so that he can argue that there has been a reduction of 20th century warming by 30% – when, had he started the plot before the period in question, he would not have been able to make that assertion (a small demonstration follows this comment).

    B. Takes an adjustment to sea temperatures in a defined period and implies that it impacts the global mean temperatures trend estimates over the entire twentieth century.

    C. He attributes the plot to RealClimate!!!

    I’m fascinated that he is now accusing RealClimate authors of writing in a ‘passive’ manner, like it is some kind of devious technique.

    I suspect that people relying on such sources must think the science is not reliable because it appears to be confused. But it’s the denialist-delayers who are working hard to make it seem so.

    —–
    It would actually be really interesting to see a series of plots that show how the datasets of measured sea and land temperatures have evolved over time as they have been improved with adjustments such as this.

    Comment by Craig — 1 Jun 2008 @ 8:57 PM
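
    [Editor's note: Craig's point A above is easy to demonstrate on synthetic data: a linear trend fitted over a sub-period can differ substantially from the full-record trend, depending on where the fit starts. A minimal sketch; the series below is invented for illustration and is not real temperature data.]

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Invented century-long series: a steady 0.7 C/century warming plus an
    # artificial mid-century dip, loosely mimicking the shape under discussion.
    years = np.arange(1900, 2001)
    signal = 0.007 * (years - 1900)
    dip = np.where((years > 1940) & (years < 1970), -0.15, 0.0)
    series = signal + dip + rng.normal(0.0, 0.05, years.size)

    # The fitted linear trend depends strongly on the chosen start year:
    for start in (1900, 1945):
        mask = years >= start
        slope = np.polyfit(years[mask], series[mask], 1)[0]
        print(f"trend fitted from {start}: {100 * slope:.2f} C/century")
    ```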

  14. Question as to how this might relate to GISTEMP.

    GISTEMP uses Smith et al (1996) and Reynolds & Smith (1994) for SST, but those papers document SST interpolation methods. Who performs the underlying corrections used for the in situ (pre 1982) data in the current GISTEMP analysis? The satellite sst measurements (1982+) are said to be calibrated using “quality controlled” buoy data, so I presume corrections are rarely necessary with them.

    Thanks!

    [Response: GISTEMP uses HadISST up to 1982 and Reynolds thereafter - there may still be issues in melding the two approaches because of small potential offsets between the buoys and other methods of determining SST. - gavin]

    Comment by cce — 1 Jun 2008 @ 9:22 PM

  15. Cautionary as Nature’s account of this tale of a tub may be, it is hopelessly outclassed by this week’s Culture of Modeling news:

    La Scala has commissioned a full length opera based on
    An Inconvenient Truth.

    Stay tuned for the encore :
    http://adamant.typepad.com/seitz/2008/05/hot-ticket.html

    Comment by Russell Seitz — 2 Jun 2008 @ 2:08 AM

  16. re Response to #6

    “For CO2, the anthropogenic component is about 27%”

    Pre-industrial CO2 c.280ppmv
    Present CO2 c.385ppmv

    Shouldn’t your 2 be a 3, Gavin? i.e.

    “For CO2, the anthropogenic component is about 37%”

    Otherwise, great. Thanks.

    [Response: No. 105/385 = 27% of the current concentration is anthropogenic. 37% is the rise over pre-industrial levels, but that wasn't what was asked for. - gavin]

    Comment by Slioch — 2 Jun 2008 @ 5:34 AM

  17. Modifying the adjustments will not change the overall 20th century trend. Modifying the adjustments will, however, change the perception of how much we warmed in the first half of the century and how much in the second half, possibly invalidating claims that we are warming faster now. And of course any change to the sea-surface temperature history will affect the global temperature history to a great extent. Most of the Earth’s surface is ocean.

    [edit]

    Comment by Nylo — 2 Jun 2008 @ 5:46 AM

  18. You know I was just looking through some old papers and came across a few records… worth reviewing now, I think:

    “In 1989, not long after James Hansen’s highly publicized testimony before Congress and shortly after the first meeting of the UN’s Intergovernmental Panel on Climate Change, the Burson Marsteller PR firm created the Global Climate Coalition (GCC). Chaired by William O’Keefe, an executive for the American Petroleum Institute, the GCC operated until 1997 out of the offices of the National Association of Manufacturers. . .

    “The GCC has also used “Junkman” Steven Milloy’s former employer, the EOP Group, as well as the E. Bruce Harrison company, a subsidiary of the giant Ruder Finn PR firm. Within the public relations industry, Harrison is an almost legendary figure who is ironically considered “the founder of green PR” because of his work for the pesticide industry in the 1960s, when he helped lead the attack on author Rachel Carson and her environmental classic Silent Spring…”

    “Industry’s PR strategy with regard to the global warming issue is also eminently practical, with limited, realistic goals. Opinion polls for the past decade have consistently shown that the public would like to see something done about the global warming problem, along with many other environmental issues. Industry’s PR strategy is not aimed at reversing the tide of public opinion, which may not in any case be possible. Its goal is simply to stop people from mobilizing to do anything about the problem, to create sufficient doubt in their minds about the seriousness of global warming that they will remain locked in doubt and indecision…”

    “In 1991, a corporate coalition composed of the National Coal Association, the Western Fuels Association, and Edison Electrical Institute created a PR front group called the “Information Council for the Environment” (ICE) and launched a $500,000 advertising and public relations campaign to, in ICE’s own words, “reposition global warming as theory, (not fact)…”

    “To boost its credibility, ICE created a Scientific Advisory Panel that featured Patrick Michaels from the Dept of Environmental Sciences at the University of Virginia; Robert Balling of Arizona State University, and Sherwood Idso of the U.S. Water Conservation Laboratory. ICE’s plan called for placing these three scientists, along with fellow greenhouse skeptic S. Fred Singer, professor emeritus of environmental sciences at the University of Virginia, in broadcast appearances, op-ed pages, and newspaper interviews. Bracy Williams & Co., a Washington-based PR firm, did the advance publicity work for the interviews…”

    “…According to Peter Montague of the Environmental Research Foundation, “S. Fred Singer is now an ‘independent consultant’ for companies including ARCO, Exxon Corporation, Shell Oil Company, Sun Oil Company, and Unocal Corporation.” Rather than conducting research, Singer “spends his time writing letters to the editor and testifying before Congress….”

    “In April 1998…the New York Times reported on yet another propaganda scheme developed by the American Petroleum Institute. Joe Walker, a public relations representative of the API, had written an eight-page internal memorandum outlining the plan, which unfortunately for the plotters was leaked by a whistle-blower. Walker’s memorandum called for recruiting scientists “who do not have a long history of visibility and/or participation in the climate change debate.” Apparently, new faces were needed because the industry’s long-standing scientific front men – Balling, Michaels, Idso, and Singer – had used up their credibility with journalists.”

    -”Global Warming is Good For You”, Ch. 10, in “Trust Us, We’re Experts” by Rampton & Stauber (2001)

    Well- that’s interesting. I wonder who they decided to go with? I imagine it’s not too hard to find out – just look at the most widely promoted and quoted “climate science skeptics” in the leading press outlets over the past 8 years or so.

    Comment by Ike Solem — 2 Jun 2008 @ 8:03 AM

  19. RE: The article says: “The biggest transitions in measurements occurred at the beginning of WWII between 1939 and 1941 when the sources of data switched from European fleets to almost exclusively US fleets (and who tended to use engine inlet temperatures rather than canvas buckets).”
    In the Pacific, US shipping dominated after the attack on Pearl Harbor, but in the North Atlantic most tonnage was under British, Norwegian and neutral flags. The specific war situation concerning SSTs is explained in papers from 1997 and 1998: http://www.oceanclimate.de/English/Pacific_SST_1997.pdf , http://www.oceanclimate.de/English/Atlantic_SST_1998.pdf

    Comment by ab — 2 Jun 2008 @ 8:55 AM

  20. There is no hint in Rayner06 of a reversion to canvas buckets after WWII. There is a quantitative effect of this error, both on global average calculations up to the 1970s and on the uncertainty of that number. Your post does not calculate that effect. It would appear that the aerosol explanation for cooling into the 1970s has to be revisited; is there a quantitative analysis of that (i.e. including actual aerosol measurements)? Another correlation to be investigated is solar, since (non-quantitatively speaking) warming started more in the 40s than in the 70s as previously thought, which is more in line with solar trends (the modern maximum).

    [Response: Not so - the quoted section clearly refers to that. As for the 40s-70s cooling, this is still seen in the land data and is not tied to SST changes in 1945. The first cut at the revisions linked above has effectively the same match to the model trends as before (maybe a little better) and so no revisions to the models nor to attribution studies are likely. - gavin]

    Comment by Eric (skeptic) — 2 Jun 2008 @ 8:59 AM

  21. In June 2008, Miami artist Xavier Cortada will travel to the North Pole, ninety degrees North (90N), to create site-specific installations exploring our connection to the natural world.

    The work addresses global climate change and will include the reinstallation of the “Longitudinal Installation” and “Endangered World” (originally created in the South Pole, 90S, during January 2007 as part of a National Science Foundation Antarctic Artist and Writers residency).

    The artist will also plant a green flag at the Earth’s northernmost point to encourage the reforestation of native trees in the world below.
    At a time when melting polar sea ice is causing so many to focus on which political power will place its flag over the Arctic, controlling the Northwest Passage shipping lanes and the petroleum resources beneath the sea ice, Miami artist Xavier Cortada has developed a project that engages people across the world below to plant a green flag and native tree to help address global climate change. Reforestation helps prevent the polar regions from melting.

    Cortada will plant a green flag at the North Pole when he arrives there on June 30, 2008. On that same day folks from around the world will be asked to also plant a green flag and native tree in their community.

    Miami artist Xavier Cortada created Native Flags as an urban reforestation project to help restore native habitats for plants and animals across South Florida. Launched last year at the Miami Science Museum, Native Flags now calls on individuals globally to join the effort. The conspicuous green flag serves as a catalyst for conversations with neighbors and a call to action to help rebuild our native tree canopy. Community leaders can model the behavior by planting a native tree and green flag at their science centers and city halls.

    To learn about this please visit the artist’s website at http://www.xaviercortada.com.

    Comment by anonymous — 2 Jun 2008 @ 9:04 AM

  22. Gavin, wouldn’t the difference between northern and southern hemisphere trends (e.g. http://cdiac.ornl.gov/trends/temp/jonescru/graphics/nhshgl.jpg) indicate a potential SST sampling error, especially right after WWII?

    [Response: Why? Hemispheric trends can differ due to different forcings, different thermal inertia, different impacts of ocean circulation change etc. The sampling issue is much worse in the SH and so the uncertainty there is greater, but you can't automatically associate a hemispheric difference with an artifact. - gavin]

    Comment by Eric (skeptic) — 2 Jun 2008 @ 9:20 AM

  23. OK, this is a question only about the “blog” part of this topic.
    We need help over at Dot Earth, again.
    Please see this comment:

    #60. June 2nd, 2008, 7:28 am

    Tenney,

    The graphs are of actual measurements, so it doesn’t matter what our opinions are of Dr. Spencer.

    Just exactly how has Dr. Spencer been discredited?

    The most important greenhouse gas, water vapor, overwhelms all the others combined, including methane. Apparently, the water vapor is self regulating, more than offsetting all the other greenhouse gases combined. Simple precipitation counterbalances the thermal forcing of much of the growth of CO2 and others.

    Why did the globe gradually warm then in the latter part of the 20th century? Increased solar activity.

    Why is the globe cooling now? Decreased solar activity, with the current 23rd solar cycle yet to end.

    It is only a matter of time before the thermal momentum of the latter part of the 20th century is reversed, and the northern polar cap begins to cool back down, reforming the ice. The southern polar region has already been cooling for several years, with an increasing ice cover.

    By the way, did you catch this story?

    http://www.theglobeandmail.com/servlet/story/LAC.20080524.TRIPPING24/TPStory/

    The Northwest Passage is not quite as easy to navigate as we have been led to believe.

    Eppure, si rinfresca (“And yet, it cools”)

    — Posted by Jack Simmons

    That comment was in response to my comment #56, a response to his comment #54.

    The link to the blog thread is this one:

    http://dotearth.blogs.nytimes.com/2008/05/29/white-house-poor-face-health-risks-from-warming/

    I would appreciate your help because you guys are so much better at this than I could ever think about being.

    Thank you.

    [Response: Your correspondent can't possibly believe that climate sensitivity is zero (due to some magical stabilisation mechanism) and also claim that the sensitivity to solar is enough to explain the 20th C trend. Either water vapour is stabilising or it isn't (and it isn't by the way - as shown by the water vapor changes during ENSO, after Pinatubo, the long term trends etc. - see Soden et al, 2001;2003;2004). And why would anyone think that the NWP is navigable easily in May? Even in 2007, the opening was in the second half of August... more FUD. - gavin]

    Comment by Tenney Naumer — 2 Jun 2008 @ 11:19 AM

  24. The most revealing item is Gavin’s “Colocated Anomalies” graph showing the relationship between corrected SST (HadSST2, in red) and marine air temperatures (NMAT, in green). The air temperatures were not susceptible to bucket or engine inlet artefacts. The first thing that stands out is that the mid-century aberrations are not merely a “dip” after 1945, but also a peak that preceded it. Here, both curves coincide during the ascending and part of the descending limbs of the peak, and the upward and descending limbs may be attributable, at least in part, to El Nino years between 1939 and 1942, a transient downturn in the AMO (Atlantic Multidecadal Oscillation) warm phase, and a more persistent shift from warm to cool in the PDO (Pacific Decadal Oscillation) in the mid-1940s, among other natural events.

    Strikingly, both curves show the post-1945 “dip”, with the main difference residing in the deeper decline of the SST curve, and its more sudden drop. (The differences are even better seen in IPCC AR4, WGI, Chapter 3, p.246, Fig. 3.4a).

    It seems plausible, therefore, to conclude that most of the peak and dip were not caused by SST measurement artefacts, and presumably reflected natural events, with the artefactual component mainly limited to the modest excess in depth of the SST curve between 1945 and about 1956, rather than the entire dip.

    If this interval is corrected, it appears that the 1945-1975 interval of the SST curve may become slightly flatter, consistent perhaps with a slightly greater aerosol masking of greenhouse gas forcing than previously estimated. Probably more important, most sections of the twentieth century warming curve will not be substantially affected, and should not require significant reinterpretation.

    Comment by Fred Moolten — 2 Jun 2008 @ 11:51 AM

  25. Re: comment #23

    Dear Gavin,

    Thanks for your response. Of course he does not believe what he is writing — he is one of those industry-paid denialists that show up on Dot Earth to confuse the uninitiated. I have to use layman’s language to explain to other readers where he is fantastically incorrect, but as I am not a scientist, it is not so easy for me to do that.

    Comment by Tenney Naumer — 2 Jun 2008 @ 12:06 PM

  26. Tenney, your correspondent not only does not believe what he is writing, he doesn’t understand it. It is pure gobbledygook. Water vapor as a stabilizing (negative) feedback–what happens to water content as temperature goes up? Moreover, water vapor peters out at cloud-top level, while CO2 stays well mixed into the stratosphere. Then there is the relative lifetime of H2O and CO2–days versus hundreds of years.
    This guy is just parroting back what he’s heard or read somewhere else, and he’s not even doing a particularly good job of parroting. Go back and read “A Saturated Gassy Argument” and Gavin’s posting about greenhouse forcing in six easy steps. Understand those and you’ll blow this guy out of the water.

    Comment by Ray Ladbury — 2 Jun 2008 @ 12:29 PM

  27. #23, Tenney, I would like to read that Globe and Mail story (the link is a dud). Living next to the Northwest Passage, I must say that if it implies in any way that the passage is just as hard to navigate as at any previous time during the shipping season, it is absolute hogwash. It might have omitted that it is also easier to snowmobile on the passage during winter; reason: no old ice, virtually none in sight near Arctic communities. RC press rebuttals should be syndicated in every news outlet out there. Correct interpretations of climate science are regularly mangled, to the point where I get Arctic visitors, some of them journalists, who regularly quote bad science from misleading news sources; newspaper stories are treated like science journals, especially quoted “peer-reviewed” news stories, namely that 10-year cooling German model forecast. Like Alastair says, there is no time to trivialize how serious the disappearance of old multi-year ice is, but it seems that this mega-story gets less attention than a Mars landing. A rapid response to immediately kill mischievous news stories is apparently not quick enough. Syndicate, guys!

    Comment by Wayne Davidson — 2 Jun 2008 @ 12:36 PM

  28. #23 re. Tenney Naumer “The Northwest Passage is not quite as easy to navigate as we have been led to believe.”

    Well, according to a Science journal article it was navigable enough for two “huge dry docks” to be taken through to the Bahamas in 1999:

    http://www.sciencemag.org/cgi/content/full/291/5503/424?ck=nck

    Comment by Richard Ordway — 2 Jun 2008 @ 12:41 PM

  29. > navigable

    All I can see is the Abstract, here:

    http://www.sciencemag.org/cgi/content/full/291/5503/424

    I’d speculate they used icebreakers.
    Anyone know for sure?

    “… In 1999, Russian companies sent two huge dry docks to the Bahamas through the usually unnavigable Northwest Passage ….”

    Science 19 January 2001:
    Vol. 291. no. 5503, pp. 424 – 425
    DOI: 10.1126/science.291.5503.424

    Prev | Table of Contents | Next
    News Focus
    ECOLOGY:
    Arctic Life, on Thin Ice
    Kevin Krajick

    Field observations from the Beaufort Sea to Hudson Bay are suggesting that the food web in the Arctic Ocean is ailing, causing many species to flounder as a result of the warming environment. Sea ice in the Arctic, on which arctic animals hunt, rest, and reproduce, now covers 15% less area than it did in 1978; it has thinned to an average of 1.8 meters, compared to 3.1 meters in the 1950s. If this trend continues, in 50 years the sea ice could disappear entirely during summers–possibly wiping out ice algae and most other organisms farther up the food chain.

    Comment by Hank Roberts — 2 Jun 2008 @ 1:29 PM

  30. I’m sorry, but I don’t think the contributors to this website fully appreciate how bad this “bucket” saga really is. As someone who was using a bucket to obtain water samples on a UK-registered survey vessel during the years 1981/2, I can testify that the quality of data obtained was not high – be it a wooden, canvas or plastic bucket that was employed. Even if there was a standardised procedure for measurement (and it’s my understanding there was no universal procedure), I suspect those procedures would not have been adhered to under shipboard conditions. It was bad enough slinging a bucket over the side in the North Sea; I would hate to have been the deckhand assigned the job in the South Atlantic. Working at sea, especially in “weather”, has to be experienced to be appreciated.

    Further, conscientiously done work could be undermined by slight positional changes by the ship; a vessel could move from a warm current to a cold current within a few hundred yards, and the temperature changes could be of the order of degrees. Off the Outer Banks, where the Gulf Stream veers off into the Atlantic, is an example. Frankly, when sea temperature measurements were being taken using buckets, few envisaged those measurements would be read as precision readings. I would suggest that much of the data collected had error margins of +/- 2C.

    This being the case, I believe the whole area of oceanic temperature measurements should be comprehensively re-examined.

    Comment by Martin J — 2 Jun 2008 @ 2:18 PM
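
    [Editor's note: Martin's caution about individual readings is fair, but the key statistical distinction is between random errors, which shrink roughly as 1/sqrt(N) over the very large number of archived observations, and systematic biases (e.g. all canvas buckets reading cool), which never average out; the latter are what the corrections discussed in this post address. A minimal sketch with invented numbers:]

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    true_sst = 15.0      # invented "true" sea surface temperature, C
    n_obs = 100_000      # order of magnitude of a month of global ship observations

    # Random error: each reading off by +/- 2 C (1 sigma), no systematic offset.
    random_only = true_sst + rng.normal(0.0, 2.0, n_obs)

    # The same scatter plus a uniform -0.3 C bucket-style systematic bias.
    with_bias = true_sst - 0.3 + rng.normal(0.0, 2.0, n_obs)

    print(f"mean, random error only: {random_only.mean():.3f} C")    # ~15.000
    print(f"mean, with -0.3 C bias:  {with_bias.mean():.3f} C")      # ~14.700
    print(f"standard error of mean:  {2.0 / np.sqrt(n_obs):.4f} C")  # ~0.006
    ```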

  31. Tenney #25, what Ray said.

    What you should clearly understand here is that the water vapour doesn’t know what warms the air it is in. Your correspondent effectively claims that the water vapour produces a strong negative feedback (in whatever way; who cares) reducing total sensitivity. But this feedback is mediated through tropospheric temperature. If it happens for CO2 radiative forcing, it will also happen for Solar activity -related (or any other) forcing, which thus would be equally ineffective to explain late 20th century warming. This is Gavin’s argument in layman’s terms (I hope).

    …and there is a second impossibility compounding the first one: during the last few decades of accelerating global warming, the Sun has been very precisely monitored from space, and its brightness hasn’t shown any discernible trend. Same for solar activity (sunspots), the interplanetary magnetic field, cosmic rays… so you would have to first explain away a known and understood forcing mechanism (CO2) and then replace it with one that cannot even work because it has the wrong time signature.

    Comment by Martin Vermeer — 2 Jun 2008 @ 2:32 PM
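
    [Editor's note: Martin's argument can be made quantitative with the standard zero-dimensional feedback relation dT = lambda_0 * F / (1 - f): the feedback factor multiplies the response to any forcing, so a strongly stabilising water vapour feedback would damp a solar explanation just as much as a CO2 one. The parameter values in this sketch are illustrative assumptions only.]

    ```python
    # Zero-dimensional feedback arithmetic for comment 31. LAMBDA_0 is roughly
    # the no-feedback (Planck) response; f is a net feedback fraction. Both
    # numbers here are illustrative assumptions, not fitted results.

    LAMBDA_0 = 0.3  # K per (W/m^2)

    def equilibrium_response(forcing_wm2: float, f: float) -> float:
        """Equilibrium warming for a given forcing under linear feedback f."""
        return LAMBDA_0 * forcing_wm2 / (1.0 - f)

    forcing = 3.7  # W/m^2, an illustrative forcing magnitude

    # A net positive feedback amplifies CO2 and solar forcing identically...
    print(f"f = +0.6: dT = {equilibrium_response(forcing, 0.6):.2f} K")
    # ...and a strongly stabilising (negative) feedback damps both identically,
    # so it cannot cancel the CO2 response while leaving a large solar response.
    print(f"f = -1.0: dT = {equilibrium_response(forcing, -1.0):.2f} K")
    ```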

  32. I confess to a little confusion from your post.

    Let’s put the details aside for the moment.

    At some time in the past measurements were predominantly by bucket.

    At some time in the recent past (arbitrarily, say 2000) readings were predominantly by water intake.

    Buckets apparently read colder than inlets by some amount, possibly .3 degrees.

    So the change in the measurement, regardless of the intervening use of different mixes of buckets and other means, overall would be about .3 degrees between the “past” and the “recent past”.

    Is this currently factored out of the trends or not?

    Comment by John Lederer — 2 Jun 2008 @ 2:48 PM

  33. For Richard Ordway, re #28, Google is your friend, albeit not a wise advisor. Tell your friends they did use an icebreaker – and had trouble doing the job:

    http://www.cargolaw.com/presentations_casualties_a.html

    Russian icebreaker M/V ADMIRAL MAKAROV (14,058 gt, built 1975), towing a dry dock to the Bahamas, had towline break on Oct 14 and the dry dock was adrift off eastern Canada in the Cabot Strait between Nova Scotia and Newfoundland. The dry dock, with 15 people on board, was adrift in heavy seas for 2 days before being taken back under tow by the Admiral Makarov on Oct 16. All 15 crew members were reported in good health. (Mon. Oct. 18 1999)

    Comment by Hank Roberts — 2 Jun 2008 @ 3:32 PM

  34. Hank Roberts (29)
    >I’d speculate they used icebreakers.
    Anyone know for sure?

    “…In 1999, the first non-American passage for commercial shipping purposes took place when a Russian company sold a floating dry dock based in Vladivostok. Its new owners decided to move the dock to Bermuda. With the aid of a Russian icebreaker and an ocean-going tug, the dry dock was successfully towed through the Passage…”

    http://www.navyleague.ca/eng/ma/papers/huebert_e.pdf

    Comment by Arch Stanton — 2 Jun 2008 @ 3:35 PM

  35. Re 4 … The Arctic sea ice is rapidly disappearing …

    The latest data by NSIDC for Arctic sea ice extent shows that 2008 ice coverage has fallen to 2007 levels for the end of May:

    http://nsidc.org/arcticseaicenews/

    Also see ice age (Figure 4):

    http://nsidc.org/arcticseaicenews/2008/040708.html

    Comment by pat n — 2 Jun 2008 @ 3:40 PM

  36. P.S. for Richard Ordway:

    http://www.internationalliving.com/var/ezwebin_site/storage/images/in_country_support/other_countries__1/europe/russia/country_archive/09_19_06_russian/36290-1-eng-US/09_19_06_russian.jpg

    It’s a tourist attraction; it used to be a research ship:

    http://www.internationalliving.com/countries/other_countries__1/europe/russia/country_archive/09_19_06_russian

    “… the Kapitan Khlebnikov, a Russian ice breaker from Vladivostok, now under charter to Quark Expeditions…. (a purpose-built ice breaker, as opposed to merely having an ice-strengthened hull) was a working vessel before she became our cruise ship; cabins previously assigned to scientists and technical staff are now passenger cabins. …”

    Comment by Hank Roberts — 2 Jun 2008 @ 3:45 PM

  37. Thank you Gavin, a very informative contribution. It also gives a new meaning to the expression “kick the bucket”. News that SST readings are kaput because of differences in methods of drawing sea water samples is highly exaggerated, to paraphrase Mark Twain.

    Comment by Lawrence Brown — 2 Jun 2008 @ 4:04 PM

  38. John Lederer (32)

    Before WWII the Brits and others were good about measuring SST. They mostly used buckets. During WWII they had better things to do. Few SST measurements were taken during the war, but the Yanks did most of them. They mostly used engine intake measurements, which had a slight warm bias. After the war, others once again began to measure SST using buckets. This caused the Yanks’ (warmer) engine room temps to be “diluted” by the (cooler) bucket temps and caused a largely unexplained drop in (average) SSTs after the war. This paper explains the curious drop after the war.

    SSTs have continued to rise since the war, and surpassed wartime levels in the ’70s. This paper mostly addresses the (dip) artifact from 1945-1970.

    Does that help?

    Comment by Arch Stanton — 2 Jun 2008 @ 4:38 PM
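
    [Editor's note: Arch's explanation of how a changing mix of measurement methods creates a spurious step can be illustrated with a toy calculation; the offsets and fleet fractions below are invented for illustration and are not the actual corrections.]

    ```python
    import numpy as np

    # Toy model of the measurement-mix artifact discussed in comments 32-39:
    # the true SST is held flat while the fraction of engine-inlet vs bucket
    # observations shifts through the war years.

    BUCKET_BIAS = -0.3   # canvas buckets read ~0.3 C cool (order of magnitude)
    INLET_BIAS = 0.0     # treat engine-inlet readings as the reference

    years = np.arange(1935, 1956)

    # Mostly buckets before the war, almost all US engine-inlet readings in
    # 1942-45, then a partial reversion to buckets afterwards (invented mix).
    inlet_frac = np.where(years < 1942, 0.2, np.where(years <= 1945, 0.9, 0.4))

    measured = inlet_frac * INLET_BIAS + (1.0 - inlet_frac) * BUCKET_BIAS

    for y, m in zip(years, measured):
        print(y, f"{m:+.2f}")
    # A spurious ~0.2 C warm step appears for 1942-45 and a sudden drop in 1946,
    # even though the "true" SST never changed.
    ```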

  39. Arch Stanton,

    Thank you very much.

    But in the very recent past (say 1990-2000) I gather measurements were largely by water intake?

    So after the immediate post WWII reversion to buckets, we changed again, presumably more gradually, to intakes.

    So if I look at a trend from say just after WWII (largely bucket) to around 2000 (largely intake), there will be .3 degrees (assuming that is the difference) of warming attributable not to an actual temperature change, but to the switch in measurement method?

    Or am I missing something?

    Comment by John Lederer — 2 Jun 2008 @ 5:11 PM

  40. I’m really curious here as I’ve only read a few dozen of the climate-centric blogs, but I’m under the impression that only a small fraction are written by AGW “deniers” and perhaps 5% of the total would fall into the “skeptic” category? In terms of mainstream media it would seem that only a handful of stories (plus uninformed ranters at FOX) are critical of the basic AGW ideas held so dear here.

    Comment by Joseph Hunkins — 2 Jun 2008 @ 6:06 PM

  41. #18 Ike Solem, #27 Wayne Davidson, #25 Tenney Naumer …

    This might be an opportune time to remind everyone of how organized tobacco’s astroturf “sound science” movement morphed into attacks on science in other areas like global warming, as ably revealed by George Monbiot, among others: http://opinion-nation.blogspot.com/2008/06/sound-science-and-climate-change-or.html

    Whenever I read this stuff I marvel at how little pride some people have.

    Thanks again, RC, for providing some balance – and the ammo to fight back.

    Comment by Philip Machanick — 2 Jun 2008 @ 7:19 PM

  42. Joseph Hunkins, there are plenty of denialist sites on the Web. There are also sites that say the moon landing was a hoax and that allow posters to compare their anal-probing experiences by aliens. There are a few “skeptic” sites, but most of them are not run by scientists, and the “science” presented there is only of value in terms of entertainment. A very few actually raise some valid points, but grossly exaggerate their importance.
    In terms of mainstream media, the proportion of stories that question climate change, or the “balance” of the stories (e.g. presenting “both sides”), is grossly out of proportion to the actual level of scientific consensus. And if you look at scientists who actually publish on climate in peer-reviewed journals, there is pretty much universal agreement that CO2 has contributed significantly to recent warming, and the vast majority say it is responsible for most of it.
    There are many things we do not yet know about climate. The importance of CO2 is not among these things.

    Comment by Ray Ladbury — 2 Jun 2008 @ 7:31 PM

  43. Here’s a list that will include a good many:
    http://www.sourcewatch.org/index.php?title=Category:Front_groups

    +joeduck +climate

    Comment by Hank Roberts — 2 Jun 2008 @ 7:48 PM

  44. John Lederer,

    Ah, I think I understand your question now. It looks more like .4-.5C (but honestly I am extrapolating from a small graph). I will refer you back up to Gavin’s OP, however, and ask you to click on the second graph; this will take you to the Nature blog, Climate Feedback. The first two line graphs are very interesting, although they are somewhat hard to see.

    “a” shows SSTs for the century: uncorrected on top; the bottom has had ENSO and COWL (Cold Oceans Warm Land) reconstructions subtracted from it. The dashed vertical line represents the drop after WWII; the solid vertical lines represent major volcanic eruptions. It looks to me as though, if the drop in temps in 1945 is .3C, then the subsequent rise to today must be around .4-.5C (?)

    Comment by Arch Stanton — 2 Jun 2008 @ 9:07 PM

  45. Perhaps denialism is waning.

    Deep denial is rooted in human nature, which wants to look at the positive and ignore anything that is not causing real pain right now. AGW is uncomfortable news. No one wants to see it. But any science has to lay out all the models, scenarios, and systems analysis that will butt heads with economies, politics and will-power. The crux is an informed populace. Passive denial is like being asleep. Active denial is often traced to selfish monetary interests – carbon industries, etc. And so much recalcitrance is just momentum.

    How does one describe, label and then halt the AGW practices that are too dangerous for civilization? Science can only describe, label, predict and advise. Halting requires political involvement. And the lessons we failed to learn have been repeated. And will be repeated again.

    Sometimes I wish RC would be more political. Physics trumps all eventually, but politics and power rules humans – for now.

    Comment by Richard Pauli — 2 Jun 2008 @ 9:45 PM

  46. Speaking of the media: Charles Krauthammer wrote a run-of-the-mill denialist diatribe in the Washington Post last Friday in anticipation of the Lieberman-Warner Climate Security bill making it to the full Senate this week. I urge you to respond to that column with a letter to the editor, given this current visibility. And with the NASA IG’s report also in the news, the more and higher-profile NASA scientist signatories to that letter, the better.

    Comment by The Wonderer — 2 Jun 2008 @ 9:53 PM

  47. In his response to #23, Gavin argues “Either water vapour is stabilising or it isn’t (and it isn’t by the way…”

    [edit]

    “Cloud climate feedback constitutes the most important uncertainty in climate modelling, and currently even its sign is still unknown. In the recently published report of the intergovernmental panel on climate change (IPCC), 6 out of 20 climate models showed a positive and 14 a negative cloud radiative feedback in a doubled CO2 scenario.”

    Ref: Wagner, Thomas, S. Beirle, T. Deutschmann, M. Grzegorski, and U. Platt, 2008. Dependence of cloud properties derived from spectrally resolved visible satellite observations on surface temperature. Atmospheric Chemistry and Physics Vol. 8, No 9, pp. 2299-2312, May 5, 2008

    [Response: Perhaps if you read what I wrote we'd communicate better. Water vapour feedback is not the same as cloud feedback (clouds are condensate - not vapour). - gavin]

    Please see also:

    “By comparing the response of clouds and water vapor to ENSO forcing in nature with that in AMIP simulations by some leading climate models, an earlier evaluation of tropical cloud and water vapor feedbacks has revealed two common biases in the models: (1) an underestimate of the strength of the negative cloud albedo feedback and (2) an overestimate of the positive feedback from the greenhouse effect of water vapor. Extending the same analysis to the fully coupled simulations of these models as well as to other IPCC coupled models, we find that these two common biases persist.”

    Ref: Sun, De-Zheng, Yongqiang Yu, Tao Zhang, 2008. Tropical Water Vapor and Cloud Feedbacks in Climate Models: A Further Assessment Using Coupled Simulations. Submitted March 2008 to Journal of Climate

    Comment by Timo Hämeranta — 3 Jun 2008 @ 2:34 AM

  48. Re # 4 Alastair McDonald

    You claim

    “Wild fires, a symptom of desertification are being reported from Alaska to California, and from the north Mediterranean coast of Europe to the main populated regions of Australia”.

    You can leave my country, Australia, off the list. Please don’t invent to make a point. Stick to what you know is measured accurately, both as to desertification and fires, and quote your source.

    Comment by Geoff Sherrington — 3 Jun 2008 @ 6:41 AM

  49. Re # 12 Gavin’s response

    But Phil Jones knew early in 2006 that the SST work was wrong, and failed to do much, if anything, about it by the time the IPCC published. Here is what he emailed to me:

    “Geoff,
    First, I’m attaching a paper. This shows that it is necessary
    to adjust the marine data (SSTs) for the change from buckets
    to engine intakes. If models are forced by SSTs (which is one
    way climate models can be run) then they estimate land
    temperatures which are too cool if the original bucket
    temps are used. The estimated land temps are much
    closer to those measured if the adjusted SSTs are used.
    Jonesandmoberg.pdf follandetal200.pdf ”

    So how can you justify writing that

    “Nick Rayner, Liz Kent, Phil Jones etc. are perfectly capable of working it out and I’d suggest deferring to their experience in these matters.”

    [edit]

    [Response: Possibly you did not read those papers. They show that the pre-1941 data, adjusted with the Folland and Parker (1995) correction, are in much better agreement than the raw data (i.e. the correction seen in the first figure above). The changes we are talking about now are much smaller, but yes, they do know what they are doing. Read Rayner et al 2006 for instance. - gavin]

    Comment by Geoff Sherrington — 3 Jun 2008 @ 6:51 AM

  50. Re # 14 Gavin
    [Response: GISTEMP uses HadISST up to 1982 and Reynolds thereafter - there may still be issues in melding the two approaches because small potential offsets between the buoys and other methods of determining SST. - gavin]

    Is it so hard to use realspeak on RealClimate and simply say “At least one is wrong”?

    [Response: Well down here in the reality-based community things don't come in neat packages labelled perfect or useless. Instead everything comes with uncertainties and we use all the available data to get closer to what was likely to be the truth. It's really not that hard a concept. -gavin]

    Comment by Geoff Sherrington — 3 Jun 2008 @ 6:53 AM

  51. Politics will always be the way we get things done when we work with people (from the Greek polis). However, politics can be motivated by either ignorance or science. Eventually, I don’t see any way around it but to build a smarter voter. If that means we have to take American Idol off the air, so be it.
    The current breed of denialists is not motivated by facts, so it will not matter how strong the case is for climate change. However, on a positive note, a recent poll revealed that most young women would rather talk to a guy at a party with the latest green car than to a guy with a sports car.

    Comment by Ray Ladbury — 3 Jun 2008 @ 7:24 AM

  52. Geoff,

    You wrote:
    You can leave my country, Australia, off the list. Please don’t invent to make a point. Stick to what you know is measured accurately, both as to desertification and fires, and quote your source.

    You may wish to declare your Australia a global warming free zone but …

    Understanding autumn rain decline in SE Australia http://www.csiro.au/news/UnderstandingDeclineAutumnRain.html

    “This is the worst bush fire conditions we have ever had in Victoria’s history …”
    http://www.sott.net/articles/show/125695-Wildfires+Burn+in+Southern+Australia

    Australia suffers worst drought in 1,000 years
    Depleted reservoirs, failed crops and arid farmland spark global warming tussle
    http://www.guardian.co.uk/world/2006/nov/08/australia.drought

    Cheers, Alastair.

    Comment by Alastair McDonald — 3 Jun 2008 @ 8:00 AM

  53. Roger Pielke Sr. chimes in –
    http://climatesci.org/2008/06/03/biased-view-of-the-global-average-temperauture-trend-data-at-real-climate/

    [Response: And another example of someone over-extrapolating from the Thompson et al paper in order to promote their pet hobbyhorse. Pretty much underlines the point of my post I think. - gavin]

    Comment by bob — 3 Jun 2008 @ 9:13 AM

  54. Geoff Sherrington (#48):

    Although desertification and fires are related in that they are both exacerbated by increasing average temperatures, I agree that they are not directly related (here in Australia at least) in the manner of fires causing deserts. Desertification on this continent is more the result of land clearance, broad-acre agriculture and grazing in arid and semi-arid regions.

    However, if you want information on the impact of climate change and associated temperature rise on fires, then this report by the CSIRO is a good place to start:

    Hennessy K, Lucas C, Nicholls N, Bathols J, Suppiah R, Ricketts J. 2006. Climate change impacts on fire-weather in south-east Australia. CSIRO, Australia. [PDF file, 2MB, 91 pages].

    Comment by Craig — 3 Jun 2008 @ 10:14 AM

  55. I’ve been watching these NESDIS arctic ice images:

    Here’s the residual ice at the end of last year’s melt season:

    By the start of this year, there was clearly multi-year ice north of Greenland and the Canadian archipelago, and thin ice everywhere else apart from in a narrow band stretching roughly from NE Greenland to Severnaya Zemlya:

    The transpolar drift has steadily moved that band, and much of the rest of the multi-year ice, towards the Fram Strait:

    Today the band is broken. Using this very imperfect observation, I suggest that for the first time in history there is no multi-year ice between the pole and the Fram Strait.

    To get a really clear impression of the process, I recommend opening each of these images in a separate browser tab and using Ctrl-Tab to cycle through them.

    Comment by Nick Barnes — 3 Jun 2008 @ 10:40 AM

  56. Argh. Links broken in my comment #55. Here are the images:

    End of last melt season:
    http://manati.orbit.nesdis.noaa.gov/ice_image21/2007/D07250.NHEAVEH.GIF
    Start of this year:
    http://manati.orbit.nesdis.noaa.gov/ice_image21/D08001.NHEAVEH.GIF
    Movement earlier this year:
    http://manati.orbit.nesdis.noaa.gov/ice_image21/D08050.NHEAVEH.GIF
    http://manati.orbit.nesdis.noaa.gov/ice_image21/D08100.NHEAVEH.GIF
    Today’s image:
    http://manati.orbit.nesdis.noaa.gov/ice_image21/D08154.NHEAVEH.GIF

    Comment by Nick Barnes — 3 Jun 2008 @ 12:45 PM

  57. RE: #30

    What type of thermometers were used for these measurements? I can’t imagine a merchant vessel using high-quality scientific types. How were these calibrated? Or are there special thermometers for marine service?

    Comment by Harold Pierce Jr — 3 Jun 2008 @ 2:38 PM

  58. Re # 57 Harold Pierce, Jr.

    Interesting question. I am curious to know how the Navy’s methods during war time (or even in rough seas) compare with the meticulous methods used by oceanographers (Woods Hole oceanographers, at least):

    Nansen-Bottle Stations at the Woods Hole Oceanographic Institution

    Corresponding author: Bruce A. Warren

    Abstract
    Nansen-bottle stations were occupied by ships and personnel of the Woods Hole Oceanographic Institution from 1931 to about 1981. Most of these data are in archives, but using them intelligently to depict the state of the ocean and to assess time changes in it requires knowing how the observations were made, what accuracies can be assigned to them, and generally how to approach them. This report describes the evolving methods on Woods Hole stations for measuring temperature, depth of observation, salinity, and dissolved-oxygen concentration, and for determining station position. Accuracies generally improved over time, although estimates from the early years are sparse, and even later there is indefiniteness. Analytical error is to be distinguished from sloppy sample collection and other blunders. The routine for carrying out Nansen-bottle stations, from the late 1950s through the 1970s, is reviewed.

    http://64.233.169.104/custom?q=cache:tW194mlD1s8J:https://darchive.mblwhoilibrary.org/bitstream/1912/1691/3/Nansen_Bottles.pdf+Nansen+Bottle&hl=en&ct=clnk&cd=13&gl=us&client=pub-2456819124576563

    Comment by Chuck Booth — 3 Jun 2008 @ 3:49 PM

  59. Gavin – Since you are willing to listen to those who have worked with the data, I invite you to answer the weblog on Climate Science today: http://climatesci.org/2008/06/03/biased-view-of-the-global-average-temperauture-trend-data-at-real-climate/

    I look forward to you showing that the claim that Real Climate is biased is incorrect.

    [Response: Roger, you are not going to be convinced until we agree with everything you think is important. Since that isn't going to happen, there doesn't seem to be much point in engaging. I'm perfectly willing to accept that you have different priorities and interests than us and would not dream of curtailing your right to blog about any of them. For instance, should I assert that you are biased because you haven't blogged about any of my recent papers? I think not. I would expect the same courtesy from you. - gavin]

    Comment by Roger A. Pielke Sr. — 3 Jun 2008 @ 4:00 PM

  60. RE: #58

    I once saw the term “Kriegsmarine effect”. Presumably it has something to do with the German Navy. Do you know anything about this?

    There is a huge amount of merchant shipping still discharging oily bilge water into the open ocean. Oil on the surface of water would retard its evaporation. Cruise ships can’t do this anymore and have to filter the bilge water to remove the oil before discharging it into the ocean.

    Comment by Harold Pierce Jr — 3 Jun 2008 @ 5:17 PM

  61. Confused by your response, Gavin. You strongly imply that it is not responsible to say these data problems are a big deal. Pielke says they are a big deal, and he provides an analysis to suggest this accounts for a very significant amount of the IPCC warming trend.

    If Pielke is right, are you seriously saying that the finding about SSTs would not be a big deal? Why not address the issue rather than spend all this time asserting it’s not relevant?

    [Response: You are confusing your Pielkes (it's easily done). Jr is the one with the trends based on nothing originally, and based on a little something later. Why I should waste time discussing the implications of hypotheses that are certainly incorrect is unclear. All I said above is that the corrections are likely to be limited to the immediate postwar period (based on the paper itself and discussions with people in a position to know). If so, this has little or no relevance to the detection or attribution of climate change and thus no importance to the main IPCC conclusions. Is it meaningful? Of course, but meaning is not limited to the IPCC AR4 SPM. I suggested a wait and see attitude is more appropriate - others appear to disagree. We'll see who is vindicated when the new analyses are done. - gavin]

    Comment by Joseph Hunkins — 3 Jun 2008 @ 5:30 PM

  62. Re #56: Nick, what’s the URL for selecting those images, and is there any sort of interpretation guide? Thanks.

    Comment by Steve Bloom — 3 Jun 2008 @ 5:34 PM

  63. Re # 60 Harold Pierce

    Kriegsmarine means “war navy”
    (http://en.wikipedia.org/wiki/Kriegsmarine). I don’t know what the Kriegsmarine effect is, but when I googled it I came up with this essay by AGW skeptic, Roy Spencer, of the National Space Science and Technology Center at University of Alabama, Huntsville:

    Spencer Part2: More CO2 Peculiarities – The C13/C12 Isotope Ratio

    http://wattsupwiththat.wordpress.com/2008/01/28/spencer-pt2-more-co2-peculiarities-the-c13c12-isotope-ratio/

    I couldn’t find any mention of Kriegsmarine effect, though I didn’t read it very carefully.

    By the way, Spencer’s first article on this subject (Part 1) was titled: Atmospheric CO2 Increases: Could the Ocean, Rather Than Mankind, Be the Reason?

    Comment by Chuck Booth — 3 Jun 2008 @ 6:03 PM

  64. Gavin:
    All I said above is that the corrections are likely to be limited to the immediate postwar period (based on the paper itself and discussions with people in a position to know). If so, this has little or no relevance to the detection or attribution of climate change and thus no importance to the main IPCC conclusions

    Thanks for this clarification. In fact, I had earlier misread Pielke Sr.’s post as talking about SST anomalies, when in fact he was looking at other things when he suggests the IPCC is running too hot.

    Comment by Joseph Hunkins — 3 Jun 2008 @ 6:15 PM

  65. Steve Bloom @ 62: The images are scatterometry, QuikSCAT daily ice images. You can pick the images yourself from the directory view:

    http://manati.orbit.nesdis.noaa.gov/ice_image21/

    e.g. D08154.NHEAVEH.GIF: year 08, day 154, northern hemisphere average.

    This page provides, and links to, more information about the instruments, data collection, and interpretation:

    http://manati.orbit.nesdis.noaa.gov/cgi-bin/qscat_ice.pl

    Comment by Nick Barnes — 3 Jun 2008 @ 6:56 PM

  66. > Kriegsmarine
    There’s a fellow who will be along any minute to fill you in on his theory. Or you’ll find it at a great many websites — oceanclimate.de and seaclimate.com among others.

    Comment by Hank Roberts — 3 Jun 2008 @ 8:19 PM

  67. Perhaps RealClimate’s “bias” consists of reporting on what the papers actually say, rather than reporting on what they say and twisting it around into some new attack on the IPCC and AGW… a club the Pielkes seem to join any chance they get.

    Otherwise, the authors would not have written this in the abstract:

    //Corrections for the discontinuity are expected to alter the character of mid-twentieth century temperature variability but not estimates of the century-long trend in global-mean temperatures.//

    Comment by Chris Colose — 3 Jun 2008 @ 8:19 PM

  68. Re 47 Timo

    The Sun et al paper links the WV/cloud bias to ENSO rather than global warming. This is in their paper:

    //”These results suggest that at least in the models, the feedbacks of water vapor and clouds estimated from the ENSO cycle cannot be used as harbingers for the feedbacks of water vapor and clouds during global warming.”//

    Of course, as further evidence of “bias” (RC’s, of course), this didn’t stop Roger Pielke Sr. from declaring quite confidently that

    //”This study indicates that the IPCC models are overpredicting global warming in response to positive radiative forcing.”//

    Silliness.

    Comment by Chris Colose — 3 Jun 2008 @ 8:32 PM

  69. Re 59 – Gavin, I just read Roger’s response. He is really an annoying prima donna. “Dance with me, or you are rejecting the art of dance.” He does not understand that he does not understand. To use the words of a recent Secretary of Defense, he seems to be plagued by unknown unknowns.

    Comment by Ron Taylor — 3 Jun 2008 @ 9:22 PM

  70. Re. 33: Hank Roberts wrote: “The dry dock, with 15 people on board, was adrift in heavy seas for 2 days before being taken back under tow by the Admiral Makarov on Oct 16.”

    “and had trouble doing the job” (but was this in the NW passage itself or outside of it?) Is this your opinion or a fact?

    Note that it might not have been the NW passage itself that was so bad… the “adrift event” took place off Nova Scotia, which I believe is not part of the infamous Northwest Passage, and the Cabot Strait “is part of an important shipping route”:

    “Cabot Strait, channel connecting the Gulf of St. Lawrence with the Atlantic Ocean, between Cape Breton Island and the island of Newfoundland, southeastern Canada. The channel is 110 km (70 mi) wide. It is part of an important shipping route. Ferries cross Cabot Strait, between Cape Breton Island and the island of Newfoundland. The channel is named after the Italian navigator John Cabot.”

    http://encarta.msn.com/encyclopedia_761564151/Cabot_Strait.html

    As Arch Stanton found:

    “This use of the Passage to avoid storms in the open ocean demonstrated its advantage for international shipping should the ice be reduced. The fact that the dry dock was then almost lost in a storm off Newfoundland seemed to confirm the benefits of sheltered waters of the Passage route.”

    http://www.navyleague.ca/eng/ma/papers/huebert_e.pdf

    So Hank, I wonder if the Northwest Passage itself is really so difficult to navigate…or is it instead the usual shipping lanes that are so hard to navigate in comparison?

    In other words, can you now just crank up an icebreaker and go through the NW passage, saving ~5000 km (~3000 miles) while in a “calm NW passage”?

    Comment by Richard Ordway — 3 Jun 2008 @ 11:18 PM

  71. RE: #66

    Hank, thanks for the links. Absolutely fascinating. Do you think that the enormous amount of maritime activity, with modern ships bigger than most WWII battleships, could be having an effect on the oceans or some sections of the oceans? The authors of these articles seem to think so.

    The propellers of these ships really churn up the sea, and essentially all of their heat energy is released into the ocean environment.

    Google “P&O Ventura”. This ship is just huge and boggles the mind.

    Unlike naval vessels, merchant ships usually travel in well-defined sea lanes. Can the satellites see their effects? Know any links?

    Comment by Harold Pierce Jr — 4 Jun 2008 @ 1:39 AM

  72. Gavin

    Have you seen this sentence at the end of the Thompson et al. article?

    “However, compensation for a different potential source of bias in SST data in the past decade, the transition from ship- to buoy-derived SSTs, might increase the century-long trends by raising recent SSTs as much as 0.1 °C, as buoy-derived SSTs are biased cool relative to ship measurements.”

    [Response: Yes. I mentioned it above. This would imply that the last few years in the SST record are biased a little low - but as I said also, it's probably best to wait and see how this plays out when all the new corrections are made. - gavin]

    Comment by Pascal — 4 Jun 2008 @ 2:48 AM

  73. This bunch of sites promoting the naval war climate effect is probably the most bizarre stuff you can find on the entire web.
    Anyway – I used it as an excuse to call my grandfather, who is well into his nineties and spent WWII on a German submarine as an engineer. He said they had very sophisticated temperature measuring equipment on these ships, but of course they never used it to record temperature as an absolute; it was for calibrating various sonar sensors on the weapons and the ship itself. He also mentioned that they kept detailed records of the exact depth of temperature layers. Furthermore, there apparently were separate measurements of engine inlet and ballast water temperature used for fine-tuning the ship’s engine performance and diving behaviour, but he doubted they would be much use in research due to the lack of absolute calibration and lack of records. When I told him why I wanted to know all this, he answered that there should be nobody better positioned to know the exact water temperatures at any depth in any part of the oceans than the post-war US and Russian sub fleets. Is that true, and if so, are these records available for research and already in use for climate science? Considering the degree of sophistication on modern nuclear subs, they should be the most reliable and most traveled water thermometers out there – they’re certainly the most expensive.

    Comment by Henning — 4 Jun 2008 @ 3:47 AM

  74. Re 47 Timo

    maybe you are aware that cloud types, properties, distribution, thickness, etc. in the tropics (which is what the Sun paper is all about, see title) are different from those over the rest of the planet and might react differently to global warming. ENSO forcing is not a very good analogue for global warming where clouds are concerned (as the authors assert).

    Comment by Urs Neu — 4 Jun 2008 @ 4:13 AM

  75. Harold Pierce writes:

    Do you think that the enormous amount of maritime activity, with modern ships bigger than most WWII battleships, could be having an effect on the oceans or some sections of the oceans? The authors of these articles seem to think so.

    The propellers of these ships really churn up the sea, and essentially all of their heat energy is released into the ocean environment.

    Do the math, Harold. The amount released is trivial compared to the ocean’s heat content, or even the upper ocean’s heat content.

    Comment by Barton Paul Levenson — 4 Jun 2008 @ 6:11 AM

  76. Roger, in your post you complain that RC has not commented on a specific paper.

    Since it is not possible to comment on every published paper, this complaint is strange.

    However, some short comments on your post http://climatesci.org/2008/01/30/a-serious-problem-with-the-use-of-a-global-average-surface-temperature-anomaly-to-diagnose-global-warming-part-ii/ for the readers here (comments on your site are closed), since it concerns my former research field:

    The post claims a significant bias in global temperature trends due to a warm bias in nocturnal minimum temperature trends, caused by the stabilisation (temperature increasing with height) of the nocturnal boundary layer during calm nights.
    It presents a back-of-the-envelope estimate of the possible influence of this bias since 1990.

    This estimation has serious problems, e.g.:
    1) The temperature trends presented in the IPCC report are based on monthly averages, which for many stations are calculated from continuous measurements, and not, as assumed, only from daily maximum and minimum temperatures, especially since 1990. Thus the influence of the daily minimum temperature is smaller than assumed.

    2) The discussed problem (the formation of a nocturnal boundary layer) occurs over mid-latitudes only for a number of nights during summer, i.e. for perhaps 30-60 days a year. For stations at high latitudes it will be less; for those in the subtropics, more. This alone reduces the effect by a factor of 5-10.

    3) The assumption that this effect will be stronger in terrain with more relief is strange. In the discussed situations, relief causes winds, which reduce the effect; the effect is strongest over plains. Thus it concerns only part of the stations, which further reduces the effect.

    In addition, there are more fundamental questions (e.g. the trends are about surface temperatures, so why worry about different trends at other heights?).

    Sorry that I also don’t have the time to comment on the paper in more depth.

    Comment by Urs Neu — 4 Jun 2008 @ 6:15 AM

  77. Harold Pierce #71:
    http://climateboy.blogspot.com/2007/10/shipping-lanes.html
    and links therein

    Comment by Martin Vermeer — 4 Jun 2008 @ 6:21 AM

  78. There must be a description somewhere of how e.g. GISS makes the global land & ocean temperature averages; then it shouldn’t be a big deal to estimate how these bucket adjustments change the temperature series. GISS is so often referred to here at RC and by the IPCC, which relies only on peer-reviewed articles, that there must be a good description of how the SST averages are calculated, and of which stations are used, how station moves or changes are handled, how UHI effects are calculated, and so on for the land stations. Any help?

    [Response: http://data.giss.nasa.gov/gistemp though the ocean temperatures are a blend of the HadISST product and the satellites from 1982 onwards (Reynolds and Smith). - gavin]

    Comment by andy — 4 Jun 2008 @ 6:57 AM

  79. #73
    *OT apologies*
    what a wonderful response from your grandfather – so acute and clear. Sounds like he is absolutely on the ball, plus I would take his word (“is it true?”) over many others’!!

    Comment by Alan K — 4 Jun 2008 @ 7:55 AM

  80. > 73, Henning’s grandfather on submarine data

    He’s right. Part of the US data set for the Arctic was declassified at the behest of then Senator Al Gore; the Navy has a large climate modeling program and publishes some work steadily. E.g.:

    … The area first declassified is called the “Gore box” ….
    http://www.realclimate.org/index.php/archives/2006/03/climate-
    http://www.realclimate.org/index.php/archives/2006/05/more-on-the-arctic/

    … bear in mind that Maslowski has access to data that none of the other modelers have (the stuff outside the “Gore box”). …
    fergusbrown.wordpress.com/2007/12/12/did-someone-stir-the-martini/

    Comment by Hank Roberts — 4 Jun 2008 @ 9:18 AM

  81. As a lay person (albeit with a science degree) I find it interesting that the last 7 posts on this site have been disputing claims by climate change skeptics, or data/studies that may or may not support their case. If the skeptics and their arguments are so off-the-wall, why all the focus? “Doth the lady protest too much???”

    Comment by Patrick — 4 Jun 2008 @ 9:23 AM

  82. When discussing sea surface temperatures please see the following new study about sea level rise:

    Berge-Nguyen, M., A. Cazenave, A. Lombard, W. Llovel, J. Viarre, and J.F. Cretaux. 2008. Reconstruction of past decades sea level using thermosteric sea level, tide gauge, satellite altimetry and ocean reanalysis data. Global and Planetary Change Vol. 62, No 1-2, pp. 1–13, May 2008

    They reconstructed approximately 80 mm of sea level rise over 1950–2003, i.e. a rise of 1.48 mm yr−1. This rise is below the IPCC estimate and shows no acceleration.

    And after 2003 sea levels are falling…

    Comment by Timo Hämeranta — 4 Jun 2008 @ 10:05 AM

  83. Re 81. Gee, Patrick, why don’t you take that “science degree” off the shelf where it’s gathering dust and put it to some use learning the science yourself?
    Perhaps you have been unaware of the well funded disinformation campaign waged by coal and oil interests to discredit good science and heap calumny on good scientists. Perhaps you are not outraged when ignorant food tubes distort science to support their own pet theories. Perhaps you don’t feel that civilization is under threat in a society where more people believe in angels than believe in evolution. On second thought, go ahead and put that science degree back on the shelf. You’ll never need it.

    Comment by Ray Ladbury — 4 Jun 2008 @ 10:31 AM

  84. Urs – Thank you for taking the time to comment on the Climate Science weblog on the warm bias [http://climatesci.org/2008/01/30/a-serious-problem-with-the-use-of-a-global-average-surface-temperature-anomaly-to-diagnose-global-warming-part-ii/]. Here are the answers to your very good questions:

    1. The monthly mean is obtained as the average of each day’s maximum and minimum temperature, so the warm bias is not reduced by averaging.

    2. The nocturnal boundary layer occurs at all latitudes, of course, and at higher latitudes persists 24 hours a day in the winter. At these high latitudes in the winter, where significant warming has been reported, the warm bias results in an overstatement of the warming.

    3. In complex terrain, the lower elevations (closed valleys, depressions, etc.) have cool air pooling when the synoptic wind and large-scale pressure gradient are weak. This results in the stable boundary layer being a common feature, particularly in the winter at middle and high latitudes (e.g. the Grand Valley of Colorado).

    The result of this warm bias is a significant overstating of the surface temperature warming.

    Comment by Roger A. Pielke Sr. — 4 Jun 2008 @ 10:57 AM

  85. Re 83. Ray,

    instead of your political comments on scientists, I see different scientific views on all details of climatology, worth debating with scientific courtesy.

    In general, I see an enormous amount of neutral, ‘mainstream’, alternative, critical and sceptical studies, normally funded and peer-reviewed.

    Media and blogosphere are sui generis.

    Comment by Timo Hämeranta — 4 Jun 2008 @ 11:18 AM

  86. “squishy science”

    Repeating what Gavin wrote: “Every time there is a similar rush to judgment that is subsequently shown to be based on nothing, it still adds to the vast array of similar ‘evidence’ that keeps getting trotted out by the ill-informed.” …

    Repeating what longtime CBS WCCO-TV meteorologist Mike Fairbourne said recently: in the 1970s “we were screaming about global cooling. It makes me nervous when we pin a few warm years on squishy science.”

    Fairbourne said he has talked “to a number of meteorologists who have similar opinions” as his, adding that he is concerned about “the extremism that is attached to the global warming.”

    I think we need to be concerned about where meteorologists, and the many people who listen to them, are getting their education about climate change and global warming. NOAA’s NWS meteorologists have made light of and poked fun at global warming for more than a decade, and counting.

    http://www.startribune.com/nation/19095579.html?page=1&c=y

    Comment by pat n — 4 Jun 2008 @ 11:19 AM

  87. Re # 81 ” If the skeptics and their arguments are so off-the-wall why all the focus.”

    Well, people tend to argue with those with whom they disagree – if there is no disagreement, there is little to argue about (unless you want to nitpick). Now, if you are asking, Why not ignore the AGW skeptics/deniers? it is simple: They (some of them, at least) wield influence far in excess of their scientific credibility on the issue, and some of them are hell-bent on misleading the public, as Ray has pointed out.

    Comment by Chuck Booth — 4 Jun 2008 @ 11:39 AM

  88. #73

    Henning,

    I would like to meet your grandfather and swap a sea story or two. He is a rare survivor of some of the harshest naval warfare on both sides in history. His memory sounds right on the money to me. I spent 15 years as a Cold Warrior, slinking around in much greater comfort on US nuclear submarines and then another 15 training submariners, maintaining the ships and planning their operations.

    I cannot speak for the Soviet submarine data gathering practices but the US submariners logged everything. Strategic missile submarines put together extensive data packages after each approximately 60 day patrol. These included almost continuous comparisons of inertial velocity and electromagnetic log velocity. This gives a continuous measure of ocean current (set and drift). Daily XBT traces were taken which provided sound velocity profiles down to 1000 feet or more. At least daily traces of sound velocity from operating depths (a few hundred feet) to periscope depth were submitted. Continuous, highly accurate bottom contour traces were recorded. Sea water injection temperatures were logged hourly. All of this was coupled to very accurate continuous inertial navigation based positions.

    Polaris patrols started in 1960 with much the same data gathering that, I think, continues to this day on Trident submarines. Well over 3000 patrols have been conducted. A patrol of 60 days would have between 9000 and 10000 miles of continuous track over chunks of the Norwegian Sea, North Atlantic or Mediterranean about the size of Texas. Similar patrols were conducted in the Pacific from 1964 on. Each data package was submitted for analysis to the Applied Physics Laboratory at Johns Hopkins University. The bottom line for the analysis was whether or not we had maintained continuous target coverage and the ability to launch within national command authority guidelines. Later the data packages were extended to cover more sonar system related data and used to develop better sonar and also confirm that we were not being followed by the “other guys”.

    The data reduction effort would be enormous. I think the effort could provide good temperature data over an almost 50 year period and down to depths of at least 100 meters. The ocean current data is unique and should be a valuable input to ocean modeling. Best of all for me would be knowing we contributed to something beyond mutually assured destruction.

    Declassifying this data will require strong political leadership—at least as strong as provided by Mr. Gore in breaking loose the Arctic ice data. I can think of no one better to take on this task than my own Representative Jay Inslee (D-Washington). He already has a stellar record on energy and climate.

    Comment by Paul Middents — 4 Jun 2008 @ 12:04 PM

  89. Timo,
    Different scientific views? Exactly where in the peer-reviewed literature is there any contention that CO2 sensitivity is significantly less than 3 degrees per doubling (other than, say, the ill-fated attempt of a certain scientist from BNL)?

    Yes, there are areas where uncertainty remains. The role of CO2 is not among these. If you contend otherwise, produce the peer-reviewed literature backing your position.

    Comment by Ray Ladbury — 4 Jun 2008 @ 1:14 PM

  90. Re 87 Chuck,

    Science is not about credibility, but facts. And since climatology is mostly probabilities, there’s always room to debate.

    Everyone, whether neutral, ‘mainstream’, alternative, critical or sceptical, who proclaims certainty only proves poor knowledge, false confidence and a lack of scientific understanding (& behavior).

    Comment by Timo Hämeranta — 4 Jun 2008 @ 1:24 PM

  91. #82, Timo, I can’t find that paper, but here are some related charts that show sea level rises overall, but thermal contraction since 2003, perhaps that’s what you were referring to?

    http://www.aviso.oceanobs.com/en/news/ocean-indicators/mean-sea-level/science-questions/index.html

    Comment by Eric (skeptic) — 4 Jun 2008 @ 1:33 PM

  92. Yesterday’s Science Times section of the NY Times had a brief piece on this topic. http://www.nytimes.com/2008/06/03/science/03obtemp.html?ref=science

    Also, pack up all your cares and woes if you’re concerned about the future of renewables, especially solar power: the same section contains an article in which Dr. Ray Kurzweil says in part:
    ” Worried about greenhouse gas emissions? Have faith. Solar power may look terribly uneconomical at the moment, but with the exponential progress being made in nanoengineering, Dr. Kurzweil calculates that it’ll be cost-competitive with fossil fuels in just five years, and that within 20 years all our energy will come from clean sources.”
    http://www.nytimes.com/2008/06/03/science/03tier.html?_r=1&ref=science&oref=slogin

    Comment by Lawrence Brown — 4 Jun 2008 @ 2:01 PM

  93. Re 89. Ray,

    about climate sensitivity please see e.g.

    1. Kiehl, Jeffrey T., and Christine A. Shields, 2007. Inter-model climate sensitivity. NCAR Climate Change Prediction Program study 2007, online http://www.cgd.ucar.edu/ccr/shields/posters/ccppsensitivity.pdf

    2. Kiehl, Jeffrey T., 2007. Twentieth century climate model response and climate sensitivity. Geophys. Res. Lett., 34, L22710, doi:10.1029/2007GL031383, November 28, 2007

    Abstract

    Climate forcing and climate sensitivity are two key factors in understanding Earth’s climate. There is considerable interest in decreasing our uncertainty in climate sensitivity. This study explores the role of these two factors in climate simulations of the 20th century. It is found that the total anthropogenic forcing for a wide range of climate models differs by a factor of two and that the total forcing is inversely correlated to climate sensitivity. Much of the uncertainty in total anthropogenic forcing derives from a threefold range of uncertainty in the aerosol forcing used (TH = tuned) in the simulations.

    [edit - FUD removed]

    5. Seiffert, Rita, and Jin-Song von Storch, 2008. Impact of atmospheric small-scale fluctuations on climate sensitivity. Geophys. Res. Lett., 35, L10704, doi:10.1029/2008GL033483, May 21, 2008

    Abstract

    Climate change scenarios are based on numerical models with finite spatial and temporal resolutions. The impact of unresolved processes is parameterized without taking the variability induced by subscale processes into account. This drawback could lead to an over-/underestimation of the climate sensitivity. The aim of this study is to investigate the impact of small-scale atmospheric fluctuations on the modeled climate sensitivity to increased CO2 concentration. Using a complex coupled atmosphere-ocean general circulation model (ECHAM5/MPI-OM) climate response experiments with enhanced small-scale fluctuations are performed. Our results show that the strength of the global warming due to a CO2 doubling depends on the representation of small-scale fluctuations. Reducing the horizontal diffusion by a factor of 3 leads to an increase of the equilibrium climate sensitivity by 13%. If white noise is added to the small scales, the climate sensitivity tends to weaken. The largest changes in responses occur in the upper troposphere.

    [edit - irrelevant ref removed]

    Comment by Timo Hämeranta — 4 Jun 2008 @ 2:19 PM

  94. Not trying to hijack this thread:

    “Our investigation found that during the fall of 2004 through early 2006, the NASA Headquarters Office of Public Affairs managed the topic of climate change in a manner that reduced, marginalized, or mischaracterized climate change science made available to the general public through those particular media over which the Office of Public Affairs had control (i.e., news releases and media access).

    NASA internal review June 2008

    “The report found credence in allegations that National Public Radio was denied access to top global warming scientist James Hansen. It also found evidence that NASA headquarters press officials canceled a press conference on a mission monitoring ozone pollution and global warming because it was too close to the 2004 presidential election.”

    http://www.hq.nasa.gov/office/oig/hq/audits/reports/FY08/IG-08-017.pdf

    Comment by Richard Ordway — 4 Jun 2008 @ 2:20 PM

  95. #86 Warm northern re-greetings, Pat. Being of Scottish ancestry, I’d rather go down fighting the good fight than sit and watch the show go bad. Complacency rules the world; even people in key science positions follow the business-as-usual flow, but that doesn’t mean we all have to agree to do nothing. Ya, TV met guys have for the most part been horrendous on the subject for the longest time; they can’t distinguish between climate and weather forecasting. We’ve got a CBC weather woman who isn’t so bad; try watching CBC National news if you have access.

    #84 Roger, nice arguments, but they are trivial at best, of no real consequence aside from flaunting semantics. Why don’t you do a weighted temperature of your Colorado upper-air radiosondes over the last 3 months and see if there is a noticeable variation? Just wondering if you have a bit of access, and a bit of curiosity. You may be surprised by what you find.

    Comment by Wayne Davidson — 4 Jun 2008 @ 4:19 PM

  96. Thanks so much for the fascinating Maritime Archeo-Climatology.

    This triggers a thought: there must be a huge amount of data in airline pilot reports. I know that every aircraft has a high-quality thermometer, or even two or more, and at any moment there are thousands of aircraft in flight globally. Sometimes pilots report weather info. With aircraft transponders reporting other data, shouldn’t all aircraft report temperature too? Seems like an easy way to collect data, and there might be lots of data to mine.

    Comment by Richard Pauli — 4 Jun 2008 @ 7:11 PM

  97. > Aircraft

    Sounds like special instruments would be preferable, avoiding people blogging about how the Boeing 737 thermometers were known biased and the Airbus temperature gauges were placed downwind of the engines….

    But it’s being tried:

    “… Moffett Field isn’t generally open to civilian aviation. To get access, Google’s founders, Larry Page and Sergey Brin, along with CEO Eric Schmidt, agreed to pay $1.3 million annually to NASA through a holding company they control and to carry scientific instruments aboard their aircraft….”
    http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2007/11/15/BU85SK0QM.DTL

    I imagine the radio call.

    “Hello, Sergei? Moffett here. We’re diverting your Hawaii vacation flight by a couple hundred miles to get better data on that hurricane offshore of Baja before it hits San Diego. Yes, hurricane. No, you’ll be fine. Fasten your seat belts, this may be a bit bumpy …”

    Comment by Hank Roberts — 4 Jun 2008 @ 8:31 PM

  98. Or imagine hundreds of North Atlantic or polar flights, at very specific altitudes and routes, and altimeters are pretty accurate. How about 50 years of temperature data at 30,000 feet? Different planes and different hardware are similar to the bucket problem above. The data may be there for the past, and certainly will be for the future. Looks like a straightforward study.

    Comment by Richard Pauli — 4 Jun 2008 @ 10:21 PM

  99. Re # 90 Timo

    Most scientific “facts” are accepted as such based on probability – that’s the reason scientists perform, and report, statistical analysis of their data. And virtually all scientific conclusions are debatable (though some are so thoroughly confirmed that further debate is of little value). In this regard, climatology is no different from any other field of science. Surely you know all of this?

    As for credibility, in order to avoid having to repeat every scientific study myself, I have to rely on research conducted by others to advance my knowledge. So I have to mentally assign a degree of credibility to the papers I read, and to the authors of those papers. I’m much more willing to accept the findings of well-credentialed authors publishing in reputable, peer-reviewed journals. For the same reason, I am inclined to accept the information reported at RC based on the scientific reputations of the contributors. I know full well, however, that any given statement, or any given conclusion, by any of the RC contributors could be partially or completely wrong. But I have far more confidence in their statements than I do in those of many of the authors at other climate science websites. Roy Spencer, S. Fred Singer, Roger A. Pielke Sr., and others may well make valid contributions to the field of climatology, but I am inclined to view their statements with skepticism, at least until I see their views confirmed in the peer-reviewed literature. Indeed, S. Fred Singer’s assertion in an ABC news interview that it never occurs to him that he might be wrong on AGW convinced me that he merits no credibility as a scientist.

    If you have followed the AGW debate in the blogosphere and in the popular press, you are well aware that there is a lot of misinformation being published, some of it intentionally. Nationally (U.S.) syndicated columnists George Will and Charles Krauthammer both had op-ed essays published during the past week, in which they continued to advance their skepticism about AGW – they clearly are not getting their “facts” from the peer-reviewed climatology literature, and they are clearly not reading RC. So, the question is, where are they getting their information? Very likely it is the small number of AGW skeptics with science Ph.D.s whose views receive widespread attention in the media and blogosphere but somehow don’t make it into the peer-reviewed literature (or, what they publish in the peer-reviewed literature is not what gains them a following of AGW skeptics and deniers).

    It is those authors, whose influence in the blogosphere and mass media outweighs their scientific credentials (and publication record) in the field of climatology, who are the subject of many of the RC posts – the RC contributors are trying to set the record straight. Patrick’s question (#81) is either disingenuous or he is remarkably ill-informed.

    Comment by Chuck Booth — 4 Jun 2008 @ 10:22 PM

  100. Re: #82 on sea level rise.

    That paper doesn’t appear to be available (for free) online, but the abstract is. See the following link:

    http://www.science-direct.com/science?_ob=ArticleURL&_udi=B6VF0-4RC2NPR-1&_user=10&_coverDate=05%2F31%2F2008&_rdoc=1&_fmt=high&_orig=browse&_srch=doc-info(%23toc%235996%232008%23999379998%23687592%23FLA%23display%23Volume)&_cdi=5996&_sort=d&_docanchor=&_ct=13&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=2116d200a8f96f26d3a37d50865c3061

    The abstract states:

    “This study investigates past sea level reconstruction (over 1950–2003) based on tide gauge records and EOF spatial patterns from different 2-D fields. In a first step, we test the influence on the reconstructed signal of the 2-D fields temporal coverage. For that purpose we use global grids of thermosteric sea level data, available over 1950–2003. Different time spans (in the range 10–50 yr) for the EOF spatial patterns, and different geographical distributions for the 1-D thermosteric sea level time series (interpolated at specific locations from the 2-D grids), are successively used to reconstruct the 54-year long thermosteric sea level signal. In each case we compare the reconstructed trend map with the reference. The simulation indicates that the longer the time span covered by the spatial EOFs, the closer to the reference the reconstructed thermosteric sea level trends. In a second step, we apply the method to reconstructing 2-D sea level data over 1950–2003, combining sparse tide gauge records available since 1950, with EOF spatial patterns from different sources: (1) thermosteric sea level grids over 1955–2003, (2) sea level grids from Topex/Poseidon satellite altimetry over 1993–2003, and (3) dynamic height grids from the SODA reanalysis over 1958–2001. The reconstructed global mean sea level trend based on thermosteric EOFs (case 1) is significantly lower than the observed trend, while the interannual/decadal sea level fluctuations are well reproduced. Case 2 (Topex/Poseidon EOFs over 1993–2003) leads to a global mean sea level trend over the 54-year time interval very close to the observed trend. But the spatial trends of the reconstruction over 1950–2003 are significantly different from those obtained with thermosteric EOFs. Case 3 (SODA EOFs over 1958–2001) provides a reconstruction trend map over 1950–2003 that differs significantly from the previous two cases. We discuss possible causes for such differences. For the three cases, on the other hand, reconstructed spatial trends over 1993–2003 agree well with the regional sea level trends observed by Topex/Poseidon.”

    Sea level has clearly not been falling since 2003, as you can see at the following link:

    http://www.cmar.csiro.au/sealevel/sl_hist_last_15.html

    From the graph at the above site, it’s apparent that sea level fell a bit during the 2007 La Nina, but it’s clearly on the rise again. The trend is 3.3 mm/year, about double the 20th-century trend and a clear acceleration over the past 50 years.
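
    For what it’s worth, a trend figure like that 3.3 mm/year is just the slope of an ordinary least-squares fit to the altimetry series. A toy version in Python (synthetic numbers standing in for the real AVISO/CSIRO data, so purely illustrative):

        import numpy as np

        # Synthetic stand-in for an altimeter series: a 3.3 mm/yr rise
        # plus interannual noise (think of the 2007 La Nina dip).
        rng = np.random.default_rng(0)
        t = np.arange(1993, 2008, 0.1)                       # decimal years
        gmsl = 3.3 * (t - t[0]) + rng.normal(0, 5, t.size)   # mm

        slope, intercept = np.polyfit(t, gmsl, 1)
        print(f"fitted trend: {slope:.2f} mm/yr")            # recovers ~3.3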

    Comment by Ken Feldman — 4 Jun 2008 @ 11:30 PM

  101. Re 84 Roger, some more comments (then I’ll stop, since it’s not the topic here):
    - In wintertime when there is no wind, there is often fog, at least in many places. And in fog layers you will not find temperature inversions as in the nocturnal boundary layer.
    - In valleys you often have slope or valley winds during the night, which are still cool but mix a layer that is often deeper than 10 meters. And on coasts or near lakes you also have local wind systems. These situations are not comparable to open plains.
    - Stability in the nocturnal boundary layer is influenced on the one hand by wind speed, but also by clouds. Trends in stability are therefore influenced by trends in wind speed and clouds, and therefore by trends in the frequency distribution of distinct synoptic situations. Since there are considerable regional decadal variations in synoptic situations, a regional trend over a decade might be influenced by regional synoptic decadal variability. So before extrapolating over the globe, you have to assess such trends.
    - Stability is influenced by longwave downward radiation. Especially in calm clear-sky conditions, where you have a fully built nocturnal boundary layer, a decrease in stability is exactly what can be expected if the concentration of greenhouse gases rises (enhanced longwave downward radiation, partly offsetting the cooling at the surface). Thus, if it is not a regional synoptic effect, the decrease in stability and the ‘enhanced’ surface warming might well be the result of increased greenhouse gas concentrations and represent global warming. If greenhouse gases warm the air at 10 m height somewhat less than at 2 m, this does not help much, since humans, vegetation, soil, etc. are more affected by temperatures at 2 m than by those at 10 m.

    Comment by Urs Neu — 5 Jun 2008 @ 2:55 AM

  102. “Sounds like special instruments would be preferable, avoiding people blogging about how the Boeing 737 thermometers were known biased and the Airbus temperature gauges were placed downwind of the engines….”

    Just as long as they’re not using canvas buckets…

    Comment by Adam — 5 Jun 2008 @ 4:25 AM

  103. RE 96: In fact, commercial passenger aircraft have been used for collection of near-real-time meteorological data since the 1970s, first on an experimental basis and for research purposes, and now in a fully operational manner. The system development was rather slow, as finding transmission slots in radio channels and the related standardization took time. The issue of cost sharing also had to be addressed.

    Now many aircraft routinely report temperature and wind readings, and humidity measurements are being added to the programs. In many areas most of the in-situ upper-air measurement data is generated from these aircraft reports. Adding measurements of ozone and other atmospheric composition parameters is in the works.

    Pilots did provide weather reports on written de-briefing forms even earlier, but these seldom included detailed data such as temperature readings.

    Comment by Pekka Kostamo — 5 Jun 2008 @ 5:58 AM

  104. Timo Hämeranta says: “Science is not about credibility, but facts. When Climatology is mostly probabilities, there’s always room to debate.”

    Oh, please. Anybody who has ever actually done science knows that over time you get to know the contributors to a field–and guess what, everybody has a pet theory, a personal rivalry or an axe to grind. Those who establish themselves as credible are those who demonstrate willingness to put these biases aside and do good science. On the other hand, if a scientist goes to the media and misrepresents the state of the art, that is the surest way to destroy his or her credibility.
    This does not mean that every technical contribution they make will be dismissed or even that it will be more heavily scrutinized, but everything they do is viewed with their agenda in mind.
    Chuck has already pointed out the probabilistic nature of “scientific fact”. Along these lines I recommend the attached essay (as I’ve done many, many times):
    http://ptonline.aip.org/journals/doc/PHTOAD-ft/vol_60/iss_1/8_1.shtml

    As to the references you’ve cited–it is nearly impossible to come up with a generally applicable GCM with a sensitivity much less than about 2 degrees per doubling, and 3 degrees is strongly favored by a range of independent lines of evidence. Moreover, the evidence suggests that if 3 degrees is wrong, it’s more likely too low than too high.

    Comment by Ray Ladbury — 5 Jun 2008 @ 9:08 AM

  105. Pekka’s right about commercial aircraft reports:
    http://amdar.noaa.gov/docs/bams/
    February 2003 Bulletin,
    American Meteorological Society, 84, pp 203-216
    Automated Meteorological Reports from Commercial Aircraft

    ABSTRACT excerpt follows

    Commercial aircraft now provide over 130,000 meteorological observations per day, including temperature, winds, and in some cases humidity, vertical wind gust, or eddy dissipation rate (turbulence). The temperature and wind data are used in most operational numerical weather prediction models at the National Centers for Environmental Prediction (NCEP) and at other centers worldwide. At non-synoptic times, these data are often the primary source of upper air information over the U.S. Even at synoptic times, these data are critical in depicting the atmosphere along oceanic air routes.

    A web site (http://acweb.fsl.noaa.gov/) has been developed that gives selected users access to these data. Because the data are proprietary to the airlines, real-time access is restricted to entities such as government agencies and nonprofit research institutions ….

    Comment by Hank Roberts — 5 Jun 2008 @ 9:50 AM

  106. I realize this is OT and that climate is an average over 10 years or so, but I can’t help comparing the extent of ice and snow last year to this.

    http://igloo.atmos.uiuc.edu/cgi-bin/test/print.sh?fm=06&fd=04&fy=2007&sm=06&sd=04&sy=2008

    In summary, the current snow cover in the NH is significantly less than last year’s, which, as many on this thread know, produced record minimums of sea ice extent.

    So, with the albedo of the NH so low, it’s looking like another record summer for sea ice.

    Comment by Andrew — 5 Jun 2008 @ 11:13 AM

  107. Most of the ACWEB site is restricted, as it says in the AMDAR article linked above, but poking at it, some information is public; here’s an online course in the operation of the program, which may interest some.
    http://amdar.noaa.gov/2007course/

    Thanks again, Pekka, for pointing out this exists.
    More evidence that Google lacks a Wisdom button; I continue to rely on others to know what to look for!

    Comment by Hank Roberts — 5 Jun 2008 @ 12:41 PM

  108. Re #90 …

    However, it is obviously incorrect to say, as Timo H. did, that “Climatology is mostly probabilities”…

    Instead, when climate change is happening, climatology is, or should be, mostly about measuring and seeing trends in climate data which began years ago and are continuing and/or increasing.

    Re #95 – Hi Wayne.

    Comment by pat n — 5 Jun 2008 @ 2:05 PM

  109. “At non-synoptic times, these data are often the primary source of upper air information over the U.S. Even at synoptic times, these data are critical in depicting the atmosphere along oceanic air routes.”

    This information is very useful to the European models as it’s a very good source of upper air data over the Atlantic.

    Comment by Adam — 5 Jun 2008 @ 2:38 PM

  110. Hi Andrew,

    Re #106 “that climate is an average over 10 years or so.”

    Not to be picky, but I believe the IPCC is more comfortable with climate being defined by much longer-term averages than just ten years: about 30-50 years are needed at minimum to filter out noise and natural cycles:

    “The classical period is 30 years, as defined by the World Meteorological Organization (WMO).”

    “Climate in a narrow sense is usually defined as the ‘average weather’, or more rigorously, as the statistical description in terms of the mean and variability of relevant quantities over a period of time ranging from months to thousands or millions of years. The classical period is 30 years, as defined by the World Meteorological Organization (WMO). These quantities are most often surface variables such as temperature, precipitation, and wind. Climate in a wider sense is the state, including a statistical description, of the climate system.” (IPCC glossary)

    http://www.grida.no/climate/ipcc_tar/wg1/518.htm

    Although ten years versus 30-50 may not sound like much of a difference, some natural repeating cycles are thought to last more than ten years, such as the Pacific Decadal Oscillation (~22 years or so). So if the climate averaging period were only ten years, you wouldn’t be able to filter out noise like this, would you?
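
    A toy calculation makes the point. Assuming, purely for illustration, a 22-year cycle of 0.2 C amplitude riding on a steady 0.02 C/yr trend, a 30-year mean suppresses the cycle far better than a 10-year mean:

        import numpy as np

        years = np.arange(1900, 2008)
        trend = 0.02 * (years - years[0])               # underlying warming, deg C
        cycle = 0.2 * np.sin(2 * np.pi * years / 22.0)  # PDO-like oscillation
        series = trend + cycle

        def running_mean(x, n):
            return np.convolve(x, np.ones(n) / n, mode="valid")

        for n in (10, 30):
            # Leftover cycle after smoothing = smoothed series minus smoothed trend.
            leftover = running_mean(series, n) - running_mean(trend, n)
            print(f"{n}-yr mean: residual cycle amplitude ~ {np.abs(leftover).max():.3f} C")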

    Comment by Richard Ordway — 5 Jun 2008 @ 3:32 PM

  111. Wait a minute – could we not use the aircraft temperature readings to cross-check the radiosondes, and thence the satellite readings?
    Of course it would depend on the reliability and reproducibility of the aircraft method; it’s probably got some faults in it, but nevertheless, I wonder if it could help.

    Comment by guthrie — 5 Jun 2008 @ 4:56 PM

  112. Re #6 and 16
    My understanding is that humans produce about 4% of the CO2 released into the atmosphere annually. Is there a reference for the 27% that you stated? Thanks

    Comment by Gary — 5 Jun 2008 @ 7:59 PM

  113. Ray Ladbury: …it is nearly impossible to come up with a generally applicable GCM with a sensitivity much less than about 2 degrees per doubling…

    As someone familiar with computer models, this seems unlikely to me. Do you have a reference?

    Ray: Moreover, the evidence suggests that if 3 degrees is wrong, it’s more likely too low than too high.

    That seems in conflict with James Annan (to whom you have previously linked) in this paper:

    http://www.jamstec.go.jp/frcgc/research/d5/jdannan/prob.pdf

    Comment by Steve Reynolds — 5 Jun 2008 @ 8:17 PM

  114. re 112

    //”Re #6 and 16
    My understanding is that humans produce about 4% of the CO2 released into the atmosphere annually. Is there a reference for the 27% that you stated? Thanks”//

    I think you’re confusing emissions and concentrations… the latter is what matters for the greenhouse effect. CO2 has risen from 280 ppmv to 384 ppmv, so about 30% of the CO2 in the air is anthropogenic. The 4% number sounds about right for annual emissions (compared to all sources), but it is not terribly relevant, since CO2 is long-lived and the forcing is cumulative.

    Comment by Chris Colose — 5 Jun 2008 @ 9:14 PM

  115. Re 113

    In fact, he’s right. The range is most likely within 2 to 4.5 C (IPCC 2007), and the paleo record in particular argues against low sensitivity… but if anything we cannot discount the long tail that exists at the high end (i.e., low-probability, high-consequence). See Roe and Baker 2007 in Science.

    Comment by Chris Colose — 5 Jun 2008 @ 9:25 PM

  116. > aircraft temperature
    You may suspect they’re doing that already, given the existence of that large website dedicated to applying the data — but note the altitudes reached by sonde balloons: “The balloon carries the instrument package to an altitude of approximately 25 mi (27-37 km)….” http://www.aos.wisc.edu/~hopkins/wx-inst/wxi-raob.htm

    Comment by Hank Roberts — 5 Jun 2008 @ 10:03 PM

  117. Re Comment by Fred Moolten — 2 June 2008 @ 11:51 AM

    Like you I’ve noticed that there seems to be a concentration on the fall-off and an ignoring of the rise. You attribute the WWII temperature excursion to an ENSO event. Isn’t it rather odd for such a feature? If you look at the Hadley SST graphs without the major Folland and Parker correction (Folland and Parker [1995], Figure 3: annual anomalies from a 1951-80 average of uncorrected SST (solid) and corrected NMAT (dashed) for (a) the northern and (b) the southern hemisphere, 1856-1992; only collocated 5 deg. x 5 deg. SST and NMAT values were used), the strangeness of the temperature excursion stands out like the proverbial dog’s whatsits: larger than anything else on the graph, larger even than the huge 1998 surge which got us all so worried. There is a rise of about 0.4 deg in a year, sustained for several years.

    I haven’t read the paper with the latest correction, but I can’t understand why there’s a correction after the excursion and not before it, unless it’s because someone hasn’t noticed that the F&P adjustment hides a rise which is even more striking than the fall. If the procedure is simply to strip out temperature blips on the basis that they are merely ENSO events, then the chance to find an explanation for this interesting anomaly will be missed.

    BTW, nice blues at your site — thanks.

    JF

    Comment by Julian Flood — 6 Jun 2008 @ 4:03 AM

  118. Andrew writes:

    I realize this is OT and that climate is an average over 10 years or so, but can’t help to just compare the extent of ice and snow from last year to this.

    The World Meteorological Organization defines climate as mean regional or global weather over a period of 30 years or more.

    Comment by Barton Paul Levenson — 6 Jun 2008 @ 6:08 AM

  119. Re #112 Gary

    In relation to [CO2] in the atmosphere we need to be clear what we are talking about!

    The parameter of interest is the increase in atmospheric CO2 as time progresses.

    So, for example, for the roughly 2000 years before the mid-19th century, atmospheric CO2 levels were near 280 ppm [***]. That’s despite the massive release of CO2 into the atmosphere by plant decay in the Northern Hemisphere in late summer/fall; this release is reabsorbed by the terrestrial environment during plant regrowth in the Northern Hemisphere spring and early summer.

    This is overall pretty much net neutral with respect to year-on-year changes in atmospheric [CO2]. That’s rather clear from the atmospheric CO2 record of the roughly 2000 years before the industrial era [***].

    Pretty much all of the excess and increasing [CO2], over and above the plant growth/decay cycles, results from human emissions (digging up and burning fossil fuels sequestered underground for many millions of years). This has raised atmospheric [CO2] from 280 ppm to 385 ppm now (and growing).

    The proportion of the total atmospheric [CO2] resulting from human emissions is (385-280)/385 = 27%

    The percentage increase in atmospheric [CO2] due to human emissions is (385-280)/280 = 37.5% (in other words we have increased atmospheric CO2 levels by 37.5% over preindustrial levels).
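
    (The arithmetic spelled out in Python, for anyone who wants to check it:)

        preindustrial_ppm = 280.0
        current_ppm = 385.0
        excess = current_ppm - preindustrial_ppm

        print(f"share of today's CO2 that is anthropogenic: {excess / current_ppm:.1%}")        # ~27%
        print(f"increase over the preindustrial level:      {excess / preindustrial_ppm:.1%}")  # ~37.5%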

    Note that only about 40-50% of our emissions have ended up in the atmosphere; around 40-50% has been absorbed by the oceans, and a small amount by the terrestrial environment.

    [***]Meure CM et al (2006) Law Dome CO2, CH4 and N2O ice core records extended to 2000 years BP Geophys. Res. Lett. 33 L14810

    Comment by Chris — 6 Jun 2008 @ 6:10 AM

  120. Gary says:

    My understanding is that humans produce about 4% of the CO2 released into the atmosphere annually. Is there a reference for the 27% that you stated? Thanks

    It’s actually less than 1% that’s released every year. But the vast natural sources are balanced by vast natural sinks, which is why atmospheric CO2 was steady for thousands of years at around 280 parts per million by volume. Since about 1750, human technology has increased the amount coming in, so that CO2 is now at about 385 ppmv. (385 – 280) / 385 = 27% of the CO2 out there is of human technological origin.

    Spam filter note — I kept talking about the a m b i e n t level of CO2, and my posts kept getting flagged as spam because a m b i e n is a drug.

    Comment by Barton Paul Levenson — 6 Jun 2008 @ 6:15 AM

  121. Barton (120), I’m flabbergasted by the numbers and the granularity of CO2 absorption they suggest. I may also just be ignorant, but while I’m thinking: are you saying that the natural balance of CO2 in the atmosphere (source = sink) is so finely tuned and sensitive that its sink cannot handle any of the 0.000000062% [6.2×10^-8 %] of its weight added in an average year?? Also, are our physics and math that exact? [Or… is my arithmetic faulty?]

    Comment by Rod B — 6 Jun 2008 @ 1:15 PM

  122. Apologies, but there’s nowhere I could find (before I got bored… :-)) to ask this. Anyone know who Steven Goddard is?

    He’s put a couple of things up on The Register about how the NASA figures are fiddled so that NASA can continue the “lie” about AGW. The latest one is here:

    http://www.theregister.co.uk/2008/06/05/goddard_nasa_thermometer/

    There are plenty of holes, but getting them pointed out to him is impossible: too many posts go “disappearing” in moderation. The last one was about how the UAH and RSS figures showed the second-coldest southern hemisphere temperatures and were only a little bit away from the world average.

    However, I think he’s not reading correctly, or at the very least is assuming a meaning in the report that isn’t actually there.

    Ta.

    Comment by mark — 6 Jun 2008 @ 2:03 PM

  123. #113 Steve Reynolds:

    Ray Ladbury: …it is nearly impossible to come up with a generally applicable GCM with a sensitivity much less than about 2 degrees per doubling…

    As someone familiar with computer models, this seems unlikely to me. Do you have a reference?

    The IPCC AR4 WG1 report, page 630.

    “The current generation of GCMs covers a range of equilibrium climate sensitivity from 2.1C to 4.4C (with a mean value of 3.2C, see Table 8.2 and Box 10.2), which is quite similar to the TAR”.

    Table 8.2 tells that this is based on 19 model values.

    So, getting well below 2C seems to be a bit of an effort, probably requiring you to do something wrong :-)

    Comment by Martin Vermeer — 6 Jun 2008 @ 2:03 PM

  124. Rod B asks: “I may also just be ignorant, but while I’m thinking: are you saying that the natural balance of CO2 in the atmosphere (source = sink) is so finely tuned and sensitive that its sink cannot handle any of the 0.000000062% [6.2×10^-8 %] of its weight added in an average year??”

    Well, where would the 9 gigatons of extra trees and plants each year be growing? We can’t see them from the satellite.

    Maybe they’re growing underground…

    And is that ~10^-8 % based on the total mass of the atmosphere? How does having nitrogen in the air help carbon absorption?

    Comment by mark — 6 Jun 2008 @ 2:06 PM

  125. Rod B.,
    Look here:
    http://www.epa.gov/climatechange/emissions/co2_human.html

    and here:
    http://en.wikipedia.org/wiki/Carbon_cycle

    Then check your math.

    Comment by Ray Ladbury — 6 Jun 2008 @ 2:08 PM

  126. Off topic, but does anyone know if Wally Broecker’s “greenhouse scrubbing” CO2-absorbing plastic “tree” project is feasible, or have more information about it?

    He and others have proposed needing ~60 million of these plastic towers, each ~50 feet (~15 meters) high. The plastic absorbs CO2; the “full” (CO2-saturated) trees would then be turned into a solid, or compressed and stored underground.

    Broeker points out that humanity can produce 55 million automobiles every year so the scope is possible especially if spread out over a 30-40 year period.

    His calculations apparently work out to needing ~60 million trees to capture “all of the CO2 currently emitted”.

    The “trees” might be placed in “remote desert areas”. According to the non-peer-reviewed Scientific American, Arizona currently seems to be open to projects like this (or, to be more precise, to massive solar farms).

    This would of course only be a “bandaid” and would not solve the CO2 emissions problem… but it might help stop, slow down, or reduce the higher-emission-scenario tipping points, violent climate change, and other really nasty possibilities.

    I’m not for it or against it. I just don’t know enough about it.

    http://www.SciAm.com Jan 08, “A Solar Grand Plan” pp. 63-71

    Comment by Richard Ordway — 6 Jun 2008 @ 2:53 PM

  127. No, Rod. RC threads have been over this same question several times before; I am pretty sure you were in those threads.

    Recall that natural processes have absorbed about half the CO2 added by human activity, and the remainder is in the atmosphere. You can always find new publications on this easily using Google and a careful reading of the results. One, just as an example (you will find others):

    The Science of The Total Environment
    Volume 277, Issues 1-3, 28 September 2001, Pages 1-6
    doi:10.1016/S0048-9697(01)00828-2
    The biogeochemical cycling of carbon dioxide in the oceans — perturbations by man
    David W. Dyrssen

    Abstract (excerpt):

    “… Using data on the emission of CO2 and the atmospheric content in addition to the value recently presented by Takahashi et al. for the net sink for global oceans the following numbers have been calculated for the period 1990 to 2000, annual emission of CO2, 6.185 PgC (Petagram = 10^15 g). Annual atmospheric accumulation, 2.930 PgC. …”

    Click the ‘Cited by’ and “Related articles” links for more info as usual.

    Comment by Hank Roberts — 6 Jun 2008 @ 3:50 PM

  128. Re: #121: I think your math is way off. There are about 3000 gigatonnes of CO2 in the atmosphere. Anthropogenic emissions are about 30 gigatonnes a year.

    Comment by Erik Ramberg — 6 Jun 2008 @ 5:36 PM

  129. Rod B (121) — There are several decent web sites about the carbon cycle. Just now, in effect, the oceans (and other sinks) are absorbing as much as possible as fast as possible. Compared to recent human activities, that’s not fast enough to provide balance, i.e., equilibrium, yet.

    Wait quite a few centuries…

    Comment by David B. Benson — 6 Jun 2008 @ 6:24 PM

  130. The idea of the CO2 sink not absorbing everything is not that far-fetched. Deforestation, land clearing, agricultural use, rising sea temperatures: the sink is decreasing in size and capacity, when not downright becoming a source itself.

    Comment by Philippe Chantreau — 6 Jun 2008 @ 6:39 PM

  131. Re #119, 120, 121: Chris et al., thank you. I understand the basic idea and have read the IPCC section on the C13/C12 ratio of CO2, etc. Thank you for the Law Dome reference. I can’t figure out how you can attribute the entire rise in CO2 since 1900 to humans without showing NO change in non-human sources: ocean degassing, biosphere CO2 resorption and a host of other variables which could not be known for the earlier part of the 1900s and are probably not known today. The idea that biological systems are not malleable and cannot respond to changes in environment is not easily supported. Since it is known that plant growth is in general accelerated by increased levels of CO2, it seems possible that there are gigaton increases in the mass of trees, plants, algae, seaweed etc. on a global scale. You may be interested in this NASA note: http://science.nasa.gov/headlines/y2001/ast07sep_1.htm

    [Response: You clearly haven't understood what you have been reading. Ocean degassing would be incompatible with the C13 record, and in any event it can be directly shown by ocean monitoring that oceans are taking up CO2, not releasing it. And the land biosphere is indeed taking up CO2, not releasing it -- which together with the ocean offsets part of what we are putting into the atmosphere. So, if the oceans are a sink and the land biosphere is a sink, what can be causing an increase in atmospheric CO2 other than the additional (and well documented) amount put into the atmosphere by fossil fuel burning and deforestation? Where is all that carbon supposed to be going, hmm? --raypierre]

    Comment by Gary — 6 Jun 2008 @ 7:20 PM

  132. Martin,

    I don’t see how the existence of 19 models with sensitivity above 2C comes close to demonstrating that “it is nearly impossible to come up with a generally applicable GCM with a sensitivity much less than about 2 degrees”.

    Comment by Steve Reynolds — 6 Jun 2008 @ 7:31 PM

  133. Mark, 9 gigatonnes (I assume tonnes is correct) is (about) what % of the total flora?

    My percent figures are based on total atmosphere. I’ll have to think that through…

    Comment by Rod B — 6 Jun 2008 @ 11:08 PM

  134. Steve,

    think about it. Nineteen groups independently (more or less) try to build models of the climate system where the requirement is that they be as physically realistic as possible. Not a single one comes up with a model having a sensitivity under 2 degrees. To me that strongly suggests that physical realism and low sensitivity are incompatible.

    It’s a statistical argument (sampling the population of physically realistic models), but a pretty strong one I would say.
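
    To put an illustrative number on this sampling argument: if physically realistic models with sub-2C sensitivity were actually common, seeing none among 19 roughly independent attempts would be improbable. A toy binomial sketch (the independence of the groups is, of course, only approximate):

        # Toy binomial check: chance of 0 low-sensitivity models in 19 tries,
        # for assumed frequencies p of sub-2C physically realistic models.
        n_models = 19
        for p in (0.05, 0.15, 0.30):
            p_none = (1.0 - p) ** n_models
            print(f"if p = {p:.2f}: P(0 of {n_models}) = {p_none:.3f}")
        # Even a modest p = 0.15 makes the observed 0-of-19 a ~5% outcome,
        # suggesting low sensitivity is hard to reconcile with realism.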

    Comment by Martin Vermeer — 7 Jun 2008 @ 3:48 AM

  135. Totally off topic, but the posts on carbon sequestration are closed. On 30 May, The New York Times, along with a couple of other papers (including the FT, I think), announced that any hope for the often-advertised “clean coal” solution was, for all practical purposes, abandoned.

    Comment by Francois Marchand — 7 Jun 2008 @ 4:32 AM

  136. Re #118,

    I think it’s inappropriate for the World Meteorological Organization (and NOAA) to continue to use a definition of climate as “mean regional or global weather over a period of 30 years or more” … climate is changing rapidly.

    Comment by pat n — 7 Jun 2008 @ 6:37 AM

  137. Steve Reynolds, The short and flippant answer in the face of the fact that such a beast does not exist is–”You’re welcome to try.” The thing is that for a dynamical model, you have to fix (or at least constrain) the forcing with independent data, and the data simply aren’t compatible with a forcing that low. The thing is that CO2 forcing has some fairly unique properties in that it provides a consistent, long-term forcing and remains well mixed into the stratosphere.

    Comment by Ray Ladbury — 7 Jun 2008 @ 7:07 AM

  138. Erik, what’s your source for the 30 gigaton/year number from fossil fuels that you posted above?

    Below is a link to a recent MIT course outline page confirming petagram = gigaton, along with other useful information.

    Climate Physics and Chemistry Fall 2006 Ocean Carbon Cycle Lectures
    Carbon Cycle 1: Summary Outline
    http://ocw.mit.edu/NR/rdonlyres/Earth–Atmospheric–and-Planetary-Sciences/12-301Fall-2006/66D1CB48-09FA-4CF5-8D9D-9C41834EEA94/0/handout.pdf

    “… CO2 rise (at present) in atmosphere is about half of the rate of fossil fuel consumption – where does the rest go? Answer to follow: about half goes into the ocean and about half goes into organic matter storage (almost certainly on the continents)…”

    Comment by Hank Roberts — 7 Jun 2008 @ 10:35 AM

  139. Rod — I don’t know where you got your percentage figure. There are about 3 × 10^15 kg of CO2 in the air. We add about 3 × 10^13 kg per year, about 1%. A bit more than half of that gets absorbed, so net CO2 goes up by about 0.4% per year.
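
    A quick sketch of that arithmetic, for anyone who wants to reproduce it (the retained fraction of just under one half is approximate):

        # Reproducing the rough percentages above.
        co2_in_air = 3e15          # kg of CO2 in the atmosphere (approx.)
        annual_emissions = 3e13    # kg of CO2 added per year (approx.)
        retained_fraction = 0.45   # a bit more than half gets absorbed

        gross = annual_emissions / co2_in_air   # fraction added per year
        net = gross * retained_fraction         # fraction remaining per year
        print(f"gross addition: {gross:.2%}/yr, net rise: {net:.2%}/yr")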

    Comment by Barton Paul Levenson — 7 Jun 2008 @ 11:56 AM

  140. Just a relevant aside: it’s 103 °F in Norfolk VA right now, beating the old record from around 1900 by five degrees …

    Comment by Neil B. — 7 Jun 2008 @ 2:50 PM

  141. Ray: …for a dynamical model, you have to fix (or at least constrain) the forcing with independent data, and the data simply aren’t compatible with a forcing that low. The thing is that CO2 forcing has some fairly unique properties in that it provides a consistent, long-term forcing and remains well mixed into the stratosphere.

    For my hypothetical model, I will accept the standard CO2 forcing, but that by itself only gives about a 1C sensitivity. The rest is feedback, both positive and negative. I’m sure that it is possible, within the uncertainty in cloud and other effects, to produce a model with net sensitivity less than 2C.

    For Martin: I wonder how many times someone has done this and been told to fix their model so that it agrees with the expected sensitivity? Now there are paleoclimate data and volcanic recovery time data that do say 3C sensitivity may be more likely, but I have not seen anything from models that says less than 2C is nearly impossible.

    Comment by Steve Reynolds — 7 Jun 2008 @ 5:47 PM

  142. There is also a heat wave in Norway. Good for ice cream vendors…

    Comment by David B. Benson — 7 Jun 2008 @ 5:58 PM

  143. Steve, again, the feedback is part of the package–it doesn’t matter whether the warming is due to insolation or greenhouse gasses. Models operate under many constraints–there’s not a lot of wiggle room, and there’s less wiggle room for CO2 and other greenhouse gasses than many other causes. This is not a Chinese menu where you choose one from column A, one from column B … until you match the trend.

    So you are welcome to try, but I rather doubt that your model will match the data (all the data) very well.

    Comment by Ray Ladbury — 7 Jun 2008 @ 9:06 PM

  144. Re # 141 Steve Reynolds
    “I wonder how many times someone has done this and been told to fix their model so that it agrees with the expected sensitivity?”

    So, who would tell the modelers to fix their models?

    [Can one of the RC moderators tell me why this thread, unlike all the other RC threads, doesn't save my name and email address in the Leave a Reply fields, or whatever they are called?]

    Comment by Chuck Booth — 7 Jun 2008 @ 9:22 PM

  145. For Martin: I wonder how many times someone has done this and been told to fix their model so that it agrees with the expected sensitivity?

    But sensitivity is not a tunable parameter in them… as I understand it, there are tunable parameters, especially in the cloud/aerosol parametrization, which involves a difficult upscaling from cloud scale to model cell scale, but that tuning is done to produce realistic model behaviour for known test situations. Which brings us back to physical realism. Yes, as a side effect this may pin sensitivity to a 2C-plus value… my argument.

    Comment by Martin Vermeer — 8 Jun 2008 @ 1:43 AM

  146. Barton (139), yes: comparing emitted CO2 with atmospheric CO2 I get 0.9% emitted each year, which does seem to be more relevant (though I’m still mulling…) than comparing CO2 added with the total atmosphere.

    Comment by Rod B — 8 Jun 2008 @ 1:55 AM

  147. RE #137

    As a matter of fact, there is no uniform temporal and spatial distribution of CO2 in the air. The concentration of CO2 in the atmosphere as determined by analysis of local or real air at Mauna Loa or any other site is always reported for standard dry air, which is dry air at 273.2 K and 1 atm pressure comprised of only nitrogen, oxygen and the inert gases. Standard dry air presently has ca 385 ml of CO2/cu meter. Real air is a term for local air at the intake ports of air separation plants and in the HVAC industries.

    The relative ratios of the gases in the air are quite uniform around the world and are independent of elevation, temperature and humidity. This is the origin of the term “well-mixed gases”.

    The absolute or actual amount of the gases is site-specific and depends mostly on elevation, pressure, temperature, and humidity.

    For example, if standard dry air is heated to 30 deg C at 1 atm pressure, the amount of CO2 is about 350 ml/cu meter. If this air were to become saturated with water vapor (ca 5% for 100% rel humidity), the amount of CO2 is about 330 ml/cu meter. If standard dry air is cooled to -40 deg C at 1 atm pressure, the amount of CO2 is about 450 ml/cu meter. This is the origin of the polar amplification effect.

    For a mean global temp of 15 deg C and about 1% humidity, the amount of CO2 is 365 ml/cu meter.

    In textbooks the ideal gas law is usually given as PV = nRT, which can be rearranged to n/V = P/(RT), where n is the sum of the moles of gases in the mixture and R is the gas constant. Locally, the amount of CO2 and all the other gases will fluctuate with the weather. Water vapor will vary the most over wide ranges.
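
    That rearrangement is easy to check numerically. A sketch assuming a fixed 385 ppmv mixing ratio in dry air at 1 atm (the point being that the absolute amount per cubic metre scales as P/T):

        # Molar density from the ideal gas law, n/V = P/(R*T), and the
        # absolute CO2 content per m^3 at a fixed 385 ppmv mixing ratio.
        R = 8.314       # J/(mol K)
        P = 101325.0    # Pa (1 atm)
        ppmv = 385e-6   # CO2 mixing ratio in dry air

        for t_celsius in (-40.0, 0.0, 15.0, 30.0):
            T = 273.15 + t_celsius
            air_density = P / (R * T)              # mol of air per m^3
            co2_mol = ppmv * air_density           # mol of CO2 per m^3
            ml_at_stp = co2_mol * 22.414 * 1000.0  # equivalent ml at STP
            print(f"{t_celsius:6.1f} C: {co2_mol:.4f} mol/m^3 "
                  f"(~{ml_at_stp:.0f} ml/m^3 at STP)")
        # Reproduces the ~450, 385, 365 and ~350 ml/m^3 figures given
        # above, to rounding.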

    CO2 is fairly soluble in water, and exchange with clouds can affect local distributions.

    There are some images from AIRS that show that the distribution of CO2 is not constant, especially over the land.

    Comment by Harold Pierce Jr — 8 Jun 2008 @ 4:02 AM

  148. Steve Reynolds writes:

    I’m sure that it is possible within the uncertainty in cloud and other effects, to produce a model with net sensitivity less than 2C.

    So produce one. Prove your point.

    For Martin: I wonder how many times someone has done this and been told to fix their model so that it agrees with the expected sensitivity?

    What makes you think any such events have occurred at all?

    Comment by Barton Paul Levenson — 8 Jun 2008 @ 5:58 AM

  149. Steve Reynolds, I think that you are attaching way too much importance to the adjective “impossible” and too little to the adverb “nearly”. You will also note the modifiers of GCM–”generally applicable”. Think about the characteristics of CO2 forcing. It is a net positive and acts 24/7 and over a period of hundreds of years. It acts at all levels of the troposphere. It is not unreasonable to think that a forcing with such unique characteristics will leave a pretty unique signature. You could probably “mimic” such a signature with a combination of factors, but it would probably mean taking values other than the most probable. It’s a question of probabilities and confidence intervals.

    Comment by Ray Ladbury — 8 Jun 2008 @ 6:38 AM

  150. Harold Pierce, Jr., note that I did not say or imply a uniform concentration, but said well mixed. Uniformity is not necessary for the greenhouse effect to take place. Mixing high into the troposphere and above is.

    Comment by Ray Ladbury — 8 Jun 2008 @ 9:49 AM

  151. Ray, BPL, et al: A question that may be old hat and I just missed it: why would the natural CO2 sinks absorb a percentage of emissions versus the capability of absorbing a fixed amount? In other words if the sinks are absorbing 14 gigatonnes of CO2 in ~ 2004 at about 50% of emissions, why are they not capable of that absorption always — which would have absorbed ~all of the emissions through about 1970 and so show zero increase in CO2 atmospheric concentration from 1800 to ~1970?

    btw, I agree that CO2 emissions compared to the total weight of the atmosphere (my earlier post) has no probative value.

    Comment by Rod B — 8 Jun 2008 @ 2:15 PM

  152. Hank (127), looks like a good reference; may answer some of my questions and save you guys’ time! A quick clarification: do you know why the largest C/CO2 sink, carbonaceous rock, was omitted in the discussion? Because its time interval is so long and “non-interesting”? Or is there something else?

    Comment by Rod B — 8 Jun 2008 @ 2:27 PM

  153. RE: #150 Hello Ray! Go to:

    http://www-airs.jpl.nasa.gov/index.cfm?fuseaction=ShowNews_DynamicContent&NewsID=10

    Note the scale on the image: it is for standard dry air, but this is misleading. At 25,000 ft the air pressure is 6 psi, and the amount of CO2 would be 154 ml/cu meter without a temperature correction.

    Check out the video.

    If a model run starts with a value of 385 ppmv for CO2, the projected results will be a little bit too high. Models should start with ca 365 ppmv as calculated above. This may be the reason the mid-troposphere is a little cooler than projected.

    ATTN Gavin

    How are your newer climate models incorporating the CO2 distribution data from the AIRS satellite? This would seem really important for the polar regions. The guys over at the AIRS place imply that you modelers are looking into this.

    Comment by Harold Pierce Jr — 8 Jun 2008 @ 3:44 PM

  154. “Rod B Says:
    6 June 2008 at 11:08 PM

    Mark, 9 gigatonnes (I assume tonnes is correct) is (about) what % of the total flora?”

    Rod, that would be 9 gigatonnes EACH YEAR of EXTRA plant growth. Add to that the reduction of the rainforests. But even so, where are these nine billion tons of new growth over and above the natural rate hiding themselves?

    All you’ve managed to say is “well, it’s not a lot compared with the total, is it?”. Which doesn’t really answer the question, does it. It answers the question you’d like to have been asked.

    Comment by Mark — 8 Jun 2008 @ 5:45 PM

  155. Ray Ladbury: …but I rather doubt that your model will match the data (all the data) very well.

    Martin: …but that tuning is done to produce realistic model behaviour for known test situations.

    Finally back to the thread topic! The data is so uncertain (and changing) that a very broad range of models could claim to match it nearly equally well.

    And, while I know the time is too short to prove anything, the lack of warming this century (according to satellite data) does not match high sensitivity model predictions very well.

    Comment by Steve Reynolds — 8 Jun 2008 @ 6:15 PM

  156. Re 151–the question of why sinks absorb a percentage rather than a fixed amount. I think that it has to do with the fact that they are not becoming fully saturated. In general reaction rates are proportional to concentrations unless one reactant saturates.
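
    A toy rendering of that proportionality, assuming simple first-order uptake (sink flux proportional to the excess over a pre-industrial baseline; the rate constant and emission rate are purely illustrative):

        # Toy first-order sink: uptake scales with the CO2 excess above a
        # pre-industrial baseline, not with that year's emissions.
        baseline = 280.0      # ppm
        k = 0.01              # 1/yr, illustrative uptake rate constant
        emissions_ppm = 0.4   # ppm/yr added, early-industrial scale

        conc = baseline
        for year in range(1, 51):
            uptake = k * (conc - baseline)   # grows as the excess grows
            conc += emissions_ppm - uptake
            if year % 10 == 0:
                print(f"year {year:2d}: {conc:6.1f} ppm, "
                      f"uptake {uptake:.2f} ppm/yr")
        # Early on the excess is small, so the sink removes little and CO2
        # accumulates; as the excess grows, uptake climbs toward the
        # emission rate. That is why uptake tracks a fraction of emissions
        # rather than removing a fixed tonnage.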

    Harold, you seem to be implying that CO2 is 154 ppmv at 8 km–and that’s not what the link shows. Indeed, it appears quite consistent with 385 ppmv.

    Comment by Ray Ladbury — 8 Jun 2008 @ 7:38 PM

  157. Rod, time interval; rate of change of rate of change. Most biogeochemical cycles are far, far slower than the rate at which human fossil fuel use is increasing CO2 in the atmosphere.

    Natural sinks change over time; plankton may bloom in one season and take up more CO2; fish and whale populations may increase over a few decades and consume more plankton; erosion of exposed minerals changes slowly unless there are extreme rainfall/erosion events.

    Look — seriously, look, read, for example the study linked below — you can expose a lot of fresh material with events like the sudden massive erosion at the end of the PETM when climate changed extremely fast compared to ‘normal’ rates.

    This kind of change would wipe much of our coastal urban development off the map with severe flooding, if it happens again:

    Abrupt increase in seasonal extreme precipitation at the Paleocene-Eocene boundary

    B Schmitz, V Pujalte – Geology, 2007 – ic.ucsc.edu

    http://ic.ucsc.edu/~jzachos/eart120/readings/Schmitz_Puljate_07.pdf

    ——excerpt from abstract———

    … during the early, most intense phase of CO2 rise, normal, semiarid coastal plains with few river channels of 10–200 m width were abruptly replaced by a vast conglomeratic braid plain, covering at least 500 km2 and most likely more than 2000 km2. This braid plain is interpreted as the proximal parts of a megafan. Carbonate nodules in the megafan deposits attest to seasonally dry periods and together with megafan development imply a dramatic increase in seasonal rain and an increased intra-annual humidity gradient.

    The megafan formed over a few thousand years to 10 k.y. directly after the Paleocene-Eocene boundary. Only repeated severe floods and rainstorms could have contributed the water energy required to transport the enormous amounts of large boulders and gravel of the megafan during this short time span.

    The findings represent evidence for considerable changes in regional hydrological cycles following greenhouse gas emissions….”

    Comment by Hank Roberts — 9 Jun 2008 @ 12:12 AM

  158. Steve Reynolds #155:

    The data is so uncertain (and changing) that a very broad range of models could claim to match it nearly equally well.

    and yet, not one in nineteen manages to come up with sub-2C sensitivity… repeating your claim doesn’t make it any more true.

    And, while I know the time is too short to prove anything, the lack of warming this century (according to satellite data) …

    Steve, your caveat shows you’re learning, slowly. Now repeat after me: a trend hasn’t changed unless and until the change is statistically significant.

    You’re getting there.

    Comment by Martin Vermeer — 9 Jun 2008 @ 3:53 AM

  159. Steve Reynolds, First, we are talking about physical models, not statistical models. That is, you have to start with physical processes that you know are actually occurring. Then you use independent data to constrain those processes. Now it is true that any single data set may not provide a tight constraint. However, as different phenomena are analyzed and more data are added, the constraints for having the forcing explain all the data become increasingly tight. Because CO2 has properties as a forcing that are not shared by most others, it is a fairly easy one to constrain. Now it is true that each single dataset probably assumes some statistical model, but it would not be enough to show that any single model is wrong. You’ll have to find fault with most of them. In effect you would have to show that most of the correct models are either peaked much lower or at the very least were skewed leftward. If you just spread out the probability distribution to the left and right, you will wind up increasing the probability of sensitivity higher than 4.5 degrees, and that would make the risk for this portion of the curve even more dominant. Be very careful what you wish for with respect to the models. I for one am hoping they are right wrt the most probable value for sensitivity, as that at least preserves some chance of avoiding dangerous warming without draconian measures.

    Comment by Ray Ladbury — 9 Jun 2008 @ 7:45 AM

  160. Ray: Because CO2 has properties as a forcing that are not shared by most others, it is a fairly easy one to constrain.

    I have not been questioning CO2 forcing. Feedback effects are the issue, especially clouds. I have still not seen any evidence, in terms of model physical parameters, that the feedback effects are constrained tightly enough to rule out sensitivity values less than 2C.

    How do you know that modelers have not chosen parameters (affecting clouds for example) that determine feedback to produce a sensitivity consistent with paleoclimate and volcanic recovery time?

    Comment by Steve Reynolds — 9 Jun 2008 @ 9:10 AM

  161. Only repeated severe floods and rainstorms could have contributed the water energy required to transport the enormous amounts of large boulders and gravel of the megafan during this short time span.

    So perhaps this weekend’s rain is a mere prelude for what is yet to come. Great. Really looking forward to it.

    Comment by Thomas Lee Elifritz — 9 Jun 2008 @ 10:09 AM

  162. Steve, the feedbacks don’t depend on the forcing, since they are thermally activated. That knob is not unconstrained either.

    Comment by Ray Ladbury — 9 Jun 2008 @ 10:53 AM

  163. Mark (154), I never said 9 gigatonnes of flora was “not a lot compared with the total”. I’m just asking the question: 9 Gt is how much compared to what base? I have no idea. Thought you might.

    Comment by Rod B — 9 Jun 2008 @ 11:16 AM

  164. RE: #156 Hello Ray!

    The relative concentration is 385 ppmv on the mole fraction scale at STP for dry air. The absolute amount of CO2 is 156 ml/cu meter at this elevation, and the absolute concentration is 0.000 007 moles/liter.

    At sea level the absolute concentration of CO2 for dry air is 0.000 017 moles/liter. For air with 1% water vapor, the absolute concentration of water vapor is 0.000 45 moles/liter. The ratio of H2O molecules to CO2 molecules is about 27 to 1.

    Real air has this very important property: It always contains water vapor. With respect to the greenhouse effect, water vapor is auto-forcing and auto-feedback.

    Comment by Harold Pierce Jr — 9 Jun 2008 @ 1:01 PM

  165. Harold, A 15 micron photon traversing that same liter of air will encounter about a billion CO2 molecules within a radius of its wavelength. And it will continue to do so well into the upper troposphere.

    Comment by Ray Ladbury — 9 Jun 2008 @ 1:50 PM

  166. Ray: That knob [feedbacks] is not unconstrained either.

    While of course there are some constraints, how tight are they?

    What I’m looking for is a credible source that can answer the question: How much are the feedback effects in climate models constrained by known physics and high confidence real world data?

    Comment by Steve Reynolds — 9 Jun 2008 @ 1:54 PM

  167. Steve, the feedbacks for CO2 will be pretty much the same as any other forcing. That implies a fair amount of data, and fairly tight constraints. Perhaps Gavin or Raypierre could help us out here.

    Comment by Ray Ladbury — 9 Jun 2008 @ 2:23 PM

    “Rod B Says:
    9 June 2008 at 11:16 AM

    Mark (154), I never said 9 gigatonnes of flora was “not a lot compared with the total”. I’m just asking the question: 9 Gt is how much compared to what base? I have no idea. Thought you might.”

    Well, someone else did the calculation. About 3 kg per square metre.

    That would be each year, and for each square metre whether or not it could hold plants (though you may get some back on gently sloped hills).

    And, I think, they used a figure for “best Carbon content” of plant matter. There’s water in there too, y’know.

    And you still haven’t figured out why total airmass matters.

    Comment by Mark — 9 Jun 2008 @ 2:24 PM

  169. Steve Reynolds asks:

    “What I’m looking for is a credible source that can answer the question: How much are the feedback effects in climate models constrained by known physics and high confidence real world data?”

    Well, if the IPCC scientific papers aren’t considered a credible source, first we must find a credible source.

    Know of anyone?

    Now what we CAN infer is that if even the denialists haven’t trotted this point out, it’s likely solid. After all, they harped on for AGES about the hockey stick. So, lacking anything other than the lack of attack on this subject, we have a baseline conclusion of “the constraints are solid”.

    That shouldn’t rely on finding a “credible source”, either.

    Comment by Mark — 9 Jun 2008 @ 2:27 PM

  170. Steve Reynolds #160:

    How do you know that modelers have not chosen parameters (affecting clouds for example) that determine feedback to produce a sensitivity consistent with paleoclimate and volcanic recovery time?

    If there’s a modeller out there smart enough to pull that off, I’m sure Gavin would hire him :-)

    You see, again, sensitivity is not a tunable parameter. It doesn’t help if you think you know what the sensitivity should be from extraneous sources like palaeo and Pinatubo. You cannot do anything with that knowledge through the cloud parametrization. It would be like blindly poking a screwdriver into a clock and turning it, in the hope it will run on time.

    You can only improve cloud parametrization by — studying clouds. Real clouds, or clouds modelled physically in a high-resolution model called a CSRM (Cloud System Resolving Model). And then you test how well you’ve done — not on palaeo or Pinatubo, but e.g., on weather prediction! The physics is largely the same… and surprise surprise, as cloud modelling performance improves by this metric, out comes the right response to Pinatubo and palaeoclimate forcings.

    But it’s a painfully slow process. See http://www.sc.doe.gov/ober/berac/GlobalChange.pdf

    Comment by Martin Vermeer — 9 Jun 2008 @ 4:50 PM

  171. Steve Reynolds (166) — Some of what I believe you are after can be found in Ray Pierrehumbert’s

    http://geosci.uchicago.edu/~rtp1/ClimateBook/ClimateBook.html

    Comment by David B. Benson — 9 Jun 2008 @ 7:02 PM

  172. Re 131 and Raypierre response
    I am not an expert in this field, but I am able to review the CO2 information, and I still have a problem with the claim that all of the increase in the past 100 years is due to humans. The IPCC says that over 99% of CO2 is the C12 isotope, and the claim is that the change in the less-than-1% C13 isotope indicates that humans have caused the increase in CO2. They quote Francey, R.J., et al., 1995: Changes in oceanic and terrestrial carbon uptake since 1982, Nature, 373, 326–330, to show that you can distinguish between human and other causes of CO2. Looking at this article, and since seasonal and longer-term variations in CO2 are known to exist, any decrease in natural C12O2 output would cause an apparent increase in the C12/C13 ratio. In fact the article specifically notes “we observe a gradual decrease in 13C from 1982 to 1993, but with a pronounced flattening from 1988 to 1990” and “The flattening of the trend from 1988 to 1990 appears to involve the terrestrial carbon cycle, but we cannot yet ascribe firm causes.” This would indicate to me that natural causes for a change in C12/C13 cannot be ruled out. I must also question how accurately these changes, representing such a small part of the atmosphere, can be measured. Error bars anyone?

    Comment by Gary — 9 Jun 2008 @ 7:40 PM

  173. #52 Alastair McDonald:
    >You may wish to declare your Australia a global warming free zone but …

    He’s not the only one. Australia has one of the most ardent inactivist movements on the planet, supported by a powerful coal lobby and incompetent press.

    The oil price trend is kind of rough justice on all those people who thought climate change and peak oil were hoaxes and bought SUVs.

    #82 Timo Hämeranta:
    >When discussing sea surface temperatures please see the following new
    > study about sea level rise:
    [...]
    > They reconstructed approximately 80 mm sea level rise over 1950–2003,
    > i.e. a rise of 1.48 mm yr−1. This rise is below the estimate of the
    > IPCC and have no acceleration.
    >
    > And after 2003 sea levels are falling…

    Sorry, I can’t see where in the paper you cite they discuss levels after 2003. Also, it seems to me that the paper is about exploring alternative methodologies with some uncertainty as to whether they are yielding credible results, not about validating or overturning previous results.

    #90 Timo Hämeranta:
    > Everyone, whether neutral, ‘mainstream’, alternative, critical or sceptical,
    > who proclaims certainty, only proves poor knowledge, false confidence and
    > lack of scientific understanding (& behavior).

    Good that you point that out. The inactivist crew in general exhibit this flaw to the highest degree because arguing that you should do nothing in the face of a potentially extreme threat requires supreme confidence that you’re right. I have yet to see an article by Bob Carter (for example), who has the scientific background to know the difference between science and absolute certainty, admitting that there is a scintilla of probability that he could be wrong.

    Take for example the threat of the West Antarctic Ice Sheet collapsing in the next century. What probability of that would we tolerate before taking extreme measures to prevent climate change? 10%? 5%? 1%? 0.01%? You can argue that even with only a 1 in 1000 chance of it happening, the cost is so high, you’d pay the cost of prevention. Those who cannot accept on ideological grounds that humans are altering the climate in potentially harmful ways demand 100% certainty. You’re quite right to point out that that is not science (more like religion in my opinion).

    Unfortunately as Richard Feynman told us, you can’t fool nature…

    Comment by Philip Machanick — 9 Jun 2008 @ 8:55 PM

  174. RE #30 & the much worse condition of sampling.

    I’m thinking that if there is no bias in those sample errors you mentioned, and that they are “random,” then they might pretty much cancel each other out. So one would have to find some bias toward getting fairly consistently warmer or cooler samples than the actual SST being measured to have more serious problems. And even then if such a bias were found, it could be accounted for and corrections could be made.

    Comment by Lynn Vincentnathan — 9 Jun 2008 @ 10:10 PM

  175. Re Gary @172: “This would indicate to me that natural causes for a change in C12/C13 can not be ruled out.”

    When you find a natural source of the required size be sure to let us know. Until then it’s just another ‘what if’ straw to grasp at for those looking for a reason–any reason–to do nothing.

    Comment by Jim Eager — 9 Jun 2008 @ 11:02 PM

  176. Martin, your link does provide some useful info, but I think it makes my point rather than yours:

    “The behavior of clouds in a changed climate is often cited as the leading reason that climate models have widely varying sensitivity when they are given identical forcing such as that provided by a doubling of carbon-dioxide. [Cess et al. 1989 and Wielicki et al. 1996] One important way to reduce this uncertainty is to build models with the greatest possible simulation fidelity when compared to the observations of current climate. While the climate modeling community is making important and demonstrable progress towards this goal, the predictions of all types of models – including models such as “super-parameterization” – will remain uncertain at some level.”

    Ray, yes, I agree that some help from Gavin or Raypierre would be useful here.

    Comment by Steve Reynolds — 9 Jun 2008 @ 11:28 PM

  177. Gary #172,

    You must realize that, while the CO2 content of the atmosphere is steadily increasing, the atmospheric CO2 is rapidly cycled all the time through the biosphere and ocean surface waters. These fluxes are many times the secular rate, but they very nearly cancel long term. This is why you see short term variations in atmospheric CO2 (the seasonal “ripple” on the Keeling curve, e.g., and ENSO-related fluctuations), and also in the d13C. The long-term trend in both is robust.

    The technique of isotope ratio measurement is not a problem in spite of the small numbers. More of a challenge is proper sampling, i.e., choosing locations far from sources/sinks to get values representative for the whole atmosphere. Just as with ppmv determination.

    As raypierre already pointed out, you will have to explain where the known-released CO2 from fossil fuel burning, about twice what we see appear in the atmosphere, is going. It doesn’t do to postulate an unknown natural mechanism that magically kicks in in the 19th century just as we start burning fossil fuels (after having been stable for 8000 years or so), magically has the same quasi-exponential growth signature, and is magically just about the same size as the fossil-fuel (plus deforestation) CO2 that remains after half of it disappears into the oceans and biosphere, all while failing to supply that explanation.

    Comment by Martin Vermeer — 10 Jun 2008 @ 1:15 AM

  178. #173 Philip Machanick Says:
    “You can argue that even with only a 1 in 1000 chance of it happening, the cost is so high, you’d pay the cost of prevention.”

    That depends on the cost of prevention and how effective this prevention is. For example, let’s say your $200,000 home has a 1 in 1000 chance of burning down in 30 years. Would you pay for fire insurance if the policy was going to cost you $25,000 per year and only had a 1 in 100 chance of paying out due to fine print? Most people would simply live with the risk of being uninsured if those were the terms.

    Meaningful CO2 reductions are not going to happen as long as the world population is growing no matter what governments do. Even if there was some initial success at coercing people into a ‘carbon control regime’ you can bet that carbon limits would be tossed out the window at the first sign of a stalling economy. For that reason it makes no sense to pay the ‘cost of prevention’ if that cost involves restricting access to currently viable energy sources. Paying for R&D and hoping that a miracle occurs would likely be a worthwhile effort but it is not guaranteed to produce results either. Adaptation is the only plausible and cost effective strategy remaining.

    Comment by Raven — 10 Jun 2008 @ 3:26 AM

  179. Lynn #174:

    I’m thinking that if there is no bias in those sample errors you mentioned, and that they are “random,” then they might pretty much cancel each other out.

    The good news is that you don’t have to stop at “thinking”. There are ways to check and obtain a precise correction. The second and fourth figures in the article show how it is done: you need redundancy, i.e., the same, or nearly the same, thing measured in more than one way. Co-located air and bucket temperatures. Co-located engine inlet and bucket temperatures.

    This is also what fools statistically naive critics of the land temperature record: for climatology it is massively redundant. We know when a measurement is wrong and don’t have to rely on Anthony Watts’ pretty pics for circumstantial evidence.

    BTW I would happily believe that individual bucket readings are off by 2C or even more :-)
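
    A minimal simulation of how that redundancy pins down a bias (all numbers invented for illustration: noisy bucket readings with a cold bias, paired with co-located, unbiased inlet readings):

        # Co-located pairs reveal a systematic offset: individual readings
        # are noisy, but the mean of the differences converges on the bias.
        import random

        random.seed(0)
        true_bias = -0.3   # degC, buckets read cold (invented value)
        n = 1000           # number of co-located measurement pairs

        diffs = []
        for _ in range(n):
            sst = 15.0 + random.gauss(0.0, 2.0)                # true local SST
            bucket = sst + true_bias + random.gauss(0.0, 1.0)  # noisy, biased
            inlet = sst + random.gauss(0.0, 0.5)               # noisy, unbiased
            diffs.append(bucket - inlet)

        estimate = sum(diffs) / n
        print(f"estimated bias: {estimate:+.2f} C (true {true_bias:+.2f} C)")
        # Random errors of ~1 C per reading shrink as 1/sqrt(n) in the mean,
        # so crude individual measurements still pin down the bias well.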

    Comment by Martin Vermeer — 10 Jun 2008 @ 3:55 AM

  180. Re Raven 178:

    Providing a house insurance example with wildly inflated estimates of insurance costs and understated risk does not in any way justify your assertion that it is pointless to work hard at reducing emissions. If your house burns down then you can build or rent another one. Not so with the planet. And most climate scientists with any credibility appear to be convinced that if we don’t radically reduce emissions soon then very nasty impacts are more likely than not (rather than the one in a 1000 chance that you imply).

    Comment by Craig — 10 Jun 2008 @ 5:20 AM

  181. #172, Gary, over the long term, the bulk of the CO2 increase can be shown to be from fossil fuels (e.g. http://www.realclimate.org/index.php/archives/2004/12/how-do-we-know-that-recent-cosub2sub-increases-are-due-to-human-activities-updated/langswitch_lang/sw)

    However, that thread lacks quantitative analysis. As you point out, there are natural sources causing isotope ratio fluctuations in the short term and they could act in the long term as well. It would be nice to have a fully quantitative analysis of the fossil contribution to the past and current isotope ratio over various time scales.
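
    A first-pass version of that quantitative check is plain two-source mixing: add isotopically light fossil carbon to the pre-industrial atmosphere and see how far delta-13C falls. The sketch below deliberately ignores exchange with the ocean and biosphere (which buffers the real signal), and all values are approximate:

        # Two-source mixing estimate of the fossil-fuel dent in d13C.
        d13c_preind = -6.4    # permil, pre-industrial atmosphere (approx.)
        d13c_fossil = -28.0   # permil, typical fossil carbon (approx.)

        m_preind = 280.0 * 2.13           # GtC in pre-industrial atmosphere
        m_added = (385.0 - 280.0) * 2.13  # GtC of the modern excess

        d13c_now = (m_preind * d13c_preind + m_added * d13c_fossil) / (
            m_preind + m_added)
        print(f"no-exchange prediction: {d13c_now:.1f} permil")
        # Predicts ~-12 permil versus roughly -8 observed; the difference
        # is dilution by rapid exchange with ocean and biosphere carbon,
        # as discussed elsewhere in this thread.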

    Comment by Eric (skeptic) — 10 Jun 2008 @ 5:50 AM

  182. re #172, Gary

    you need to distinguish variations in the rates of change of the 13C/12C ratio in atmospheric CO2 and the absolute changes. The bottom line is that the oceans and terrestrial environment are taking up excess CO2 released by humans (they are nett sinks), and therefore they cannot also be nett sources, even if occasionally the rates of uptake of CO2 from the atmosphere may vary, e.g. due to biomass burning (such as that of 1994/5 and 1997/8 [*]) when the rate of CO2 uptake by the terrestrial environment may slow considerably (or potentially reverse). However the latter are temporary changes in the rate of CO2 exchange with the terrestrial environment since plant growth following biomass burning will recover somewhat.

    Now it’s very likely that variations in 13C/12C (and absolute atmospheric [CO2]) in the early part of the “industrial age” were due to land clearance which will both have rendered the terrestrial environment a nett source, and also contributed to the reduction in atmospheric CO2 13C/12C, since plants and fossil fuel are similarly 13C-depleted. But that’s also an anthropogenic source of both atmospheric CO2 and reduced 13C/12C. In fact there’s evidence that the terrestrial environment was nett neutral for long periods in the late 20th century (e.g. throughout the 1980’s [**]).

    But the bottom line is that if one adds CO2 to the atmosphere, then this is going to suppress natural release of CO2 from the oceans and the terrestrial environment, since the terrestrial environment and especially the oceans are linked by equilibria in which CO2 partitions according to atmospheric concentration. In other words enhanced atmospheric CO2 results in enhanced physical partitioning into the oceans (easily measured by changes in ocean pH), and enhanced uptake by plants and the soil. Notice that each of these is potentially saturable in various respects. The only ways one might get a positive feedback (higher CO2 resulting in enhanced natural CO2 release) would be if the CO2-induced greenhouse enhancement resulted in temperatures sufficiently high as to kill large amounts of biomass non-recoverably, or through the release of sequestered carbon in tundra (that sort of thing).

    But that’s not to say that natural processes don’t lead to variations in the rates of change of atmospheric [CO2] and 13C/12C. That (and non-natural contributions to the terrestrial CO2 “budget”) is what the Francey Nature paper refers to, I suspect.

    You asked about errors. The error in measuring the isotopic composition of carbon isotopes using mass spectrometry is small, and if one looks at graphs the error is normally encompassed within the size of the symbol. So for example (to cite yet another Francey paper! [***]) the authors determine the accuracy in measurement of 13C/12C to be around 0.015‰ (in other words, if the 13C/12C ratio is given as delta 13C and has a value of -7.83‰ in 1996, that’s -7.83 +/- 0.015‰). Any errors of any significance lie elsewhere (e.g. in the dating of gas in ice cores for historical measures of 13C/12C; or the possibilities for gas contamination and so on).

    [*]Langenfelds RL et al (2002) Interannual growth rate variations of atmospheric CO2 and its delta C-13, H-2, CH4, and CO between 1992 and 1999 linked to biomass burning, Global Biogeochemical Cycles 16 art # 1048.

    [**] Battle et al (2000) Global carbon sinks and their variability inferred from atmospheric O-2 and delta C-13, Science 287, 2467-2470.

    [***] Francey et al (1999) A 1000-year high precision record of delta C-13 in atmospheric CO2, Tellus Series B-Chemical And Physical Meteorology 51, 170-193

    Comment by Chris — 10 Jun 2008 @ 6:41 AM

  183. Steve, Note that the reference Martin supplied was advocating a way of calculating cloud feedbacks–not describing how they are done at present. Be that as it may, keep in mind that feedbacks affect all forcings equally. Thus if you change, for example, feedback due to clouds to preclude large temperature rises, this would seem to be contradicted by the fact that we have had large temperature rises in the paleoclimate.
    Ultimately, there’s a reason why climate models yield fairly consistent results–there’s just not that much wiggle room. Climate models aren’t telling us, qualitatively, anything we didn’t already know–they’re just making it possible to estimate how severe things could get. This is why I emphasize that if you are leery of draconian action motivated by panic, climate models are your best friends.

    Comment by Ray Ladbury — 10 Jun 2008 @ 7:56 AM

  184. Steve Reynolds asks:

    How much are the feedback effects in climate models constrained by known physics and high confidence real world data?

    Well, for an example, in 1964 Manabe and Strickler, in the first radiative-convective model, used a distribution of water vapor with height specified by absolute humidity. In a second model, Manabe and Wetherald 1967, they constrained relative humidity instead, using the Clausius-Clapeyron law to calculate how much vapor pressure was available at each level as the temperature changed. Research since then has shown that relative humidity does tend to stay about the same in vertical distribution (although not in every place from day to day). They improved the physics and got a better model. Note that they used nothing from climate statistical data; they simply improved the physics. This has been going on in model development since the 1960s.
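
    A sketch of the constrained-relative-humidity calculation, using one common empirical approximation (Magnus-type) for saturation vapor pressure; the exact coefficients vary by source, but the roughly 7% rise per degree is the physically important part:

        # Saturation vapor pressure (Magnus approximation) and the vapor
        # implied by holding relative humidity fixed as temperature rises.
        import math

        def e_sat(t_celsius):
            """Saturation vapor pressure in Pa (Magnus approximation)."""
            return 610.94 * math.exp(17.625 * t_celsius / (t_celsius + 243.04))

        rh = 0.7  # relative humidity held fixed (illustrative value)
        for t in (14.0, 15.0, 16.0):
            print(f"{t:.0f} C: vapor pressure = {rh * e_sat(t):.0f} Pa")
        # Each +1 C raises the available vapor by about 7%: the water vapor
        # feedback that fixed-RH models like Manabe and Wetherald's capture.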

    Comment by Barton Paul Levenson — 10 Jun 2008 @ 7:57 AM

  185. Raven, I don’t think anything you’ve said contradicts Philip’s point. The problem is that with the current state of knowledge, we cannot rule out truly catastrophic consequences–sufficiently catastrophic, that any effort that does not bankrupt us could be justified.
    To take your example–perhaps I might not pay for the insurance policy you posit, but I would invest heavily in protection equipment to preclude the possibility of myself and my family being caught in a burning house, even if the probability was, perhaps 1 in a hundred.

    Comment by Ray Ladbury — 10 Jun 2008 @ 8:13 AM

  186. Craig (#180):

    “And most climate scientists with any credibility appear to be convinced that if we don’t radically reduce emissions soon then very nasty impacts are more likely than not”

    Wow, “more likely than not” is a strong statement. Do you have some links or even names of some peer-reviewed studies with quantitative probability outcomes at better than 50%?

    Ray Ladbury
    “Cannot rule out truly catastrophic consequences” – well said.

    Comment by DR — 10 Jun 2008 @ 10:09 AM

  187. Ray Ladbury: Climate models aren’t telling us, qualitatively, anything we didn’t already know–they’re just making it possible to estimate how severe things could get. This is why I emphasize that if you are leery of draconian action motivated by panic, climate models are your best friends.

    Of course motivation should not matter if we are trying to determine truth. However, I think that quantitative limits on sensitivity are currently more constrained by volcanic and paleoclimate data than they are by models (see James Annan’s papers).

    Comment by Steve Reynolds — 10 Jun 2008 @ 10:35 AM

  188. Barton: …They improved the physics and got a better model.

    That’s great and I hope similar progress eventually leads to accurate models based only on objective physics, but I think we have a long way to go to resolve the initial question.

    Let me restate my question to more clearly reflect the original discussion:

    Are the feedback effects in climate models sufficiently constrained by known physics and high confidence real world data to preclude a climate sensitivity less than 2C?

    [Response: Steve: You've got the wrong question. The right question is: are the feedback effects sufficiently constrained by known physics and high confidence real world data to preclude a climate sensitivity of 8C or more? At present, I'd argue that the answer to that is no. As long as there is a credible threat that 8C is the right answer (or even 4C), that threat has to be factored into the expected damages. It's no different than what you'd do when designing a dam in a region where there's a 1% chance of no earthquake over 100 years and a 1% chance of a very severe earthquake. And by the way, though I don't think you can completely rule out a sensitivity below 2C, such a low sensitivity looks more and more incompatible with the 20th century record and past events, notably the PETM. Finally, without models, we wouldn't even know that 8C is possible, or what kind of cloud feedbacks could give that value. For that matter, it's models that tell us a Venus type runaway greenhouse is essentially impossible. --raypierre]
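
    A toy numerical rendering of the expected-damage point in that response (the sensitivity probabilities and damage units are invented purely for illustration):

        # Toy expected-damage table: a small-probability fat tail can
        # dominate the expectation when damages grow steeply.
        outcomes = [           # (sensitivity in C, probability, damage)
            (2.0, 0.30, 1.0),
            (3.0, 0.50, 3.0),
            (4.5, 0.15, 10.0),
            (8.0, 0.05, 100.0),
        ]
        expected = sum(p * d for _, p, d in outcomes)
        tail = 0.05 * 100.0
        print(f"expected damage {expected:.1f}; the 5% tail contributes "
              f"{tail / expected:.0%} of it")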

    Comment by Steve Reynolds — 10 Jun 2008 @ 10:46 AM

  189. “Meaningful CO2 reductions are not going to happen as long as the world population is growing no matter what governments do.”

    Germany, France, Japan and the UK produce half the CO2 per person that the US, Canada and Australia do. Getting those three down to typical EU levels of per-person CO2 emissions would be a substantial cut globally.

    [Response: Actually France is more like 1/4, and is struggling to do even better than that. As you note, if the US even met France halfway on percapita emissions, that would be a huge reduction -- and if China and India's future percapita emissions could be capped at the French level, which we know is technologically feasible and compatible with a prosperous society -- that would make a good interim target. It wouldn't be enough to prevent doubling of CO2, but it would delay the date of doubling and buy time for more technology and more decarbonization. But yes, preventing doubling, with a world population of 10 billion, will require that ultimately the energy system be almost totally decarbonized. Even a percapita target of 1 tonne C per person worldwide (somewhat less than French percapita emissions today) amounts to 10 Gtonne per year, which would double CO2 in around a century, even if the ocean and land continue to be as good sinks as they are today. --raypierre]
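
    A rough check of the closing arithmetic in that response, under its stated assumptions (1 tonne C per person, 10 billion people, sinks continuing to take up about half, 1 ppm roughly 2.13 GtC):

        # How fast 10 GtC/yr takes CO2 from today's level to 2x pre-industrial.
        GTC_PER_PPM = 2.13
        emissions = 10.0   # GtC/yr: 10 billion people at 1 tC per person
        airborne = 0.5     # fraction remaining in the atmosphere (approx.)

        rise = emissions * airborne / GTC_PER_PPM   # ppm/yr, ~2.3
        years = (560.0 - 385.0) / rise
        print(f"~{rise:.1f} ppm/yr, ~{years:.0f} years from 385 to 560 ppm")
        # On the order of a century, as stated above.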

    Comment by L Miller — 10 Jun 2008 @ 10:57 AM

  190. Ray (165), this is probably worth nothing but maybe you could verify your math re the billion CO2 molecules. I get (but would bet little on it) with 380ppmv evenly distributed CO2 molecules ~2.15×10^3 molecules along a 10cm side (height) of a cubed liter, and 3.25×10^3 molecules in an area slice of 7×10^-10 sq.meters (circle area of r = 15 microns) for an encounter of ~7 million CO2 molecules or ~1/100th of your billion. Probably doesn’t change your basic point — still a lot; maybe nobody cares, but I thought I’d ask…

    Comment by Rod B — 10 Jun 2008 @ 11:18 AM

  191. Re #189: and when emissions are coming down we must also contain the emissions due to economic growth, which is not going to be easy, because even if we doubled fuel efficiency for every vehicle in the USA we would still consume the same amount of oil seven years later due to 2% economic growth per year.

    The task of decarbonising whilst meeting projected energy demand growth is daunting and requires some serious strategic thought, which I personally doubt will happen in time to avoid >2C rises in global temperature.

    People keep on talking about CCS for coal, but that ain’t gonna work in time, though it may stop >3C warming I suppose.

    Comment by pete best — 10 Jun 2008 @ 11:26 AM

  192. Rod B., You’re right–forgot to divide by ~22.

    Comment by Ray Ladbury — 10 Jun 2008 @ 11:48 AM

  193. Here is a graph prepared by the Carbon Dioxide Information Analysis Center at ORNL of the yearly anthropogenic emissions of CO2:

    http://cdiac.ornl.gov/trends/emis/tre_glob.htm

    Comment by David B. Benson — 10 Jun 2008 @ 2:23 PM

  194. Re #176 Steve Reynolds:

    Martin, your link does provide some useful info, but I think it makes my point rather than yours:

    On the contrary — you’re changing the goalposts again.

    I provided the link to demonstrate that your rather libelous claim that modellers tweak the cloud parametrization in order to get more “appropriate” sensitivity values has no relationship with how the process of improving the parametrization works in reality. You fooled even Ray Ladbury on that one (Ray, read the article with comprehension.)

    At this point I usually give up. Horse, well, drink.

    Comment by Martin Vermeer — 10 Jun 2008 @ 2:28 PM

  195. raypierre:

    “It’s no different than what you’d do when designing a dam in a region where there’s a 1% chance of no earthquake over 100 years and a 1% chance of a very severe earthquake.”

    Your analogy is not a great one. This is NOT a situation where we are planning a greenfield factory, power plant or hospital. In developed countries we are talking about massive changes to the existing infrastructure at enormous expense that would have a debatable impact on a problem that is not unequivocally proven. In the developing countries we’re talking about restricting their rights to the same inexpensive energy that we built our wealth on.

    It just isn’t that simple. More like we’ve found out that 100,000 dams have already been built in areas with a chance of an earthquake. The technology exists to build better dams but it takes time, it’s expensive and we’re not the least bit sure we can replace a significant number of the dams before the major earthquake that scientists tell us is coming. Do we fix the dams or buy boats?

    Comment by DR — 10 Jun 2008 @ 3:06 PM

  196. raypierre: Steve: You’ve got the wrong question. The right question is: are the feedback effects sufficiently constrained by known physics and high confidence real world data to preclude a climate sensitivity of 8C or more?

    Thanks for the response answering two questions.

    For the high sensitivity limit, is your answer just addressing results from models, or do you disagree with James Annan?

    “Even using a Cauchy prior, which has such extremely long and fat tails that it has no mean or variance, gives quite reasonable results. While we are reluctant to actually endorse such a pathological choice of prior, we think it should at least be immune from any criticism that it is overly optimistic. When this prior is updated with the analysis of Forster and Gregory (2006), the long fat tail that is characteristic of all recent estimates of climate sensitivity simply disappears, with an upper 95% probability limit for S easily shown to lie close to 4C, and certainly well below 6C.”

    http://www.jamstec.go.jp/frcgc/research/d5/jdannan/prob.pdf
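
    The flavor of that result can be reproduced with a crude grid-based Bayesian update. The Gaussian likelihood below is an invented stand-in for the Forster and Gregory (2006) constraint (the paper itself works through the feedback parameter), so only the qualitative behavior, a fat prior tail killed by the likelihood, should be taken from this:

        # Crude grid Bayes update: fat-tailed Cauchy prior on sensitivity S,
        # updated with a stand-in Gaussian likelihood centered near 3 C.
        import math

        ds = 0.01
        grid = [ds * i for i in range(1, 2001)]   # S from 0.01 to 20 C

        def cauchy_prior(s, loc=3.0, scale=2.0):
            return 1.0 / (math.pi * scale * (1.0 + ((s - loc) / scale) ** 2))

        def likelihood(s, mean=2.8, sd=0.9):      # invented stand-in
            return math.exp(-0.5 * ((s - mean) / sd) ** 2)

        posterior = [cauchy_prior(s) * likelihood(s) for s in grid]
        total = sum(posterior)
        cum = 0.0
        for s, p in zip(grid, posterior):
            cum += p / total
            if cum >= 0.95:
                print(f"posterior 95% upper limit: ~{s:.1f} C")
                break
        # The likelihood removes the Cauchy tail: the 95% bound lands near
        # 4 C even though the prior has no finite mean.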

    Comment by Steve Reynolds — 10 Jun 2008 @ 3:52 PM

  197. DR (195) wrote “In the developing countries we’re talking about restricting their rights to the same inexpensive energy that we built our wealth on.” Fossil fuel isn’t so ‘inexpensive’ anymore, is it? So maybe they (and everybody) needs alternative sources of energy. The recent IEA report offers one plan. For many developing countries, pushing hard for biofuels seems to me to be extremely sensible. Many countries in the global south are already doing so.

    Comment by David B. Benson — 10 Jun 2008 @ 6:25 PM

  198. DR, If you feel the threat of climate change is not unequivocally proven, I wonder what additional proof you are looking for. After all, the IPCC models use the relatively conservative best-fit sensitivity of 3 degrees per doubling and still conclude that we will likely face serious consequences this century. A real risk calculation would have to use a higher bounding value for the sensitivity and would likely find that catastrophic results cannot be ruled out with confidence. It may not be a matter of fixing the dams or buying boats, but rather some of both–and both will take time. The only way we buy time is by decreasing CO2 emissions.
    As to your arguments about the industrial and developing world, WE are going to have to significantly alter our infrastructure to deal with the end of cheap fossil fuel. The developing world’s lack of infrastructure is actually an advantage, since we can significantly diminish the impact of their economic growth in return for assisting them with clean energy. We are certainly justified in investing as much in risk mitigation as we would lose if the threat becomes reality.

    Comment by Ray Ladbury — 10 Jun 2008 @ 7:01 PM

  199. DR says “…we are talking about massive changes to the existing infrastructure at enormous expense that would have a debatable impact on a problem that is not unequivocally proven.”

    That is where we disagree. What will it take to convince you? The physics is solid and its predictions, which are conservative since they do not include some of the positive feedbacks, are being realized daily. The Arctic ice cover is reaching historic lows and may be ice-free within a few years, something that has not happened for a million years. Australia and Spain are developing new areas of desert, California is considering rationing water, the Southeastern US is in prolonged drought. The WAIS and GIS are melting at ever faster rates, and on and on. A recent study shows that the warming rate may triple in the Arctic because of the melting of permafrost. That is the kick-in of a positive feedback, and represents a possible tipping point. Your level of demand for unequivocal proof could well take us to irreversible change of a catastrophic nature. We dare not take that risk.

    Comment by Ron Taylor — 10 Jun 2008 @ 8:19 PM

  200. DR,

    even if a climate sensitivity >4.5 C could be completely ruled out (I would say that it could, for purposes of political action; raypierre or others might disagree), it probably doesn’t matter, because a climate sensitivity at around 3 C is most likely. But this number is just a scientific benchmark; there is nothing inherently special about 560 ppmv aside from being the “first doubling.” The world doesn’t end at 2x CO2 or 2100 or what have you, so if we keep burning, it keeps getting hotter. The people living 500 years from now won’t care too much whether the 8 C hotter temperature came from a doubling or a tripling or a quadrupling of CO2, only that it happened.

    Given that the CO2 we add will influence climate for thousands of years to come, there is probably an ethical (as well as socio-economic) responsibility to do something about it, as the change we will get with a doubling of CO2 or more is on the same order of magnitude as a transition into an ice age (only going the other way). And as people like Hansen or Richard Alley say in public, we may not just be looking at a rising trend, but also a more variable climate which flickers back and forth between different states on yearly to decadal scales, which is very difficult to adapt to.

    As for “solutions,” I only leave you with a few quotes from Daniel Quinn in his book “Beyond Civilization” without any implication or further comment:

    “Old minds think: How do we stop these bad things from happening?”
    “New minds think: How do we make things the way we want them to be?”

    “Old minds think: If it didn’t work last year, let’s do MORE of it this year”
    “New minds think: If it didn’t work last year, let’s do something ELSE this year.”

    Comment by Chris Colose — 10 Jun 2008 @ 8:57 PM

  201. DR,

    Neither, you’d move. But we can’t move. If we were to build the metaphorical boats, the issue still remains that our crops would be devastated, and so only a few of us would survive. That’s not acceptable, so we would fix the dams the best we could as quickly as we could. The major earthquake should give us enough time to do that, but we’ve spent the last decade sitting here arguing that it’s not worth it when we can all learn to swim anyway.

    The changes are massive, yes, but necessary as your flood waters wouldn’t subside without us pumping them away.

    Comment by Russell — 10 Jun 2008 @ 9:33 PM

  202. Gary, you quote from a 1995 article then say “This would indicate to me that natural causes for a change in C12/C13 can not be ruled out. I must also question ….”

    Here are some 250 subsequent articles citing that one; you can check your supposition — likely someone else thought of the same questions you raise and may have published. Science works like that.
    http://scholar.google.com/scholar?num=100&cites=873523141873022731

    Comment by Hank Roberts — 10 Jun 2008 @ 10:02 PM

  203. #195 DR: Developing countries cannot get fossil fuel-based energy at the cost older countries did. Oil is already about 30% above the 1979 spike ($39, about $100 in today’s money). Coal prices have doubled in world markets over the last year. If every poor country rapidly converted over to the modes of energy use in Europe let alone the US, prices would go through the roof.

    As to the enormous expense, a coal-fired power plant doesn’t last forever. Instead of replacing worn-out plants in kind, we should be replacing them with clean power. If we couple this with aggressive efficiency drives, while cranking up the cost of carbon pollution, the cost need not be very high. Better: once you have renewables in place, you are on a Moore’s Law type cost reduction curve: as each plant wears out, you replace it with one that costs less. And your running costs don’t include consumables.

    Better still, renewables don’t rely on massive scale the way coal-fired plants do for low cost, so you can put them in inaccessible places or countries without a national grid.

    Contrary to this being a disaster for developing countries, it could be what saves them from being terminally behind.

    Your analogy is good. To take it a step further, you seem to want poorer countries who have not built dams yet to build more to the same shonky standard.

    Since this is a bit OT, can I invite follow-ups to go to my blog?

    Comment by Philip Machanick — 10 Jun 2008 @ 10:49 PM

  204. Re DR #186:

    The IPCC AR4 was a review of the scientific literature and made it pretty clear that if emissions continue as is then we are likely or highly likely to experience warming that will bring about significant impacts. If you want peer-reviewed literature, look into the list of papers and reports that they cite.

    The impacts of the current droughts in India and Australia demonstrate that you don’t need a huge shift in temperatures and rainfall to cause severe problems. The 5-year drought in India has brought impoverishment and starvation to 50 million people. The ongoing decade-long drought in Australia is bringing about the ecological collapse of the Murray-Darling Basin river system, and bringing our once highly productive irrigation, broad-acre cropping and rangeland grazing sectors to their knees, raising prices domestically and crimping our capacity to export huge amounts of food to the world.

    It is apparent from this report from the Royal Society that it is virtually inevitable that CO2-induced ocean acidification will have a very dire impact on oceanic ecosystems and thereby fisheries.

    I am continuously astounded by skeptics quibbling about scientists adjusting and improving their datasets as if this somehow disproves global warming and its predicted impacts, when the science is so clear and the implications are staring us in the face.

    I look forward to a delegation of skeptics and conservative politicians heading off to central India to explain that it is not worthwhile trying to prevent predicaments such as theirs, because it is much more cost-effective for them to adapt than for us to reduce the risk of such catastrophes occurring; and that there have been droughts before, so it is natural, and there is no reason for us to risk slowing our growth in wealth by a few percent to keep such events from becoming more frequent.

    Comment by Craig Allen — 10 Jun 2008 @ 11:09 PM

  205. DR: “massive changes to the existing infrastructure at enormous expense.”

    Whether we want it or not, these massive changes will happen. If it’s not the climate, it will be the mere scarcity of resources mandating the change. We have a choice between starting now, with some level of control and a little time, or waiting until the massive change happens on its own.

    There is no lack of signs; see the recent protests in Spain and their victims.

    I agree with you that it sucks to be a developing country at the time when everyone realizes that the “wealth” we thought was there turns out to be a liability as much as an asset. They won’t get to ride the illusion of wealth that we have enjoyed for a while. Kinda like an 18-year-old who sees his parents digging themselves into bankruptcy through debt and realizes that he will have to forgo all the niceties they got out of it if he wants to follow a better path. However, is there really another viable option?

    Comment by Philippe Chantreau — 11 Jun 2008 @ 12:32 AM

  206. DR writes:

    In the developing countries we’re talking about restricting their rights to the same inexpensive energy that we built our wealth on.

    We also built our wealth on slavery and exploitation of native peoples. Should we allow the third world to use those as well, since they should have the same rights as us to development?

    Comment by Barton Paul Levenson — 11 Jun 2008 @ 3:05 AM

  207. RE: 197

    Forget all that biofuel nonsense. Methanex (Vancouver, BC) sells methanol for $1.50/US gal, and it is used to make biodiesel. Westport (Vancouver, BC) has developed methods for using methane and other light-hydrocarbon gas formulations as fuel for Diesel engines. These engines, manufactured by Cummins, are now in service in several BC Transit city buses, and China will have about 100 of these buses in service for the Olympics. If the Westport engineers can figure out how to use methane as Diesel fuel, they can certainly figure out how to use methanol.

    FlexFuel cars use E85, a mixture of about 85% ethanol and 15% light hydrocarbons. The latter are added to lower the flash point of the mixture and promote smooth combustion in engines using fuel injection. These hydrocarbons are also required for low-temperature service.

    With modification, methanol could be used as fuel instead of ethanol in these engines. Racing cars use methanol, and they go really fast.

    Methanol is made from natural gas, and there is lots of it in the ocean basins near Dubai and Qatar. The Japanese have several pilot projects under way to recover natural gas from permafrost.

    Comment by Harold Pierce Jr — 11 Jun 2008 @ 7:11 AM

  208. Steve Reynolds, A & H take a reasonable approach, but they run into the same problem you do any time you use Bayesian statistics: how do you justify your choice of a prior? This was precisely the problem that caused Fisher to give up on Bayesian methods. When you use a uniform prior, at least you cannot be accused of having your prior dominate your data. However, until your data are so overwhelming that the choice of prior does not matter, any other choice requires justification. In particular, the choice of 3 degrees per doubling as the location parameter for the Cauchy prior could be questioned, and you could make arguments for either higher or lower. Likewise, you could question the scale parameter. The fact that the choice of prior makes a significant difference means that you must be very careful in your choice of prior and confident in your justification.
    One result that is particularly interesting is how dependent the results are on the level of constraint provided by the data. A reduction of 50% in constraining power could raise the upper limit by 1.5-2x and more than double the cost of mitigation.
    As to the choice of prior (be it uniform, Cauchy or whatever), probably one of the strongest constraints is something along the lines of a weak anthropic argument: we’re here, so the sensitivity can’t be too high (or too low). This makes values much above 20 degrees, and probably above 10 degrees, unlikely. (Note another problem with the Cauchy: it is defined from -infinity to +infinity, so it has to be artificially truncated for low values.)
    The bottom line is that the data by themselves are not sufficient to constrain us to a level below 5-6 with much confidence. To do this you need to rely to some extent on the prior. Also, notice that while they don’t quote lower limits, there’s very little probability of a sensitivity below 2 in these analyses.
    Finally, note that even if we can constrain sensitivity to below 4.5 degrees per doubling, that doesn’t necessarily mean we’re out of the soup. As Harold Pierce’s missive (#207) shows, even if there weren’t a risk that we’d start seeing outgassing from natural sources (and there is), we might recover and liberate the carbon in the permafrost and the clathrates ourselves.
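
    To make the prior-sensitivity point concrete, here is a minimal Bayesian-update sketch in Python (the Gaussian likelihood is a stand-in, not the A&H data; every number here is illustrative):

        import numpy as np
        from scipy import stats

        # Grid of climate sensitivities (K per doubling of CO2)
        S = np.linspace(0.1, 20.0, 2000)

        # Stand-in likelihood from a hypothetical observational constraint
        like = stats.norm.pdf(S, loc=3.0, scale=1.5)

        def upper95(prior):
            post = prior * like
            post /= post.sum()                   # normalize on the grid
            cdf = np.cumsum(post)
            return S[np.searchsorted(cdf, 0.95)]

        for name, prior in [("uniform on grid", np.ones_like(S)),
                            ("Cauchy, loc=3",   stats.cauchy.pdf(S, 3.0, 2.0)),
                            ("Cauchy, loc=6",   stats.cauchy.pdf(S, 6.0, 2.0))]:
            print(f"{name:16s} 95% upper bound ~ {upper95(prior):.1f} K")

    Moving the Cauchy location parameter visibly drags the upper bound around, which is exactly the worry about an “intelligent” prior prejudging the answer.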

    Comment by Ray Ladbury — 11 Jun 2008 @ 8:40 AM

  209. A correction to my #199: The possible tripling of Arctic warming is not due to the melting of permafrost, which is a consequence. The cause is the loss of Arctic sea ice. It is reported here by NSIDC:

    http://nsidc.org/news/press/20080610_Slater.html

    “The rate of climate warming over northern Alaska, Canada, and Russia could more than triple during extended episodes of rapid sea ice loss, according to a new study from the National Center for Atmospheric Research (NCAR) and the National Snow and Ice Data Center (NSIDC). The findings raise concerns about the thawing of permafrost, or permanently frozen soil, and the potential consequences for sensitive ecosystems, human infrastructure, and the release of additional greenhouse gases.

    ‘The rapid loss of sea ice can trigger widespread changes that would be felt across the region,’ said Andrew Slater, NSIDC research scientist and a co-author on the study, which was led by David Lawrence of NCAR. The findings will be published Friday in Geophysical Research Letters.”

    Comment by Ron Taylor — 11 Jun 2008 @ 8:41 AM

  210. Thanks to all. I appreciate the points that were made.

    Climate science, and even just the possibility of CO2-driven AGW, tells us that any new investment in infrastructure should be made with low or zero CO2 emissions in mind. I’ve been a proponent of nuclear energy for years, thinking that we should cut down on pollutants in general and save the limited resources of fossil fuels for transportation. The U.S. should have been building nuclear plants for the last 30 years, as Western Europe has, but we didn’t, due to activism. I hope our rising energy costs will spark innovation and clear the hurdles put in place by the “not in my backyard” crowd.

    To the issue of “unequivocal” proof: not to turn this into an English lesson, but the science of AGW is not unequivocal. It is not an absolute, it’s not proven, it’s not fact, CO2 sensitivity is not X (as opposed to, say, gravity, where objects WILL accelerate at 32.2 feet per second per second until they reach terminal velocity). Obviously the problem is that if and when CO2-driven AGW has been shown to be unequivocal, much damage will have been done.

    I can tell you this: we are not in a runaway feedback condition, since runaway feedbacks do not pause. This means that a negative influence (PDO or AMO or a reduction in TSI or whatever) is having a possibly temporary effect on global temperatures. This will cause the oceans to sink more CO2 and give off less; by AGW theory it will reduce water vapor and low cloud cover, which will further impact temperature.

    So now I have a question, in general terms: what positive feedbacks were used in the development of the GCMs? What negative feedbacks were used?

    Comment by DR — 11 Jun 2008 @ 9:09 AM

  211. Re #210: GHG theory is scientifically sound and proven, and as there are now an additional 220 billion tonnes of CO2 and other GHGs in the atmosphere, it can be determined that we are warming up somewhat in line with the expectations of GW theory. Add in land-use changes and albedo effects, and the balance of evidence is that AGW is happening and is, scientifically at any rate, not incorrect.

    The Arctic is warming faster than anywhere else on the planet, and that was an accurate prediction/projection/forecast. The Bénard convection cells that make up the ITCZ etc. are expanding (was that predicted, or demonstrated in a GCM?) and will cause deserts to move ever northwards.

    Scientifically speaking, RealClimate has spent years spelling this out now, and if someone wants to come around here arguing the opposite, they had better have better science than RC does, which is doubtful. Anyone would think that NASA/GISS are making this all up for some reason, in order to line their pockets no doubt (sigh).

    Comment by pete best — 11 Jun 2008 @ 9:35 AM

  212. DR, I will leave your question about feedbacks included in the GCMs to others more qualified. But I will continue to disagree about the meaning of “unequivocal.” No one has been able to produce a mechanism for warming of the past several decades without including the effect of anthropogenic greenhouse gases. How long do you think we need to continue the search for alternative explanations before we decide that it must be AGW?

    Comment by Ron Taylor — 11 Jun 2008 @ 9:47 AM

  213. #210 [DR] “I can tell you this, we are not in a runaway feedback condition, runaway feedbacks do not pause. This means that a negative feedback (PDO or AMO or reduction in TSI or whatever) is having an effect that may be temporary on global temperatures.”

    I can tell you this: you don’t understand feedback. Positive feedback is not the same as runaway feedback, as discussed repeatedly on this site.

    Comment by Nick Gotts — 11 Jun 2008 @ 9:54 AM

  214. DR, if you want unequivocal in that sense, might I suggest the Church (and perhaps a denomination other than the Unitarians would be appropriate). No, we cannot say CO2 sensitivity = x. In fact we cannot even say that the acceleration due to gravity is x, but rather must say that it is x +/- dx with Y degree of confidence at this particular point on Earth’s surface. We can also say, with about 90% confidence, that CO2 sensitivity is not much less than 2. So, since the subject matter is science and not language arts, I’d suggest that this is about as unequivocal as it gets.

    As to nuclear power, the main reason we have not been building new plants has less to do with NIMBYism and more to do with concerns over proliferation (both wrt fuel and reprocessing). I, too, favor increased use of nuclear power, but I realize that proliferation and waste-storage concerns remain to be addressed satisfactorily.

    Finally, a little lesson in physics. A system with positive feedback is not necessarily in a runaway condition, and it is not necessary for positive and negative feedbacks to balance (i.e. an infinite series can converge).
    Read about feedbacks here:
    http://www.realclimate.org/index.php/archives/2007/08/the-co2-problem-in-6-easy-steps/langswitch_lang/sw

    here:
    http://www.realclimate.org/index.php/archives/2007/10/the-certainty-of-uncertainty/langswitch_lang/sw

    here:
    http://www.realclimate.org/index.php/archives/2007/04/learning-from-a-simple-model/langswitch_lang/sw

    and here:
    http://www.realclimate.org/index.php/archives/2006/08/climate-feedbacks/langswitch_lang/sw

    to start with.

    Comment by Ray Ladbury — 11 Jun 2008 @ 10:06 AM

  215. DR,

    Science does not “prove” things, because 100% certainty rarely exists in science. Everything comes with uncertainty, and political and economic decisions are made all the time in the face of it. Most of the decisions you make come without 100% confidence in the intended outcome. Maybe you shouldn’t spend money on an education because you might not get the job you want. Maybe you shouldn’t get in your car to drive to work because there is a small chance you’ll get into a catastrophic accident. What the science shows is that there is very high confidence that the modern warmth is predominantly caused by greenhouse gases and other human activity like deforestation, and that “business-as-usual” consequences out to 2100 will not be negligible. The paradigm of AGW has not only been consistent with collected data sets, but has spawned a wide range of tests which have been borne out, and it continues to hold remarkable explanatory and predictive power. Not much more you can ask for.

    A “runaway feedback” is a strawman. Of more importance than the end point (which is certainly not going to be like Venus) is the rate of change, and the associated consequences along the way: glacier melt and sea-level rise leading to human displacement, loss of seasonal sea ice, temperature responses of ecosystems and agriculture, enhancement of precipitation gradients and droughts in already dry areas, etc. What matters for the climate sensitivity is the forcing and the sum of the feedbacks (i.e. the net effect), and that appears to be positive. The OLR is the big negative feedback which prevents a runaway, but net positive feedbacks (after the Planck response: water vapor, clouds, albedo, etc.) do not imply that the Earth will eventually heat up without bound. Keeping it simple, all that net positive feedbacks imply is that a doubling of CO2 will give a response greater than 1.2 C.

    I don’t think it’s correct to ask “which feedbacks were used”, since, for example, the effects of water vapor are not built-in assumptions but rather predictions by the model, an emergent property. The short answer is:

    positive feedbacks = water vapor (largely in the upper, colder layers of the atmosphere), which increases as temperature goes up; and ice-albedo (the ratio of ice to ocean/land changes as the climate warms, resulting in less reflection of incoming sunlight).

    negative feedbacks = the outgoing longwave radiation (OLR), since Earth radiates more, according to Planck’s law, at higher temperature. With just this feedback (no others), a doubling of CO2 gives 1.2 C of warming. Lapse rate is another one, since the strength of the greenhouse effect is affected by the strength of the temperature decrease with altitude.

    You also have the cloud feedback, whose sign we still don’t know with as much certainty as the others, but the best evidence seems to suggest it is approximately neutral to positive. If there is a net negative feedback from clouds, it can’t be too strong, since the paleoclimatic and modern observational records don’t suggest a low sensitivity, and we have observations showing that lower clouds decline in a warming climate (low clouds control the albedo more than any other kind).
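
    To see why net positive feedback need not mean runaway warming, here is a tiny sketch of the series arithmetic (the 1.2 C no-feedback response is from above; the feedback factors f are illustrative, not model output):

        # Geometric-series view of feedback: dT = dT0 * (1 + f + f^2 + ...),
        # which converges to dT0 / (1 - f) whenever f < 1 -- no runaway.
        dT0 = 1.2   # no-feedback (Planck-only) response to doubled CO2, in C

        def equilibrium_warming(f, terms=200):
            return sum(dT0 * f**n for n in range(terms))

        for f in (0.0, 0.3, 0.5, 0.65):
            print(f"f = {f:.2f}: dT ~ {equilibrium_warming(f):.2f} C "
                  f"(closed form {dT0 / (1 - f):.2f} C)")

    Each feedback factor below 1 gives a finite, amplified response; only f >= 1 would run away.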

    Comment by Chris Colose — 11 Jun 2008 @ 11:22 AM

  216. Ray Ladbury: …A & H take a reasonable approach, but they run into the same problem you do any time you use Bayesian statistics: how do you justify your choice of a prior? … When you use a uniform prior, at least you cannot be accused of having your prior dominate your data.

    Thanks for the thoughtful response, but one of the major points of the paper was that choosing a ‘uniform’ prior is inappropriate and does dominate the data!

    You may want to make your other points about the paper on Annan’s web site where he can answer them much better than I can:
    http://julesandjames.blogspot.com/2008/05/once-more-into-breech-dear-friends-once.html#comments

    Ray: Also, notice that while they don’t quote lower limits, there’s very little probability of a sensitivity below 2 in these analyses.

    They do quote a lower limit of 1.2C: “…present the results in Figure 2. The resulting 5–95% posterior probability interval is 1.2–3.6C…”

    Also, note raypierre’s response did not claim that sensitivity below 2C was ruled out by model results.

    Comment by Steve Reynolds — 11 Jun 2008 @ 11:37 AM

  217. Harold (207) says, “…Racing cars use methanol, and they go really fast…”

    Yes, but their MPG really sucks ;-) !

    I’ve always wondered why methane isn’t higher on the candidate list. I know very little of the specifics, though I understand it has some really nasty inherent properties, among maybe other problems.

    Comment by Rod B — 11 Jun 2008 @ 11:41 AM

  218. Ron (212), from a purely logical perspective, lack of a proven alternative does not, by itself, prove the primary and make it “unequivocal”, though proof of an alternative can disprove the primary.

    Comment by Rod B — 11 Jun 2008 @ 11:48 AM

  219. OT:
    http://www.sciencenews.org:80/view/generic/id/33092/title/Science_Academies_Call_for_Climate_Action

    National science academies from over a dozen nations (including the USA and the rest of the G8, plus China, India and Brazil among others) issued a joint statement on climate change, calling on governments “to limit the threat of climate change by weaning themselves off of their dependence on fossil fuels”.

    Comment by Richard Ordway — 11 Jun 2008 @ 12:07 PM

  220. Steve Reynolds – the prior dominates the results because there is not enough data to overcome its influence. That occurs regardless of which prior one chooses in a Bayesian analysis with limited data. The thing is that if you choose an “intelligent” prior, its influence is still felt, but it is at some level prejudging the outcome of the analysis. For instance, what would have happened had A&H chosen 6 as the location parameter for their distribution? Or 12? Or 0.5? You’d wind up with a much different distribution. I’ve run into similar problems in my day job trying to come up with distributions that bound the radiation hardness of microelectronics using multiple types of data. It’s not an easy nut to crack.
    And yes, while Raypierre’s comment does not completely rule out a low sensitivity, it’s bloody difficult to make a model with a sensitivity that low; and if you did, you’d probably have some pretty unphysical assumptions or results.

    Comment by Ray Ladbury — 11 Jun 2008 @ 1:29 PM

  221. Ray Ladbury wrote: “As to nuclear power, the main reason why we have not been building new plants has less to do with NIMBYism and more to do with concerns over proliferation (both wrt fuel and reprocessing).”

    The reason the USA has not been building nuclear power plants is that investors don’t like throwing money away. It’s not because “anti-nuclear activists” or “NIMBYs” object to it. It’s because Wall Street won’t touch it. Nuclear power is a proven economic failure — even after a half century of massive subsidies.

    That’s why the nuclear industry will not put a shovel in the ground to start building even a single new nuclear power plant unless the taxpayers underwrite all the costs and absorb all the risks — not only the risks of catastrophic accidents but the risks of economic losses. That’s why the nuclear industry has been aggressively (and successfully) pushing for hundreds of billions of dollars in federal subsidies, guarantees and insurance — like the half trillion dollars in nuke subsidies in the Lieberman-Warner climate change legislation that the Senate debated last week. Proponents of a nuclear expansion were unhappy with that bill because in their view it did not offer enough support for nuclear power.

    Comment by SecularAnimist — 11 Jun 2008 @ 1:40 PM

  222. #189: raypierre should have added that France’s great per capita CO2 performance comes mostly from its world-record nuclear capacity: about 80% of the electricity produced in France is of CO2-poor nuclear origin. Nobody in the world matches this, and it is heartbreaking to see how Germany’s environmentalists have, by flatly refusing the nuclear option and clinging to the “Ausstieg”, put the country on the rails leading to energy poverty.

    Comment by Francis Massen — 11 Jun 2008 @ 2:09 PM

  223. DR, here is another point to consider when deciding whether the evidence is sufficient to act. The evidence being considered is based on what is actually observed in the data. However, because of the thermal lag of the system, whatever is observed will only be a fraction of the guaranteed final result, no matter what is done. Stop greenhouse gas emissions totally today, and the temperature (say, the ten-year mean of the annual global average) is expected to continue to increase for at least decades. So if you do not act until things get serious, you can be certain that they are going to get considerably more serious, very possibly catastrophic.
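
    A toy zero-dimensional energy-balance model makes that lag visible (every parameter value here is an assumption for illustration, not a fitted number; the effective heat capacity crudely lumps the ocean mixed layer with some deep-ocean uptake):

        # Toy energy balance: Ceff * dT/dt = F - lam * T, with forcing F frozen at today's value
        lam = 1.2      # climate feedback parameter, W/m^2/K (assumed)
        Ceff = 1.0e9   # effective heat capacity, J/m^2/K (assumed)
        F = 1.9        # fixed forcing, W/m^2 (illustrative)
        T = 0.8        # warming already realized, K
        dt = 3.15e7    # one year, in seconds

        for year in range(1, 101):
            T += dt * (F - lam * T) / Ceff
            if year % 25 == 0:
                print(f"year {year:3d}: T = {T:.2f} K  (equilibrium {F / lam:.2f} K)")

    Even with the forcing frozen, the temperature keeps climbing toward F/lam for decades.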

    Comment by Ron Taylor — 11 Jun 2008 @ 3:19 PM

  224. Francis Massen wrote: “France’s great per capita CO2 performance comes mostly from it’s world record nuclear capacity: About 80% of the electricity produced in France is of CO2 poor nuclear origin. There is nobody in the world who matches this”

    It is always surprising to me when nuclear advocates cite France as the world leader in nuclear electricity generation. France has 59 nuclear power plants generating about 63,000 MW. The USA has 104 nuclear power plants, generating about 99,000 megawatts — the largest number of nuclear power plants and the most nuclear generating capacity of any country in the world. The USA is the world leader in nuclear electricity generation, not France.

    The French-designed EPR reactor under construction at Olkiluoto, Finland, touted by the nuclear industry as the flagship of new reactor designs, is at least two years behind schedule and 50 percent over budget and has been plagued with safety and quality problems. This is of course normal for the nuclear industry.

    Francis Massen wrote: “… it is heartbreaking to see how Germany’s environmentalists have, by flatly refusing the nuclear option and clinching to the “Ausstieg”, put the country on the rails leading to energy poverty.”

    Germany is on track to become a powerhouse of clean renewable energy technologies. According to WorldWatch Institute:

    About 40,000 people are now employed in the PV industry in Germany alone, and the German company Q-Cells outproduced Japan’s Sharp to become the number one manufacturer worldwide. Germany remains the world’s top PV installer, accounting for almost half of the global market in 2007. Thanks to the country’s feed-in tariff for renewable electricity, which requires utilities to pay customers a guaranteed rate for any renewable power they feed into the grid, Germans installed about 1,300 megawatts of new PV capacity, up from 850 megawatts in 2006, for a total exceeding 3,830 megawatts. As capacity has risen, PV installed system costs have been cut in half in Germany between 1997 and 2007. PVs now meet about 1 percent of Germany’s electricity demand, a share that some analysts expect could reach 25 percent by 2050.

    Germany is also a world leader in wind-generated electricity:

    Germany remains the world leader in wind power capacity, with a total of 22,247 megawatts, almost 24 percent of the global total … Wind power generated the equivalent of 7.2 percent of Germany’s electricity consumption in 2007. Windy northern Schleswig-Holstein now aims for the wind to generate all of that state’s power by 2020, up from 30 percent today.

    Worldwide, solar and wind generated electricity are the fastest growing forms of energy production by far. This is the direction we need to go in, not nuclear — which is the most expensive, least effective and — crucially — the slowest path to reducing CO2 emissions from electricity generation.

    Comment by SecularAnimist — 11 Jun 2008 @ 4:16 PM

  225. Ray Ladbury (208) — Bayesians make quite a point of “background information”. In the current setting of estimating equilibrium climate sensitivity, that means we may use all of physics except those observations to be used to update the prior. So we are allowed to redo Arrhenius’s calculation correctly to obtain a prior equilibrium climate sensitivity of about 3 K. Now choose to wrap a distribution around that, say a Weibull distribution for sanity and safety. There is the issue of the remaining parameter; use the median of the distribution together with the weak anthropic principle (or whatever).

    Now update the prior using paleoclimate data. The result should be quite good, IMO, but still leaving very small probabilities attached to large values. So, collect more data and repeat.

    Comment by David B. Benson — 11 Jun 2008 @ 4:44 PM

  226. Ron Taylor (223) wrote “Stop greenhouse gas emissions totally today and the temperature (say the ten year mean of the annual global average) is expected to continue to increase for at least decades.” I would prefer using at least 22-year averages, but that is a small point. The larger one is that for the oceans to come close to equilibrium with the air requires several centuries.

    My own amateur estimate is that if there were no further emissions after today, there is about 0.5+ K of further global warming to come. That’ll melt some, but not all of, the Greenland ice sheet.

    Comment by David B. Benson — 11 Jun 2008 @ 4:56 PM

  227. Ray Ladbury: …the prior dominates the results because there is not enough data to overcome its influence.

    I don’t think resolving that was the point of the paper (since it only used ERBE data). There is independent data available from paleoclimate and volcanic effects that could be used for the prior.

    Ray: And yes, while Raypierre’s comment does not completely rule out a low sensitivity, it’s bloody difficult to make a model with a sensitivity that low; and if you did, you’d probably have some pretty unphysical assumptions or results.

    I still have not seen any real evidence to support that statement. The only expert opinion we got did not support it.

    Apparently a slightly negative (or even zero) cloud feedback would probably allow sensitivity less than 2C. From the published info that I have seen, that is within current uncertainty.

    Comment by Steve Reynolds — 11 Jun 2008 @ 5:50 PM

  228. RE #217: Go to http://www.westport.com and read about the clean Diesel technology they have developed.

    Methanol has a much lower energy density than gasoline. So what? The low price compensates for that. However, if it were to be used as fuel, governments would slap a lot of taxes on it.

    But in the meantime, suppose you have your engine modified for use with methanol, just like the racing cars, and you have drums of methanol delivered right to your doorstep from a commercial supplier of this “solvent”. Who’s to know that you are using it as motor fuel? If you really want to juice it up, you can add some propylene oxide, which is also readily available and cheap.

    You could become a methanol “bootlegger”, get really rich, and then drive a big honking Dee-Lux Lincoln Navigator or the Viper with that monster V-12 engine! And then watch all the neighbors go nuts!

    Comment by Harold Pierce Jr — 11 Jun 2008 @ 6:52 PM

  229. My own amateur estimate is that if there were no further emissions after today, there is about 0.5+ K of further global warming to come. That’ll melt some, but not all of, the Greenland ice sheet.

    Really? 0.5 degrees is all that separates Greenland from becoming green again? I’m sure someone on a long betting site would take you up on that.

    Comment by BlogReader — 12 Jun 2008 @ 1:08 AM

  230. RE#228 Harold…how much road tax is there on methanol?
    And tell me what are the drawbacks to methanol if you spill it on the environment?
    And what happens if you spill methanol on yourself?

    Comment by Tom G — 12 Jun 2008 @ 1:20 AM

  231. The problem with biofuels will always be that people will start growing fuel rather than food, which will automatically lead to less food and more expensive food, making the poorest people in the world suffer even more, just because we focused on cars rather than looking at the entire energy sector and picking the most effective measures first – which clearly are replacing coal and gas power plants with renewables and nuclear. Cars are a popular and telegenic focus. Politicians driving around in Teslas and activists slashing SUV tires: all for the “greater good”, but somewhat pointless and of a rather symbolic nature compared to the real issues.

    Comment by Henning — 12 Jun 2008 @ 1:55 AM

  232. @Secular Animist #224
    Those numbers are misleading. Wind contributed 5% in 2007, with an average capacity factor of just 20% of installed capacity. Schleswig-Holstein is an exception because it is the small part directly south of Denmark (a narrow, flat strip of land), and its capacity factor is considerably higher due to no big cities, very little industry and rather constant wind conditions. But even there, 100% is laughable; they will need backup. The 25% claim for PV is a joke that went through the press a couple of months ago: somebody had simply looked at the installed PV capacity and assumed exponential growth, exactly the sort of science usually and rightfully smashed on this website. The “Ausstieg” from nuclear will not persist, BTW. The conservative parties most likely to rule the country after the next elections have already announced that they will go back to nuclear rather than build the new coal power plants which the old government had planned just to get rid of nukes.

    Comment by Henning — 12 Jun 2008 @ 2:21 AM

  233. Re #224: very true and well put, but the world needs 7 TW more power come 2030, and renewables, although set to expand aggressively, will not make up this number on top of the 14 TW we use today.

    There is little chance of avoiding 2C, but a good chance of averting anything more, unless CCS does not work and we still intend to go coal-mad, which even Germany is planning on doing.

    The complexity of the energy-mix requirements for heat, electricity and liquid fuel is that it is all underpinned by liquid fuel. Global warming will not be a worry to most if we see the $250 oil barrel, as a large part of the economy will be wiped out, as well as globalisation etc.

    Comment by pete best — 12 Jun 2008 @ 3:56 AM

  234. #230: What happens if you spill methanol on yourself is nothing. Bathing in it, however, is not recommended, and drinking it is deadly. I speak from experience on the first and intelligence on the latter two.

    Comment by Eli Rabett — 12 Jun 2008 @ 7:10 AM

  235. Thanks all, for the education,

    Ron Taylor – what do you have in mind when you say act now? My car gets good mileage, I’ve just re-insulated my home, I use CF bulbs where I can stand to, and I have modern and efficient appliances and all that. I’m a firm supporter of nuclear and hydro power (wind and solar wouldn’t make a dent in the NorthEast’s needs). I’d love the ability to buy a diesel-electric car, with a diesel generator running at steady RPM driving electric motors, because it would eat silly hybrids and fuel cells for lunch at 50-100 mpg, and because I do believe in conservation.

    But what do you mean, or what do Real Climate regulars mean, by act now?

    Comment by DR — 12 Jun 2008 @ 8:03 AM

  236. Re: Bayesian analysis, priors and sensitivity. Bayesian analysis is easy when you have lots and lots of data; then your choice of prior doesn’t matter. However, in the realm of limited data, where we usually live, your choice of prior effectively determines your result. One approach I’ve discussed with a colleague is doing the analysis over “all possible priors”. Then you can look at a weighted average to get your best fit, and at how things change over the choice of prior. In reality, of course, you have to choose a subset of priors as possible, and you have to figure out how to weight them (weighting them all evenly is in fact akin to a “maximum entropy” prior on priors), but it is probably still an improvement.
    The thing is, the advantage of Bayesian analysis is that it allows you to include all sorts of information that couldn’t be included in a standard probabilistic analysis, and so get more definitive answers than you could otherwise. Unfortunately, there are few generally accepted guidelines as to HOW TO include this information.

    My general caveat is that when your choice of prior influences your results significantly, you need to be very careful. Since this is almost always the case (or why else would you be using Bayesian methods?), caution is the watchword.
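
    One way to sketch that averaging (with a stand-in likelihood and an equally weighted family of Cauchy priors whose location parameter is varied; every number is illustrative):

        import numpy as np
        from scipy import stats

        S = np.linspace(0.1, 15.0, 1500)
        like = stats.norm.pdf(S, loc=3.0, scale=1.5)   # stand-in likelihood

        # One posterior per candidate prior, all weighted evenly
        posts = []
        for loc in np.arange(1.0, 8.0, 0.5):
            p = stats.cauchy.pdf(S, loc=loc, scale=2.0) * like
            posts.append(p / p.sum())                  # normalize on the grid

        avg = np.mean(posts, axis=0)                   # prior-averaged posterior
        print("95% upper bound over averaged priors ~",
              round(float(S[np.searchsorted(np.cumsum(avg), 0.95)]), 1), "K")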

    Comment by Ray Ladbury — 12 Jun 2008 @ 9:12 AM

  237. DR, “Act now” means do whatever we can to reduce CO2 emissions, so we can hopefully buy enough time to come up with a real solution to the issue. It means trying to convince others of the necessity of such action. It means voting for “reality-based” politicians who recognize the necessity of decreasing CO2 emissions. It means realizing that this is a global problem that invalidates the usual economic zero-sum game model. Progress in China or India or Africa is as much a gain as progress in the US.
    It means planting trees to buy time and a whole lot more. Above all, it means banishing complacency. The more progress we make reducing CO2, the more time we have to come up with a workable solution. Perhaps our great-great-great grandchildren can afford to look behind them.

    Comment by Ray Ladbury — 12 Jun 2008 @ 12:13 PM

  238. BlogReader (229) wrote “0.5 degrees is all that separates Greenland from becoming green again?” No, just that more of the Greenland ice sheet would eventually melt. Remember, that is the global temperature change, distributed unequally between the tropics and the polar regions.

    My take on the Greenland ice sheet is that it is already in negative mass balance. Somewhere earlier on RealClimate (I think) there is a link to a paper simulating the extent of the ice sheet during the Eem (Eemian interglacial period). Although considerably warmer than now, about 2+ K, some of the simulated ice sheet survived.

    In any case CO2 emissions didn’t stop yesterday, aren’t stopping today, and won’t tomorrow. It’ll become even warmer.

    Comment by David B. Benson — 12 Jun 2008 @ 4:54 PM

  239. Henning (231) — Not entirely. First of all, there is a considerable (to put it mildly) supply of biomass as biofuel feedstock in the form of waste: agricultural, animal, forestry and, with now half the people of the world living in cities, municipal. This alone will make a substantial contribution.

    More, in the global South especially, there is an abundant supply of unused, sometimes degraded, soils. Many plants suggested as biofuel feedstocks will thrive under those conditions where typical plants for human and animal food fare only poorly. So there is no competition with food in using those areas and plants.

    Then there are the semi-competitive plants. An example is tubers such as sweet potatoes and cassava. People prefer “better” foods, but the tubers will grow on poorer soils. Using the tubers for biofuel feedstock when the food crop does well makes good sense; if the food crop (partially) fails, then eat the tubers. It is a form of insurance.

    Finally there are the directly competitive crops such as maize, rapeseed, soy and palm oil. Ethanol from maize doesn’t even help the climate (ethanol from sugarcane does), and biodiesel from rapeseed is hardly better; both, in an ideal world, would be discouraged. That leaves, in this brief analysis, the competition between food and fuel for major oil crops such as soy and palm. With regard to palm oil, it is now in such great demand as food that the biodiesel producers in Southeast Asia are having serious difficulties obtaining affordable supplies. A “free market” proponent would surely find this an appropriate situation. I’d rather see saner policies which don’t lead to so much waste of resources, myself.

    Comment by David B. Benson — 12 Jun 2008 @ 5:16 PM

  240. DR, “Act now” also means writing letters to the press and answering posts on discussion forums that claim climate change is a hoax. The will to act at the top level of politics is ultimately driven by the fear of losing votes; a small but vocal climate-inactivist community can give politicians the insulation from reality that permits them to avoid unpopular steps like encouraging everyone to use trains rather than cars (e.g. by ripping out roads where trains are a viable alternative). Rail is a useful prerequisite to emissions reduction because any reduction in emissions from power generation is immediately effective for trains; for cars it only helps if we all use electric cars (and likewise buses).

    Comment by Philip Machanick — 12 Jun 2008 @ 11:09 PM

  241. DR – I have done the things you have done. By act now, I mean that, as a society, we need to dedicate ourselves to the systemic change that is needed to address this issue. I am talking about government policy: not the pandering, short-term politically expedient stuff, but the kind of long-term commitment that the problem demands. It means changing our approach to transportation, electric power generation, etc. And that will not happen without a comprehensive commitment to research, tax incentives, and everything else the government can do to make this happen (without taking it over and thus killing it).

    Comment by Ron Taylor — 13 Jun 2008 @ 6:34 PM

  242. DR writes:

    wind and solar wouldn’t make a dent in the NorthEast’s needs

    And you know this how? Based on what?

    Comment by Barton Paul Levenson — 13 Jun 2008 @ 7:46 PM

  243. @David B. Benson #239
    In a perfect, sane world, people would grow fuel exclusively on soil not appropriate for food, and there would probably be laws ensuring worldwide use of soil for food until everybody had enough to eat and nobody suffered. But what we see every day is that people do what is best for them as individuals, and you can’t blame a farmer for planting fuel rather than food when he can make more money that way. Suddenly those of us fueling our cars compete directly with those wanting to eat and already having trouble affording it. This won’t save the world; it will split it even further. I’m totally with you when it comes to using waste that would otherwise just rot in the sun, and maybe even when it comes to exploiting soil that wouldn’t be used at all if it weren’t for biofuel, but I don’t see how this can be ensured at all. It didn’t work against the insanity of people growing drugs for the rich in the poorest parts of the world, replacing food for many with money for few, and those crops are illegal. I don’t want to find out what it’ll be like for something that isn’t.

    Comment by Henning — 14 Jun 2008 @ 6:24 AM

  244. DR wrote: “… wind and solar wouldn’t make a dent in the NorthEast’s needs …”

    On the contrary, the offshore wind power resources of the Northeast alone could “make a dent” in the entire country’s needs. According to a September 2005 article in Cape Cod Today:

    There is as much wind power potential (900,000 megawatts) off our coasts as the current capacity of all power plants in the United States combined, according to a new report entitled, A Framework for Offshore Wind Energy Development in the United States, sponsored by the U.S. Department of Energy, Massachusetts Technology Collaborative, and General Electric. The Framework report finds the greatest wind power potential offshore the highly-populated urban coastal areas of the northeast … Offshore wind energy is also an attractive option for the Northeast because slightly more than half the country’s offshore wind potential is located off the New England and Mid-Atlantic coasts … As the Northeast seeks indigenous alternatives to oil and natural gas, offshore wind is the most promising option …

    A good source for information on wind energy potential in the Northeast is the New England Wind Forum on the US Department of Energy website.
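
    For scale, a rough back-of-envelope conversion of that nameplate figure into delivered energy (the 35% capacity factor and the ~4,000 TWh/yr total US generation are my assumptions for illustration, not numbers from the report):

        # Nameplate capacity vs. delivered energy, with an assumed capacity factor
        nameplate_mw = 900_000      # offshore potential quoted above
        capacity_factor = 0.35      # assumed typical for offshore wind
        twh_per_year = nameplate_mw * capacity_factor * 8760 / 1e6
        print(f"~{twh_per_year:.0f} TWh/yr, vs roughly 4000 TWh/yr total US generation")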

    Comment by SecularAnimist — 14 Jun 2008 @ 11:40 AM

  245. How much wood would a woodchuck chuck if the woodchuck had a contract to fuel the number of power plants (displacing coal/natural gas burners) necessary to supplement power when the wind/sun fail to do the job?

    Comment by JCH — 14 Jun 2008 @ 7:36 PM

  246. re #239 and 243

    Don’t you think that your comments relating to the production of biofuels on “unused or waste land” are somewhat anthropocentric? We do share our planet with other species. We have already appropriated vast areas of the globe for our agricultural activities and pushed many other species to the margins. If it is deemed necessary to sustain an ever-growing human population in the face of the twin threats of peak oil and global warming, we should make more efficient use of the land we do farm. By eating much less meat, we would probably free up well over half the area we currently use for the production of biomass. I say this with regret because I am an enthusiastic carnivore with a repugnance for rabbit food.

    [Response: There is a strong case for wilderness untouched by humans, but in the spirit of Aldo Leopold, it is also possible for human uses of land to co-exist with the needs of other species. Industrial agriculture doesn't do that, but that is not the only model of land use. With regard to biofuels, there's increasing interest in natural prairie ecosystems as a source of feedstock for cellulosic ethanol. --raypierre]

    Comment by Douglas Wise — 15 Jun 2008 @ 3:31 AM

  247. JCH has an interesting point — is wood-burning a viable energy source?

    Comment by Barton Paul Levenson — 15 Jun 2008 @ 6:02 AM

  248. re Raypierre’s response to #246

    I agree that it is possible for some human uses of land to co-exist with the needs of (some) other species. However, your apparent readiness to damn so-called industrial agriculture in favour of a more sustainable organic approach is IMHO somewhat naive. The latter model would/will serve us well when and if global population levels drop to 2-3 billion, and if we still have a reasonable climate by then. It is how you get there from here that should be exercising everyone (i.e. from 6.5 to 10 and back to, say, 3 billion).

    I accept that industrial agriculture is energy-intensive, but it is the very use of all this extra energy that has allowed massive yield increases, which, in turn, have permitted what would otherwise have been unsustainable population growth. I am led to believe that the green revolution doubled the world’s carrying capacity. A sudden widespread reversion to organic farming would lead to Malthusian consequences, although, in the West, the affluent appear to acquire feelings of virtue from the purchase of over-priced organic produce.

    The only way we may be able to have a soft landing for the planet and its human population (assuming you physicists have your climate sensitivities properly calculated) is to work as rapidly as possible on acceptable methods of population control while maintaining high but ultimately declining levels of industrial-scale agriculture.

    I suggested that, if you want more bang for your buck, you might have to consider changing human dietary habits away from a high dependency on meat. This would require top chefs and food writers to start developing meals from vegetables that the carnivores among us could find remotely palatable, instead of banging on about the mythical health and taste virtues of organic produce. Currently, over half of all agricultural land is grassland, largely but not entirely because it is relatively less suitable for arable production. Much of this could be given over to forestry or other forms of biomass.

    The human digestive system is designed to handle diets similar to those of pigs and poultry. Generally, vegetable diets are lacking in some of the essential amino acids needed by monogastric animals. Traditionally, these limiting amino acids are provided through meat, fish, eggs or dairy products. In productive poultry or pigs, up to 7% of old-fashioned diets comprised meat or fish meal. On economic grounds, these have mainly been replaced with synthesised limiting amino acids, with no associated performance loss. It should also be noted that pig and poultry diets are based on cereals and vegetable protein concentrates (solvent-extracted soya bean meal, for example) with additions of appropriate mineral/vitamin supplements. Monogastric farm animals are not fed salads (flavoured water with a tad of minerals and vitamins) or even green winter vegetables. It follows that, if the name of the game is to maximise the number of humans that can be sustained on a minimum of agricultural production, feed them all poultry or pig diets and get rid of the poultry and pigs. Replace ruminants with trees, harvesting the latter for (transport) fuel and CO2-capturing biochar while simultaneously reducing methane emissions.

    O, brave new world. I’ll soon be out of it.

    Comment by Douglas Wise — 16 Jun 2008 @ 4:07 AM

  249. Excellent post by Douglas Wise. Appealing ideas are not the same as *effective* ideas. Low-yield “organic” agriculture, like most alternative power scenarios, simply cannot (or won’t) ever scale to meet even a fraction of the growing global demands. An exception, and therefore a major part of the solution, is nuclear power, which scales up very well.

    Comment by Joseph Hunkins — 17 Jun 2008 @ 5:29 PM

  250. The claim that Mars and Jupiter are warming up gets lots of play, especially from “conservative cartoonist” [Mallard Fillmore.] What’s up with that; what is the science and the not-science there?

    Comment by Neil B. — 17 Jun 2008 @ 6:26 PM

  251. Neil B., consider that Mars is about 1.5 times further from the sun than Earth, and Jupiter is over 5.2 times as far. There is a climate model for Mars, and one of its biggest factors is the prevalence of dust storms, which block the sun. Guess what: the period during which they say there was warming is free of these storms. Factor this in, and the warming is explained according to the model.
    In the case of Jupiter, most of its energy comes from within the planet, not from the Sun.
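
    For scale, the inverse-square drop-off in sunlight alone (1361 W/m^2 is the modern satellite value of the solar constant at Earth):

        # Solar flux falls off as 1/d^2 (distances in astronomical units)
        S0 = 1361.0   # W/m^2 at Earth
        for name, d in [("Earth", 1.0), ("Mars", 1.52), ("Jupiter", 5.2)]:
            print(f"{name:8s} {S0 / d**2:7.1f} W/m^2")   # Mars ~589, Jupiter ~50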

    Anyone using this argument is either ignorant or a fraud (I will let Richard Lindzen and your cartoonist decide which represents the lesser charge).

    Comment by Ray Ladbury — 17 Jun 2008 @ 8:26 PM

  252. Neil B., click “Start Here” at top of page,
    page down to the links under the heading:

    “Informed, but seeking serious discussion of common contrarian talking points:

    “All of the below links have indexed debunks of most of the common points of confusion….”

    You’ll find the Mars/Jupiter stuff there.

    Comment by Hank Roberts — 17 Jun 2008 @ 8:32 PM

  253. 2000 EPA carbon dioxide plan. Can anyone help?

    In response to the year-old Supreme Court order directing the EPA to regulate carbon dioxide emissions, the EPA plans to open a listening session and send the issue out for public comment before any draft regulations are proposed. It is contrary to EPA’s mission to suggest that they don’t know what to do: EPA’s job is to write detailed regs and send them out for public comment.

    In December 2000 (?) EPA had an outline of proposed CO2 regs on their web page. I didn’t print it because that was before EPA scrubbed information from their web site. Now the site contains notes that certain documents are not updated but are kept for archival purposes. Not available.

    Does anyone have this information? My vague recollection is that it might have been a PowerPoint PDF. I think it would be useful to resurrect it and point out that EPA does, indeed, know what to do.

    Thanks.

    Comment by Laine V — 23 Jun 2008 @ 8:33 AM

  254. Re #253

    Try googling for the Wayback Machine; I’m unable to post the URL because this site thinks it’s spam.

    Comment by Phil. Felton — 23 Jun 2008 @ 10:56 AM

  255. A minor correction, Laine: I think the Court declared that the EPA had the authority to regulate CO2 emissions, not that it had to. (And FWIW, even then it was not the first time, by a long shot, that the Court was wrong.)

    Comment by Rod B — 23 Jun 2008 @ 12:29 PM

  256. “…the Court was wrong” (relative to EPA having the authority to regulate CO2 emissions).

    Oh brother. That is absolutely preposterous and fundamentally wrong. There is absolutely no basis whatsoever for that statement. EPA has had the authority to regulate specific pollutants for decades. Based on scientific studies, specific criteria are set for sulfur dioxide, carbon monoxide, fine particles, ozone, lead and nitrogen oxides. See http://www.epa.gov/air/criteria.html. And contact your local state environmental agency for details. You may learn something about “non-attainment” areas.

    Comment by Dan — 23 Jun 2008 @ 1:18 PM

  257. re: 256. Correction. Oops, wrong URL. That should have been http://www.epa.gov/air/urbanair/.

    Comment by Dan — 23 Jun 2008 @ 2:08 PM

  258. Dan, your own good reference specifies the particular pollutants that the EPA is charged with managing and regulating under the Clean Air Act and its revisions. You’ll notice CO2 is not one of them. It is true that the EPA can and has selected other things to monitor, CO2 being one, but it needs some legal authority/directive to manage and regulate (in most cases) other things, which it has obtained in the past, but not for CO2. Those additional things were substances very similar to the “pollutants” described in the legislation, like radon, or ones resembling the loosey-goosey “particulate matter”.

    At least that is how the EPA saw it; so that must make them also “absolutely preposterous and fundamentally wrong.” The Court, in my opinion, simply thought it sounded like a good idea and ordered the EPA to look at it, the Clean Air Act et al. having little to do with it. Though that is admittedly a bit overstated since, as my main point said, the Court ordered the EPA to assess regulation, not to actually do it, which, if you’re looking for tiny loopholes (or, as Chief Justice Roberts in dissent called it, a “sleight of hand”), was not totally contrary to legislation.

    Comment by Rod B — 23 Jun 2008 @ 11:09 PM

  259. Rod B, Section 160 of the Clean Air Act requires: “to protect public health and welfare from any actual or potential adverse effect which in the Administrator’s judgment may reasonably be anticipate to occur from air pollution or from exposures to pollutants in other media…”
    As climate change certainly entails adverse effects to public health and welfare, the cause of that change needs to be regulated under this provision.

    Comment by Ray Ladbury — 24 Jun 2008 @ 8:27 AM

  260. re: 258. You are now using circular reasoning re: the Clean Air Act having nothing to do with it. Wrong. The legal precedent is set by the Clean Air Act (there have been several amendments). Section 160.

    BTW, “particulate matter” is not “loosey-goosey”. It is called “science”. There were specific, updated scientific studies of public health and welfare exposure which went into setting the particulate-matter criteria standards. In fact, the fine-particulate standard was recently revised based on updated health studies.

    Comment by Dan — 24 Jun 2008 @ 9:34 AM

  261. Ray, the anomaly is that CO2 has never been viewed as “pollution” by the commonly, scientifically, and long-held definition. It was only through the dumbing down of the definition that advocates could sneak through the new loophole and include CO2. It was proclaimed loudly and long enough that it pretty much took, and CO2 is now, without a lot of discriminating thought, kinda accepted as a pollutant. But it was not meant nor contemplated by the Act.

    Dan, “nothing to do with it” is probably an editorial exaggeration, but it is the thinking of the Justices that I am referring to, not the Act(s) itself. Acts set legalities; they do not set precedent per se. Only when the judiciary wants to expand the law to what they would like can a law be viewed as a precedent.

    By loosey-goosey I simply meant that it can cover a multitude of sins vs. the specificity of the other pollutants. It can be misused, IMO, but is probably a reasonable flexibility.

    [Response: I'm not a lawyer, but the contortions of the language of the act made by advocates for the "CO2 is not pollution" crowd were very weak. The CAA has clear language (sections g and h) defining a pollutant to be something that is emitted into the air and can affect welfare - specifically air quality, water quality or climate. You have to be a medieval theologian to argue that this does not include CO2. - gavin]

    Comment by Rod B — 24 Jun 2008 @ 10:26 AM

  262. Hello again,

    Here’s a link to the Court’s decision. Although it’s lengthy, the decision is summarized in the first few pages.

    There are probably things we could debate but this sentence says a lot to me: “Because greenhouse gases fit well within the Act’s capacious definition of ‘air pollutant’, EPA has statutory authority to regulate emission of such gases from new motor vehicles.” The precedent is set. The Court rejected the EPA’s argument that carbon dioxide is not a pollutant, an argument based on its not being a traditional i.e. toxic air pollutant in the usual sense. The Court also rejected the EPA’s argument that this is not a good time to regulate GHGs.

    Can EPA seriously claim that they need pre-proposal comments because they don’t know what to do? It’s an invitation to those with the time, money and awareness to come forward and write the regs. John/Jane Doe aren’t going to respond to this request, yet they are still entitled to EPA’s protection and oversight. EPA can hire anyone to compile comments, but it’s my understanding that the agency employs scientists and engineers for a purpose.

    This unusual regulatory procedure also slows the process down considerably. I’m sure everyone here is aware of Dr. Hansen’s most recent congressional testimony. He works tirelessly to bring this issue to policymakers at great personal cost and his tone becomes more urgent with each appearance.

    Laine

    http://www.supremecourtus.gov/opinions/06pdf/05-1120.pdf

    Comment by Laine V — 24 Jun 2008 @ 4:56 PM

  263. Laine V (#253) There were no proposed CO2 regulations in 2000. The EPA could do that only after they made a formal finding that CO2 fit the definition of pollutant under the Clean Air Act (CAA). Considering that once a substance is listed as a pollutant the CAA requires it to be regulated, there would have been lots of attention and news stories.

    If there was anything on the EPA site, it was probably just a page with information about the effects of greenhouse gases and discussion of possible regulations. Anything like a proposed regulation would be recorded in the Federal Register.

    #261: In the Supreme Court case, the issue was not the question of CO2 meeting the definition of pollutant under the Clean Air Act. As Gavin correctly notes, the language of the CAA requires CO2 to be listed as a pollutant.

    The EPA and its allies in this lawsuit (auto makers, conservative political groups, et al.) never tried to argue that CO2 was not a pollutant, because that position had no basis and would never work. The EPA argued that there were other considerations that trumped the CAA, but this did not get much traction. The EPA also argued that, for technical reasons, the states and their allies (cities, enviro groups, et al.) did not have the right to challenge the EPA’s decision in court. This argument did have some validity, enough to get a strong dissent, but not enough for the court to allow the EPA’s decision to stand.

    Comment by Joseph O'Sullivan — 24 Jun 2008 @ 6:41 PM

  264. Gavin, Neither am I a lawyer (though might be a “wannabe” ;-) ), but I recall a rule of lawyers that says the legal answer to any question is, “Maybe yes. Maybe no.”

    I’ll try to respond, but briefly, so as not to totally mess up this thread. Is your reference an explicit implication that stuff like CO2 is something Congress had in mind for possible regulation by the EPA? Or is it just a broad, general, almost throw-away statement similar to, ‘the purpose of the EPA is to regulate stuff going into the air and water’? How might it relate to an earlier passage (Sec. 112, part b) that reads (paraphrasing, with my emphasis), “The Administrator will review the [very long specific] list and….add pollutants which present through inhalation or other routes of exposure a threat of adverse human health effects (including….carcinogenic, mutagenic, teratogenic, neurotoxic, cause reproductive dysfunction, or are acutely or chronically toxic)….” Or the definition: “(6) HAZARDOUS AIR POLLUTANT.— ….any air pollutant listed [as above]”?

    I can see how the answer is debatable and might go either way. But it seems relevant that, to the best of my knowledge and searching, CO2 is not stated, mentioned, implied, or hinted at anywhere in the statute, save one mention of the EPA looking at CO2 as a consultant, in conjunction with a whole subsection that directs the National Academy of Sciences and the OSTP (specifically not the EPA) to conduct a broad study of CO2, and gave them a whopping $3,000,000 to do it. That doesn’t seem to indicate even a minor focus or intent. (Though I recognize some in Congress would like it; they just did not get it into the Act.)

    Laine repeats my assertion: the Court didn’t like the way the CAA read, so it decided what it would like the Act to say.

    Sorry for all this legalese, but with Hansen wanting to criminally indict the CEO of Exxon et al., it’s starting to get significant.

    To all of you salivating over the prospect of the EPA directly regulating CO2 emissions (not to mention methane, nitrates, and water emissions coming right behind), I would respectfully suggest you be very cautious and think through what you wish for, as the Chinese say.

    Comment by Rod B — 24 Jun 2008 @ 10:56 PM

  265. Rod B, I think you are applying the strictures of constructionist constitutional law to an act of Congress. This is invalid, since Congress chose to include language sufficiently broad that it could be applied to pollutants not identified at the time the act was written. If it affects public welfare, it is a pollutant, and no amount of propaganda claiming “CO2 is fertilizer” will change that. I would contend that it’s the propaganda that is fertilizer.

    Comment by Ray Ladbury — 25 Jun 2008 @ 7:07 AM

  266. Joseph (#263)
    Yes, I know the procedure. There were no proposed regulations. When I made my request for information, I described it as an outline, for lack of a better word. My point was to challenge the notion that the procedure you correctly describe cannot be followed in this case because EPA needs prior public input before writing regs. The document I vaguely recalled would have been useful in this regard, but it is certainly not necessary to support my premise. In any case, I appreciate all responses and recognize that this is primarily a science page, and a valuable one. I wasn’t attempting to hijack the thread with politics. Cheers.

    Comment by Laine V — 25 Jun 2008 @ 8:01 AM

  267. Laine V, try the Warming Law Blog. The lawyers there have insightful commentary on the legal/political issues. They have a recent post that might help with your request for information.
    http://warminglaw.typepad.com/my_weblog/2008/05/insulted-epa-fi.html

    #265 (Ray Ladbury), #264 (Rod B.) et al
    The Clean Air Act contains broad powers to regulate substances emitted into the air and gives the EPA little discretion about whether to regulate once certain conditions are met. A plain reading of the entire law demonstrates this. It’s a long-established legal principle, very dear to conservative judges and legal scholars, that when a statute is clear, discussions of Congress’ intent or the wisdom of the statute are irrelevant.

    In Mass v. EPA the only real question was whether the plaintiffs had the right to go to court to overturn the EPA’s decision. The “sleight of hand” mentioned by Justice Roberts referred to the majority’s expansion of the right of state governments to challenge agency decisions in court.

    This is my last comment on this subject. I’ll claim res judicata on any further discussion.
    http://en.wikipedia.org/wiki/Res_judicata

    Comment by Joseph O'Sullivan — 25 Jun 2008 @ 1:26 PM
