I am an english professor teaching a course on global warming using some of the most accessible material I can find. anyway, we are reading crichton’s book and the main paper I wish them to write on crichton is a response to the science claims in the book. I’ve found much useful on this site for just this purpose so I want to thank you.
so that’s a comment and here’s a question: skimming through essex and mckitrick, I recall that they claim that global temperature doesn’t exist. claims like this for ordinary people are so stunning as to cause temporary paralysis.
I’d like your take on this sort of comment (assuming I’m correct about e and m). surely, it involves the confusion of local and global you have mentioned many times. In fact, I think they verge on saying there is no climate, only weather; no global, only local and regional. playing amateur sociologist of knowledge, I think some of this denial of the global comes from their general philosophical commitment to free markets, assumptions which tend to view only the individual as real, making larger institutions or “systems” metaphysical (an illusory abstraction).
but if they’re going to say something like this, I’m wondering if they are consistent. do they ever chime in to admit that there is warming but that it’s caused by natural variability? what DO they say about increasing greenhouse gases in the atmosphere? can they even cite richard lindzen as an ally since Lindzen believes in the “atmosphere,” and it seems plausible to me that e and m have no right to this concept. if there is no global temperature, there is no global atmosphere either–merely atmospheres (local).
These skeptics’ debunking of “the global” as a “reification” (I think they actually use this term that used to be reserved for marxists) strikes me as similar to creationists denying “methodological naturalism.” It is a “debunking” of something so fundamental that it makes one’s jaw drop–as I noted above, using paralysis rather than drooling as a metaphor.
anyway, I rarely see responses to this specific claim of the skeptics (which I’ve only actually seen in essex et al). is it because it’s so stupid? or because it’s a special case of the local/global mistake?
[Response: I'm not sure the E&M claim makes any sense. Global sfc t obviously does exist. The only way to make sense of it is to interpret them as saying that sfc t is not conserved, unlike say heat content (which isn't conserved either, but changes in it represent energy flows in and out). The problem is that for most purposes (fluxes of heat into the oceans, and hence ocean warming and hence sea level rise; or biosphere responses) what you care about *is* the surface temperature. Which is why everyone pays attention to it. Plus it's best-measured. If you're looking for more debunking of skeptic nonsense, then you can also try this and this - William]
I’d be interested to see your line-by-line critique of the M&M replies. Perhaps this will help put some of these long-contested issues to rest?
[Response: Right now we are just observing - it is probably for the commenters to respond to the replies should they feel that is necessary. There are still a number of papers going through the works so we will probably wait for the dust to settle... -gavin]
Comment by nanny_govt_sucks — 24 Oct 2005 @ 1:54 PM
Re #2: Make sure to cover Spencer & Christy’s satellite data come-down (covered in a recent post here) and the fresh paper marking the end of Lindzen’s “iris effect” (very new so still not analyzed in any public spot I’m aware of), abstract at http://www.sciencemag.org/cgi/content/abstract/1115602v1 .
As near as I can make out, Essex and McKitrick are going back to a basic principle of thermodynamics, that temperatures are intensive quantities that cannot be added (or averaged, which comes to the same thing). For example, given a pot of water at 30 C and a bowl of water at 20 C, you can’t simply average the temperatures to find what the final temperature will be when you pour the water into the pot. The ENERGY of different parts can always be added (or averaged), and the final temperature deduced (General Chemistry I, General Physics I).
For homogeneous systems (the atmosphere, for example) there is a simple linear relationship that relates the energy of any part of the system to the temperature, for example, by volume E = V C T where V is the volume of a packet, C the specific heat per unit volume and T the temperature. This means that for the homogeneous atmosphere, you CAN add the temperatures, divide by the total volume and get a meaningful average temperature. That average is simply related to the average energy of the system as long as the specific heat is constant over the range of the temperature variation. This is the case for the atmosphere.
If one were being pedantic, one would say that the average surface temperature is linearly proportional to the average energy content of the atmosphere at the surface and that changes in the average surface temperature proportionally track changes in the average energy content of the atmosphere at the surface. As far as I can see, E&M do not recognize this.
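Eli's bookkeeping can be sketched in a few lines (the masses and temperatures here are invented for illustration):

```python
# Pour a bowl of water (0.5 kg at 20 C) into a pot of water (2.0 kg at 30 C).
# Energies add; temperatures don't.  With a common specific heat c, the final
# temperature is the mass-weighted mean of the two, not the plain average.
c = 4186.0                      # J/(kg K), specific heat of liquid water
m_pot, t_pot = 2.0, 30.0        # kg, C
m_bowl, t_bowl = 0.5, 20.0      # kg, C

# Total energy (relative to 0 C) divided by total heat capacity:
t_final = (c * m_pot * t_pot + c * m_bowl * t_bowl) / (c * (m_pot + m_bowl))
t_naive = (t_pot + t_bowl) / 2  # what "averaging the temperatures" would give

print(t_final)  # 28.0 -- the mass-weighted result
print(t_naive)  # 25.0 -- wrong, because temperature is intensive
```

Note that c cancels here only because both bodies are water; with two different substances each weight would be m·c, which is exactly the point about adding energies rather than temperatures.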
Steve, there seems to be some discrepancy with the Chen ea. findings, who used the same satellite data (HIRS) for the upper troposphere humidity (UTH). They found a drop of 0.2% of UTH near the equator and 0.15% in the subtropics over the period 1985-1994. In the same period, cloud cover decreased by 1.7% and 0.33% for the same regions and sea surface temperature increased by 0.083 C. This is reflected in radiation trends of 5 W/m2 more IR radiation back to space and 2 W/m2 less reflection of sunlight for the period 1985-2000, which results in a 3 W/m2 net loss to space. According to Chen ea., the difference is not due to changes in clear sky radiation (too small, which may point to small differences in the water vapour column), but in cloud cover.
A significant difference further is that the radiation balance (see Wielicki, fig. 2) shows very large changes for both the 1992 Pinatubo and the 1998 El Nino. The latter can be seen in the water vapour trend of Soden ea., but the Pinatubo cooling (0.5 C with a corresponding drop in water vapour) seems to be underestimated.
Lindzen may be wrong by looking at too small a scale (tropical storms), but the overall tropical radiation balance (30N-30S) seems to respond to a warmer climate by emitting more heat to space…
Comment by Yelling in the Fog — 25 Oct 2005 @ 10:04 AM
thanks to both cody and eli for helpful comments.
tim lambert says the following in response to e and m:
“In their briefing, Essex and McKitrick claim that physics provides no basis for defining average temperature and:
‘In the absence of physical guidance, any rule for averaging temperature is as good as any other. The folks who do the averaging happen to use the arithmetic mean over the field with specific sets of weights, rather than, say, the geometric mean or any other. But this is mere convention.’
Physics does, in fact, provide a basis for defining average temperature. Just connect the two systems that you want to average by a conductor. Heat will flow from the hotter system to the colder one until the temperatures are equalized. The final temperature is the average. That average will be a weighted arithmetic mean of the original temperatures. Which is why the folks doing the averaging use weighted arithmetic means rather than the geometric mean.”
Eli: is this consistent with the answer you gave me? forgive my ignorance but as I tell my students, the secret to learning is to get over the fear of looking stupid. I have some questions about your pot and bowl example and so would like your email if you wouldn’t mind: but is this example a good analogy for the atmosphere? Is the relation between regional temperatures analogous to water temps in pots and bowls? (I realize the analogy was meant for another purpose)
on the lambert website, while others raise the thermodynamic explanation of e and m’s claim, lambert himself focuses on e and m’s claim that statistical techniques for interpreting data can give you incompatible results, the same data interpreted as so much warming in one framework and so much cooling in the other.
This appears to be a skeptical argument about the use of statistics.
I assume this website is for the general public. are there prerequisites? how ignorant are we permitted to be (in my view, the answer should be “pretty damn ignorant” if you want the website to make sense to ordinary people). can people recommend a “global warming for dummies” type thing that can teach some of the rudimentary physics? much of the problem here is science illiteracy.
gm (which could be greg meyerson or our proverbial grandmother)
[Response: I don't think the E&M stuff is very interesting, but I seem to have written a post on it (well actually, on why no-one is interested in it :-), and since this is self-promotion day, it's here - William]
I believe that what Tim was trying to show was that you could manipulate the statistics but that if handled properly they would show warming. For example, if I recall correctly, Tim found that one method used by M was to “pad” the data with zeros where no data existed. This would obviously introduce a false cooling into the data.
so then m and e’s mistake about global temp is that they treat the atmosphere as something other than a homogeneous system? meaning that–in terms of your example–they treat homogeneous systems as bowls and pots (filled with water)?
The hockey stick discussion, and its elegant exposition recently notwithstanding, I am at a loss why social scientists of a variety familiar to most universities have not addressed the implications of the climate record upon human (mostly European) behaviour since 1000. With some exceptions they seem innocent of the developments in reconstructing past climates. The final graph of Gavin et al., taken from the newer paper of Moebius et al. which they cite, is the single most tantalizing physical record of the Northern Hemisphere I have ever stumbled on. Anybody out there with a social science orientation wish to address this?
writing from Columbus (edward lanwermeyer)
Comment by edward lanwermeyer — 25 Oct 2005 @ 1:07 PM
Re #16 (GM): Bowls and pots of differing and unknowable volumes, for that matter.
Re #11 (FE): No one questions that hotter tropics will result in increased heat flow into space; the issue is how much. Ferdinand, I started having a look at the relevant papers, and noticed a couple of things: Soden was a co-author of the 2002 Wielicki paper you cite, in 2002 Soden was lead author of yet another paper in Science, this one focused on the effects of the Pinatubo eruption, Wielicki and Wong (also an author of the 2002 Wielicki paper) were in turn co-authors of a 2003 IEEE paper debunking the iris effect, and… how in the world can so many scientists, many of them frequent collaborators, screw up something this fundamental over such a long period of time and have most of it get through peer review in the same prestigious publication? I’m just speculating here, but assuming for the sake of argument that there is some contradiction between the 2002 Science papers and the new Soden Science paper, did it occur to you that maybe the science has advanced a bit in the 3-1/2 years between their publication dates? I think you need to do some more homework. (I find Google Scholar very useful for making sure I locate all the relevant papers.)
If it’s ice hockey, the game might get called off due to thaw.
Comment by Lynn Vincentnathan — 25 Oct 2005 @ 4:40 PM
Mostly for Greg Meyerson: Our different examples are exactly the same. However, the final temperature of the systems brought into contact is not an “average” temperature. Formally, this equilibrium is the temperature at which the entropy of the system + surroundings is a maximum :(. Intuitively, we know that this is the temperature at which both bodies (pot + water) are at the same temperature, and this temperature is rather simple to calculate by equating the heat flow from one body to the other under the constraint that they reach the same final temperature.
OTOH, if the atmosphere is uniform, then you have a much simpler case where the energy of every unit volume is linearly proportional to the temperature in Kelvin, and this, like energy can then be added or averaged as you will.
Which brings me to the next point. The errors McKitrick makes about averages are much worse than Tim Lambert describes. Because the atmosphere is homogeneous, any average of absolute temperature is meaningful. However, since the linear average is meaningful, one can make the average using any linear temperature scale with an arbitrary zero (you could use Celsius, Fahrenheit, Rankine, or what you will), since you can always write T(Rabett) = T(K) + To(Rabett) for every temperature.
One cannot be so cavalier with other types of averages. For temperature scales where there are negative values in the data set, root mean square will not produce a meaningful result because negative and positive values of the same magnitude contribute the same positive amount to the average. If negative values are found in the data set the sign of the geometric average depends only on whether the number of negative values is even or odd. If there are zeros in the data set, the geometric average is ALWAYS zero and so on.
If you want to use these other sorts of averages, you MUST use a zero that is lower than any element of the data set, e.g. absolute zero, so there are no negative or zero values of temperature in the data set. Lambert shows that in those cases the various averages, although not identical are close. McKitrick does not do this. He does not even recognize the problem. His examples are not even wrong.
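Eli's scale argument is easy to check numerically (a sketch; the sample temperatures are invented, and Python's standard statistics module is used):

```python
import math
from statistics import geometric_mean, mean

temps_c = [-10.0, 0.0, 5.0, 25.0]        # Celsius data with a zero and a negative
temps_k = [t + 273.15 for t in temps_c]  # the same data on the absolute scale

# The arithmetic mean survives a change of zero point: the Kelvin and Celsius
# means differ by exactly the 273.15 offset, so both tell the same story.
assert math.isclose(mean(temps_k) - mean(temps_c), 273.15)

# The geometric mean does not: it is only meaningful once every value is
# positive, which is why it must be taken on a scale like Kelvin.
geo_k = geometric_mean(temps_k)
print(round(mean(temps_k), 2), round(geo_k, 2))  # close to each other in Kelvin

try:
    geometric_mean(temps_c)  # a zero and a negative value: no meaningful result
except Exception as err:
    print("geometric mean fails in Celsius:", type(err).__name__)
```

In Kelvin the arithmetic and geometric means come out within a fraction of a degree of each other, as Lambert says; in Celsius the geometric mean is simply undefined for this data.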
[moderator:Yes, you did. It was eliminated on the basis of our comment policy (see our discussion of "signal vs. noise"). Postings which seek to muddy the issues, rather than clarify them, are often eliminated, especially when the tone seems inappropriate. This comment was screened in to make a point. See below.]
There are 4 NH temperature (actual measurements) records which go back at least 200 years, i.e. Eastern US, Sweden, Central England and Armagh Observatory.
[moderator:This is just wrong. There are 9 that go back to 1777, 4 that go back to 1753. There are many reviews of the data by Jones, Bradley, and others. Please acquaint yourself with the facts before posting. Sometimes we will correct factual errors in comments, sometimes we will simply screen out the comments altogether if the net effect is to disinform our readers, as we don't have the time to correct every fallacy.]
None of these records exhibit any behaviour remotely similar to the hockey stick shape.
[moderator:This too is just wrong. In fact, the available instrumental temperature records, when formed into a hemispheric mean, closely match the reconstructions. See the review paper on "Climate Over Past Millennia" by Jones and Mann (2004) (figures 2 and 3). In fact, it is clear that the instrumental estimates run a bit warm relative to all of the proxy reconstructions (i.e., they suggest even less of a "little ice age" in the 18th and 19th centuries). This is likely due to problems with thermometer sun exposure, which is particularly pronounced during the summer at high latitudes, prior to the modern introduction of "Stevenson screens". There is much literature on this. You can start out with the review paper cited. ]
Don’t you think this is strange?
[moderator:What appears strange, actually, is your willingness to post a comment without familiarizing yourself with the basic facts first. In the future, please make sure to familiarize yourself with the facts before posting here. Thank you.]
Re #8, let me clarify the difference between scientists and environmentalists. I assure you the hosts of this site are definitely scientists, though I heard of a Chicago scientist buying a SunFrost fridge (1/10 the electricity of other fridges), so some might also be environmentalists in their private lives.
SCIENTISTS follow “THE SCIENTIFIC MODEL” of avoiding false positives (making a claim when it is false); that is, they do not make claims unless there is high confidence that they are right (usually 95% certainty), so as to protect their reputations; otherwise no one will believe them again (like the boy who cried wolf).
ENVIRONMENTALISTS, those concerned about reducing harm to people and the earth, follow “THE MEDICAL MODEL” of avoiding false negatives (acting as if there is no problem, when indeed there is); that is, they do not need high certainty of a problem to start addressing it. It’s unimaginable for a doctor to tell a patient that there is only 94% certainty that the lump is cancerous, so they won’t operate. Some environmentalists even follow the precautionary principle of disallowing actions, unless they are proven safe –like not emitting GHGs beyond needs and simple pleasures, unless we are 99% certain AGW is not happening (the higher level of certainty is due to the risks being so grave).
I’m an environmentalist. I already figured AGW was happening back in 1990, 5 years before it reached 95% scientific certainty in 1995 (as did Pope John Paul II, who told us it is everyone’s responsibility to solve), and I started reducing my GHGs in cost-effective ways and am saving $$hundreds per year without lowering my living standard. Since evidence of GW has continued to pour in and become stronger & stronger (and since I’m saving heaps of money by abating it), I don’t think any contrarian or scientist can convince me to start emitting more GHGs, even if it becomes 100% certain AGW is not happening.
The HOCKEY STICK was a done deal for me back in 1990, well before the first articles came out on it. Regular GW is no longer my greatest concern. I’m on to worrying about RUNAWAY GW, or the VENUS EFFECT, our human-induced warming causing nature to start emitting more & more GHGs (CO2, CH4) in positive feedback fashion, perhaps leading to near extinction of life on earth, as happened from GW during the end-Permian extinction, and the other 4 great extinctions. I’m convinced we’re already into or on the brink of the 6TH MASS EXTINCTION, though we may not reach scientific certainty until it’s a done deal. It’s possible we may not be able to reverse this, even with big GHG cuts, but I have to keep doing my best with the hope it’s not too late – and would keep doing my best, even if scientists finally do tell me it’s too late (hoping they’re wrong). That’s what it means to be an environmentalist (or Christian, or Buddhist, etc).
Comment by Lynn Vincentnathan — 26 Oct 2005 @ 12:24 PM
#8 was hardly deserving of an answer, but I’d draw a different distinction. Science is about figuring out how things work, and the usual test of understanding and usefulness is prediction. Success at prediction is enhanced by maximizing the number of different hypotheses (models) you can generate and test against numerical data and other available information. 95% confidence levels and the like are used as a sieve to reject the bad models, but that doesn’t mean there’s a 95% chance of being right. The curse of course is that 5% of the time you’re wrong randomly, and those results are disproportionately interesting, and thus likelier to be published. However, time eventually resolves such problems by providing more data to test out-of-sample predictive capability. It would be great if there were some way to update the proxies and test temperature reconstructions; that would really lay to rest (or not) all the accusations of spurious correlation etc.
Environmentalism, on the other hand, is more about values – do you care if polar bears go extinct? Do you prefer the mall or the forest? etc. Science and environmentalism may get intertwined if, for example, you prefer the mall but predict that environmental side effects might compromise the ability of industrial society to build more of them.
I see no real problem if many scientists are also environmentalists. If they were systematically biasing their own results through wishful thinking or more sinister means, a lot more of GW theory would have been refuted in the last 15 years.
As an aside, SunFrosts are perhaps 1/10 the energy of a lousy old fridge or perhaps a big commercial fridge, but the best ordinary top freezer consumer fridges are now very good; a SunFrost might use 2/3 the energy. It’s rather striking to me how many people will buy a $6000 SubZero or Northland of dubious efficiency without batting an eye, and how few will shell out less than half that for a SunFrost. Shows how far we are from actually confronting the issue.
Not wanting to rain on your doom and gloom. But, surely the fact that there is still life on earth despite previous huge changes in our atmospheric composition suggests that the Venus effect is a bad science fiction scenario rather than a serious possibility. Indeed, let’s just suppose that we wipe ourselves out with an event of similar magnitude to the end-Permian extinction. In your apocalyptic scenario that removes the cause of the atmospheric imbalance and the earth can go back to some more hospitable equilibrium – no? The fact that the earth has been capable of sustaining life for the past billion or so years suggests to me that you might be worrying a little too much.
But, surely the fact that there is still life on earth despite previous huge changes in our atmospheric composition suggests that the Venus effect is a bad science fiction scenario rather than a serious possibility.
I’m not really taking sides here, but pointing out that for the underlying support of the “don’t worry” argumentation here to be effective, IMHO, the premise must be that the previous 5 extinctions were caused by sentient beings that knew and debated the consequences of their actions, thus everything turned out OK ’cause there’s still sentient beings around to choose their fate. My sniff test for this argument causes my head to jerk back and my nose to scrunch up.
Anyway, the post topic was about Hockey Sticks and debating attribution, and the comments got around to noticing that debating consequences is more fruitful than attribution. I agree.
Fer chrissake, getting bogged down in the attribution is a subset of the larger debate on action. Let us keep that in mind when we calculate our personal energy expenditures.
I would like to extend congratulations from a general reader who has followed this blog, and the evolution of climate science, for a few years. The success of the Real Climate group in the brief life of the blog has been steady and swift. Now the blog has become authoritative, while it continues to offer an absolutely fascinating and truly educational read, delivered with style. One pictures the coach full of the members of the Pickwick Club careening down the road towards a noisy dinner with lots of good talk afterward. And now here is the Wall Street Journal giving prominence to two more supporting accounts in the historical review of climate summed up, figuratively, by the Hockey Stick. And is that the dean of the lot, von Storch, offering what looks like a substantial climb down from a rather rude comment only a year ago. And who do we see behind the parade carrying broom and dustpan. Can it be Don Quixote McKitrick and Sancho McIntyre. Clearly the burden of the work has not proven excessive, so I hope one can look forward to a lot more of the same education in science the general reader receives from the scholars who contribute to Real Climate. Even those who oppose you have enriched the blog, and it just doesn’t get any better than that.
Steve, if the laws of thermodynamics still hold, then a net extra loss of 3 W/m2 (top of atmosphere) should give more cooling (or less warming, depending on the absolute values). In the same period, the direct increase of GHGs has given some 0.7 W/m2 more heat retention (at the surface). The difference may be a result of (observed) faster air circulation over the tropics. The more recent papers touch on related topics, but it is not clear to me (yet) where the differences are, as e.g. the last paper subtracts signals of different satellites, while the first uses only one signal. Need a few days more to find it out…
The arithmetic mean (or average) of a group of n numbers is just their sum divided by n. So the arithmetic mean of 6, 10, and 23 is 39/3 = 13.
The geometric mean of a group of n numbers is the nth root of the product of those numbers. So the geometric mean of 5, 45, and 120 is the cube root of 5 × 45 × 120, or 30.
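Both definitions above are two lines of code to verify (using Python's standard statistics module; geometric_mean is available from Python 3.8 on):

```python
from statistics import geometric_mean, mean

print(mean([6, 10, 23]))             # 13, i.e. 39/3
print(geometric_mean([5, 45, 120]))  # ~30.0, the cube root of 27000
```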
hi william: I got the above answer to my question about arithmetic and geometric means–equivalent to yours, but you have to admit, the above is a lot simpler for “dummies” (aka most of the population): 39/3 = 13, versus (1/n)(x1 + x2 + … + xn).
best imo is using both types of examples.
as I mentioned, I am an english professor (critical theory, am. lit and advanced comp) teaching at an hbcu (historically black college or univ); I am teaching a comp course where all our readings are on global warming. this course has to educate–and I purposely picked subject matter on which I am pretty much like they are, though with better research habits.
most of them are afraid of science and know little math. I suspect the american population isn’t much different.
this issue, like many others, is too important to be left to the experts and too important for the experts to be left out. so the experts have an enormous responsibility, especially given the declining state of science ed and ed in general in public schools. but also given the disciplinary division of labor.
don’t you guys want at some point students from high schools and universities to be able to ask questions of the bloggers and get answers they really understand?
at my university, we are trying to set up a program for first two years of college that is highly interdisciplinary and focused on integrating science education into curriculum as whole. writing courses are great for this and it would be good in such courses–those housed in humanities– to know enough to be able to do more than analyze the rhetorics of science in texts and films of pop culture (lone heroic scientist in day after tomorrow; lone heroic scientist in state of fear + naive liberal lawyers).
I realize the whole issue of audience is very difficult to solve.
and great job.
[Response: Audience: fair point. I would say, that we are writing for people who won't be too scared by 1/n(x1+...+xn). That means that someone else will have to translate for people who prefer "the arithmetic mean (or average) of a group of n numbers is just their sum divided by n. So the arithmetic mean of 6, 10, and 23 is 39/3 = 13". I agree with you: there will indeed be people who prefer the latter. Maybe you will have to do the translation! - William]
[moderator: some argumentative language has been removed. lets watch the ad hominem please. We don't want flame wars on this site!]
And for the millions who will die as AGW gets worse, and perhaps the billions who may die IF runaway GW kicks in & leads to the 6th mass extinction. I think we must not let this happen. Failure is not an option.
And it’s not just millions or billions I’m concerned about. Each single life is precious, and it’s not good to be in the business of killing people. That’s my main point.
I do agree that Earth is not Venus – some scientists have already told me how much they hate the label “Venus effect,” but I find it informative, simply because it gives some idea about the runaway global warming that did happen 5 times on Earth (which later, obviously, stabilized back to livable conditions).
[moderator: some argumentative language has been removed.]
You probably didn’t have the wonderful 10th grade education in ecology that I had, so I can’t blame you if you fail to understand that all life is connected (along with the inorganic world), and that killing off non-human species could have negative effects on humans – esp. keystone species, food crops, livestock…
I also don’t like false dilemmas like “it’s either them animals or us people.” I prefer to have my cake & eat it too. Now get busy finding out how we can save the earth (incl. the polar bears) and still live somewhat high on the hog. I want solutions, not “it’s impossible” rhetoric.
And thanks to the scientists for bringing the problem of AGW & the hockey stick to light, so we can be forewarned and do something about it.
Comment by Lynn Vincentnathan — 27 Oct 2005 @ 12:05 PM
The global mean temperature is trivially easily defined, and is usually taken to be an arithmetic mean.
You could make a case that the physically important quantity is not temperature, but its fourth power. However, although this would weigh tropical regions more than polar ones, I think it’s fair to say that the global mean surface value of T^4 is also increasing far beyond background variability.
I can’t imagine why you would want a geometric mean, but to do that on a continuous field you would just get the antilogarithm of the average logarithm. Again it is surely increasing above background variability.
If someone can come up with a definition of temperature that has some objective justification and under which definition it isn’t rapidly climbing, that would be interesting, but of course no one has done such a thing. The best you could do would be to say something like “average global temperature is mostly the average temperature over the North Atlantic”, which is hardly global, is it?
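To make this concrete, here is a sketch with an invented five-band "global field" (all numbers are made up for illustration): on the absolute scale, the arithmetic mean, the fourth-root of the mean of T^4, and the geometric mean all land within a degree or two of each other.

```python
import math

# Invented Kelvin temperatures for five latitude bands, with invented area
# weights that sum to one.  Any resemblance to the real field is accidental.
temps_k = [248.0, 268.0, 288.0, 298.0, 300.0]
weights = [0.05, 0.15, 0.25, 0.30, 0.25]

arith   = sum(w * t for w, t in zip(weights, temps_k))
quartic = sum(w * t**4 for w, t in zip(weights, temps_k)) ** 0.25
geom    = math.exp(sum(w * math.log(t) for w, t in zip(weights, temps_k)))

# Power-mean inequality: quartic >= arithmetic >= geometric.  Because T only
# varies by ~10% around its mean on the Kelvin scale, the spread is tiny.
print(round(arith, 1), round(quartic, 1), round(geom, 1))  # ~289.0 290.0 288.6
```

Whichever of these definitions you pick, a warming field warms under all of them, which is the point: there is no reasonable choice of average under which the trend disappears.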
So this “average temperature” complaint is just another red herring.
The world is warming up, pretty much as was predicted in 1990 or so. We are more confident than ever that the globe will warm up more, and faster, for a few decades beyond the time when serious changes in anthropogenic net emissions start. After that the prediction problem gets harder, but the more we rock the boat, the bigger the risk into the far future.
Sorry for my hasty (worst deleted) response. I agree with nearly everything written in #27. I have too often heard people misconstrue environmentalism as mainly to do with saving the polar-bears, or baby seals, etc. A red flag went up. It’s usually the anti-abortionists, who then go on to say environmentalists don’t care about human life. Or the economically concerned, who fear environmental actions will harm economic sufficiency (jobs) or wealth — as if the environment were totally unnecessary to human life & well-being.
There are, of course, many branches of environmentalism from conservative “not-in-my-backyard” to radical “people are a disease on the planet.” (Contrarians will sometimes pick the latter with which to smear all – so the label “environmentalist” has become pejorative nowadays.) Most environmentalists I know are between those extremes — concerned about people (their lives & well-being), as well as the non-human world. Most have some ecological understanding.
AGW should be of concern to nearly all types of environmentalists, and to non-environmentalists concerned about life and health, economic sufficiency, wealth, national security, and maintenance of human freedoms. It just seems to me that many of those folks (contrarians) don’t really understand how the natural + sociocultural/psychological world works and that their best interests would be served by abating, rather than arguing the details of, AGW. Of course, we need bonafide science skeptics to keep science honest; it’s the contrarians, whose motives seem to be to thwart environmental actions (and environmental science), that concern me.
Perhaps we need a lot more science to continue making the claim for AGW more and more robust, simply because many people aren’t taking AGW seriously (though I personally don’t need so much science to act, esp when those actions save money).
I wonder what it will take to get people to understand that AGW is very likely happening, that the threats may be dangerous to very dangerous (& we’re most likely already seeing the harms), and there’s even a possibility of runaway GW, that could be much more serious. I wonder if any amount of science will convince the contrarians. This is a much tougher nut to crack than I had thought 15 years ago.
BTW, I think the mall may not be bad environmentally speaking – most have public transport to them, or at least you can buy nearly everything you need there, without driving all over the place. It could be built using “green” architecture & alternative energy sources. And if you get lured into a mall (& do mainly window-shopping, like we do), then you won’t be driving around just for fun, but walking around – good for the health, too.
Comment by Lynn Vincentnathan — 27 Oct 2005 @ 5:10 PM
Can you please direct me to Jones’s data so that I can acquaint myself with the facts?
[Response: See the link provided to the Jones and Mann review paper. -mike.]
Where can I find this link? I have looked on the site, but I have not been able to find it.
[Response:The link is provided in comment #23 of this post. -mike]
Let me pick some nits with Michael Tobis’ proposal. An arithmetic average for temperatures defined over some region of the atmosphere, or any homogeneous system, is not an arbitrary choice. Remember that for a gas under atmospheric conditions, temperature and internal energy are linearly related**. Thus, if you want to talk about the total energy in a volume of air, you can use E = (5/2)nRT, where n is the number of moles of air in that volume.
Since you can trivially define the total energy in the atmosphere, or in some portion of it (say the first meter above the surface), and the amount of air in that layer, you can find the total and average energies, and thus you can sensibly define the average temperature.
This is much simpler than what Michael proposes as an alternative, if for no other reason than that the emission is balanced by the absorption, which is not proportional to T^4. Further, Michael’s proposal REQUIRES the use of absolute temperature, something that McKitrick DID NOT DO. My points were that a. if you use the Kelvin scale, there will not be much difference in the result averaged any which way, and b. an arithmetic average has a simple justification based on thermodynamics.
It ain’t rocket science, folks; that is a two-semester course.
**excepting water vapor, but its contribution to the total energy of the atmosphere is small. Note that this is NOT the case for energy transport. Two different things.
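The thermodynamic point above can be made concrete with a small sketch. Everything here is illustrative (the parcel sizes and temperatures are invented), but it shows why, for an ideal diatomic gas with internal energy E = (5/2)nRT, the temperature implied by the total energy is exactly the mole-weighted arithmetic mean:

```python
import numpy as np

# For an ideal diatomic gas, internal energy per mole is U = (5/2) R T,
# so the total energy of a collection of air parcels is linear in T.
R = 8.314  # J/(mol K)

# Invented example: three air parcels with moles n_i and temperatures T_i (K)
n = np.array([1.0, 2.0, 3.0])        # moles in each parcel
T = np.array([250.0, 270.0, 290.0])  # Kelvin

E_total = np.sum(2.5 * R * n * T)      # total internal energy, J
T_avg = E_total / (2.5 * R * n.sum())  # temperature implied by total energy

# Because E is linear in T, this "energy-derived" average equals the
# mole-weighted arithmetic mean of the parcel temperatures exactly:
assert np.isclose(T_avg, np.average(T, weights=n))
print(T_avg)
```

This is the sense in which an arithmetic (mass- or mole-weighted) average temperature has a direct thermodynamic justification, rather than being an arbitrary statistic.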
von Storch and Zorita did not replicate the MBH98 methodology in key respects. In particular, their paper indicates that they did principal components on the correlation matrix of short-centered data, whereas MBH98 did singular value decomposition (SVD) on the short-centered data matrix itself. There’s a big difference. The VZ procedure simply does not generate hockey stick shaped PC1s in the way that the MBH98 procedure does and cannot be used to test the impact of the MBH98 methodology.
[Response:Are you sure that they did not estimate the eigenvectors/values, rather than applying an SVD to the correlation matrix? Usually, PCA involves either applying an SVD to a matrix of the data or an eigenvalue analysis to the co-variance/correlation matrix. To apply SVD to a correlation matrix sounds a bit odd to me - but then I must admit I haven't read the paper by VZ. I do, however, believe that von Storch is very competent when it comes to statistics and linear algebra, so if VZ did so (did they?), then there probably is a good reason for it... - rasmus]
Secondly, VZ endowed their pseudoproxies with much stronger correlations to gridcell temperature than exist in the controversial North American tree ring network. They assumed a minimum correlation of 0.3, whereas sites in the 15th century MBH98 tree ring network have an average correlation to gridcell temperature of -0.08 (with a relatively strong average correlation to precipitation).
Thirdly, the VZ minimum correlation assumption effectively excluded the study of the potential impact of proxies affected by nonclimatic trends (such as, arguably, CO2 fertilization of bristlecones).
The MBH98 PC methodology is actually not a “principal components” methodology as defined by Preisendorfer, which requires the use of time-centered data and explicitly excludes de-centered data (page 26).
VZ did not analyze the impact of the MBH98 method on MBH98 proxies and, since their replication of MBH98 methods was flawed, does not show that problems with MBH98 PC methodology did not matter.
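For readers trying to follow the centering dispute, here is a minimal toy sketch of what “short-centering” means: subtracting the mean of only a final sub-period (a stand-in for a calibration interval) rather than the full-record mean, before taking the SVD. All parameters (series count, AR(1) coefficient, record and calibration lengths) are invented for illustration; this is emphatically not a replication of MBH98, MM05, or VZ:

```python
import numpy as np

rng = np.random.default_rng(1)
n_years, n_series, calib = 580, 50, 79  # toy stand-ins, e.g. 1400-1980 with a 79-yr calibration

# Red noise (AR(1)) pseudoproxies with no climate signal at all
proxies = np.zeros((n_years, n_series))
for t in range(1, n_years):
    proxies[t] = 0.9 * proxies[t - 1] + rng.normal(0, 1, n_series)

def pc1(data, short=False):
    """First principal component via SVD of the centered data matrix."""
    # short=True removes only the calibration-period mean ("short-centering");
    # short=False is conventional full centering.
    mean = data[-calib:].mean(axis=0) if short else data.mean(axis=0)
    u, s, vt = np.linalg.svd(data - mean, full_matrices=False)
    return u[:, 0] * s[0]

pc1_short = pc1(proxies, short=True)
pc1_full = pc1(proxies, short=False)

def hsi(pc):
    """Crude 'hockey stick index': calibration-period mean offset in std-dev units."""
    return abs(pc[-calib:].mean() - pc.mean()) / pc.std()

# Short-centering tends to promote series whose calibration-period values
# deviate from their long-term mean, typically inflating this offset.
print(hsi(pc1_short), hsi(pc1_full))
```

A single random seed, as here, illustrates the mechanism rather than the statistics; the published arguments rest on ensembles of many such simulations.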
For the past ten years, I have been working in SPC (“Statistical Process Control”) for a major telecommunications company. (And let me state the disclaimer that my “opinions” are my own and NOT those of my employer!) Beginning in the year 2000, I have regularly created time-series X-bar charts using the following datasets:
A simple regression on both datasets yields a p-value < 0.0005 over time, indicating that global temperatures are increasing with respect to time. The same observation also holds for CO2 concentrations in the atmosphere. Together, a basic Pearson correlation coefficient yields statistically significant results, indicating that CO2 concentrations have a real statistical association with the increase in global temperatures. Granted, this raises the “correlation versus causation” issue, but, IMHO, the simplest explanation is that increasing CO2 concentrations are causing the increase in global temperatures!
Given the strong association between increasing CO2 concentrations and increasing global temperatures, how can any scientist embrace the “null hypothesis” and say that CO2 emissions are NOT correlated with the increase in global temperatures? Isn’t it generally accepted within scientific research that one cannot “prove” the null hypothesis? Or am I “missing something” here?
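As an aside for readers who want to try the trend-plus-correlation calculation described above: it can be sketched with synthetic stand-in data. The series below are invented; a real analysis would use actual temperature records (e.g. GISS/CRU) and Mauna Loa CO2:

```python
import numpy as np

# Synthetic stand-ins for the (unspecified) datasets mentioned above.
rng = np.random.default_rng(0)
years = np.arange(1958, 2005)
co2 = 315 + 1.4 * (years - 1958) + rng.normal(0, 0.5, years.size)      # ppm, toy
temp = -0.1 + 0.012 * (years - 1958) + rng.normal(0, 0.1, years.size)  # degC anomaly, toy

# Least-squares trend of temperature against time
slope, intercept = np.polyfit(years, temp, 1)

# Pearson correlation between CO2 and temperature
r = np.corrcoef(co2, temp)[0, 1]

print(f"warming trend: {slope:.4f} degC/yr, CO2-temp correlation r = {r:.2f}")
# As the comment notes, correlation alone cannot establish causation; the
# physical case for CO2 forcing rests on radiative physics, not this statistic.
```

Two strongly trending series will correlate with each other almost regardless of mechanism, which is exactly why the “correlation versus causation” caveat matters here.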
Comment by Donald E. Flood — 28 Oct 2005 @ 12:40 PM
Real Climate is political commentary. I was pointing out that environmentalism and science are not a priori connected, and that the Real Climate website is an environmentalist website run by scientists. Otherwise there would be no reason for it to exist.
[Response:I disagree. We started this because there was (is) a desire out 'there' for scientists to explain what it is they think they know and why they think they know it. We are not an 'environmentalist' site and we do not advocate specific policies. We talk about the science, and any perception that you have that this is political is false. -gavin]
Is the hockey stick receiving more attention in scientific literature because of politically motivated people (M&M) questioning it? If there were no political implications of the science and M&M did not cast doubt, would the hockey stick discussion have stopped in the scientific literature and the discussion moved on to other questions?
Re #8 and the other comments related to #8: it’s again about the political implications of the science. The difference between environmentalists and scientists is that the environmental groups are the main political advocates for regulatory action. Environmentalists are using the information that scientists find to argue that the science shows that global warming laws should be passed.
Environmentalists, particular in the US, have been the driving force in getting environmental regulations enacted. This political success has really ticked off political conservatives and their allies in affected industries. They have gone to great efforts to malign environmentalists. The usual tactic is to falsely equate environmentalists, environmentalism and their political positions with extremism. Claiming that environmentalism is like Christianity unfortunately feeds into this.
The information scientists provided, when objectively examined, supported environmentalists’ call for action. Realizing this, conservatives/industry started to malign the scientific community. A common tactic is to conflate environmentalists and scientists: environmentalism = extremism, and environmentalist = scientist, ergo science = extremism.
RE #44, I agree with nearly all you wrote. Except to point out that there are many types of environmentalists. Going beyond the conservative/extremist continuum (mentioned in #39), there is a Great Tradition/Little Tradition continuum. The national & international env orgs deal with national/international policy, laws, and cases against polluters. But the vast majority of environmentalists are the ordinary folks back home who recycle (there is a GHG component to that in the energy saved using recycled materials, esp aluminum), conserve energy, become efficient, use alt energy when available…
Of course, many of these folks also send their annual dues to the national/international orgs – so they are acting both locally & globally. This is the important point: We wouldn’t need environmental laws & regs, if everyone would act right & do the environmentally prudent things. I still have hope that the bulk of people will wake up to env problems & start solving them in their daily lives. The more that happens, the fewer laws we need.
But what made me an environmentalist? It was those Sunday school values I learned as a kid. So, religion is very much a part of environmentalism. I guess atheists into doing good & saving the earth have picked up some quasi-religious values along the way.
Sure glad my mom taught me to be a nonconformist & not follow the sheep over the cliff – otherwise I’d be ashamed to be an environmentalist. But I’m proud of it. I only ask that contrarians not throw too much rotten food at me. Why not try composting it instead.
Comment by Lynn Vincentnathan — 29 Oct 2005 @ 11:22 AM
Gavin, you note that “We talk about the science, and any perception that you have that this is political is false.”
Steve McIntyre has posted that he submitted to this thread what seems to be a completely scientific and non-political posting on this topic (see quoted post beginning “von Storch and Zorita…” at http://www.climateaudit.org/?p=419). It doesn’t seem to have been allowed through to the blog here yet. Is it still “in processing?”
[Response: Well, what Gavin said seems true to me. As for the CA post... there seems a determined attempt to personalise this by some people which I think is regrettable. The comment policy on this site is written by us all - William]
Comment by Armand MacMurray — 29 Oct 2005 @ 1:04 PM
comment i found at climate audit
“von Storch and Zorita did not replicate the MBH98 methodology in key respects. In particular, their paper indicates that they did principal components on the correlation matrix of short-centered data, whereas MBH98 did singular value decomposition (SVD) on the short-centered data matrix itself. The VZ procedure simply does not generate hockey stick shaped PC1s and cannot be used to test the impact of the MBH98 methodology.
Secondly, VZ endowed their pseudoproxies with much stronger correlations to gridcell temperature than exist in the controversial North American tree ring network. They assumed a minimum correlation of 0.3, whereas the 15th century MBH98 tree ring network has an average correlation to gridcell temperature of ~0.08 (with a relatively strong average correlation to precipitation).
[Response: I read this. It made no sense to me. The 15th C temperature is *reconstructed* from, amongst other things, tree rings. The correlation to the true temperature is unknown - William]
Thirdly, the VZ minimum correlation assumption effectively excluded the study of the potential impact of proxies affected by nonclimatic trends (such as, arguably, CO2 fertilization of bristlecones).
The MBH98 PC methodology is actually not a “principal components” methodology as defined by Preisendorfer, which requires the use of time-centered data and explicitly excludes de-centered data (page 26).
[Response: So what? - William]
VZ did not analyze the impact of the MBH98 method on MBH98 proxies and, since their replication of MBH98 methods was flawed, does not show that problems with MBH98 PC methodology did not matter.”
[Response: I have also read a number of posts on climateaudit.org, and I think that a large fraction of what it has to say is mumbo-jumbo. Take for instance the statement that since they used an annual mean value, they should use discrete wavelet analysis (post on Moberg's work). This doesn't make sense. Furthermore, it has a go at the iid-test, but without making any point - just insinuations. To my mind, one of the classic examples of how they twist the logic can be found in Are Temperature Trends affected by Economic Activity?. -rasmus]
Comment by Armand MacMurray — 29 Oct 2005 @ 4:15 PM
William, you say that “As for the CA post… there seems a determined attempt to personalise this by some people which I think is regrettable.”
As I read it, this suggests that the Steve McIntyre post I referred to was disallowed because of the author’s identity, even though it was a scientific, and not political, post. If so, it would seem useful to add that criterion to your posted comments policy. If not, could you please clarify why it was disallowed?
[Response: All comments pass through a filter. Most pass through immediately, some are caught for later assessment. This is stated plainly when comments are made. Approval depends on people seeing the comments and deciding to let them through, commenting on them or disallowing them for various reasons (as stated in the comment policy). Sometimes this takes time if people are busy or a response is deemed necessary, thus comments sometimes appear 'out of order', particularly at weekends or when we are busy. This is unfortunate but can't be helped. -gavin]
Comment by Armand MacMurray — 30 Oct 2005 @ 5:28 PM
RE #44 on politics. Seems there’s a fear that human freedoms will be curtailed if we address GW. We don’t like people dictating how much we can drive, or what car we can own. I’m with you on that one, Sanderson (if that’s one of your peeves); I’d like to buy a plug-in hybrid so I can drive almost entirely on wind power (& save $$$ & the earth), but someone with power has decided only to make them available in Europe and Japan, and not in the U.S.
I’m thinking that if people don’t address GW, and we don’t have some regs now (as an inoculation against future harm), then there may come a time when things get really bad that will either (1) lead to chaos — every man for himself in a disintegrating society & skip the women & children — or (2) extreme regulations & curtailment of freedoms, so as to distribute life-sustaining resources more equitably in a starving world. Or maybe (3) just war and more and faster destruction of the world, because people don’t really understand why they’re suffering & just lash out. The longer we postpone acting to reduce GHGs either on our own accord or by a few light regulations now, the worse the environmental/economic/political situation may become as GW cuts deeper and deeper into our subsistence base.
There is even a better way to look at it. Necessity is the mother of invention. In the past, natural obstacles/blockages have led to marvelous breakthroughs. Now artificial blockages in small doses (regulations) are leading to breakthroughs. Businesses having to meet environmental regs actually come up with solutions that save them more money & make them more efficient. For insights, read NATURAL CAPITALISM by Hawken, Lovins & Lovins (NatCap.org), or look into 3M’s 3P program (Pollution Prevention Pays).
My idea is to let the Hockey Stick push us & $$-saving carrots pull us to a more efficient & conservative, lower GHG emitting world now, than risk paying the GW piper later. We only have money (& perhaps our subsistence base) to lose by not lowering GHGs in smart ways.
About the political (power) dimension, which is only analytically, not concretely separable from our total human-in-the environment condition. We’ve all heard the dictum that “knowledge is power.” Michel Foucault (social theorist) came up with “power is knowledge.” If you read Chris Mooney’s THE REPUBLICAN WAR ON SCIENCE (he also shows how Democrats twist things, too), or even the earlier entries on this site about Barton and Inhofe, you’ll understand that the powerful people (in government, media, industry) are the ones who hold the “knowledge strings” for Americans.
Hurrah for the internet. We can wade through it and find knowledge that makes sense (like on this site), without government or indy-tied media creating a false reality for us.
Comment by Lynn Vincentnathan — 30 Oct 2005 @ 7:52 PM
William, the “so what” is that Mann did not explain how he performed his statistics adequately in his description of methods in his paper. Also, the so what is that as he is using an unapproved method, it is incumbent on him to check that his method does not produce artifacts.
[Response: We showed ages ago that the normalisation doesn't make any difference even in the real case, and that VZ shows a similar result is not surprising - gavin]
A. Your comment does not address my point that it is incumbent on experimenters to properly describe their procedures in the literature so that readers can evaluate the implications. This goes double for non-standard procedures.
B. I read your cited RC rationale for how you “disproved” the effect of abnormal normalization:
1. The shown mean is a mean of tree series. It has no weighting for area. That’s like sampling 100 Democrats and 5 Republicans and basing your guess about the outcome of a presidential election on that survey. Your sampling method is skewed. You need a fair sample, or you need to weight by area (party, in the analogy).
2. Steve does not agree with your arguments for several reasons (and you have not engaged on those).
3. Even if for some reason the “off-center PCA method” worked in this particular case, and I’m not acknowledging that it does, if it is a flawed method for general cases, you need to show how your case does not have those flaws. Or better yet, just use a normal area-weighted mean.
4. Steve has proven that the method can mine hockey sticks out of red noise, and that it magnifies the effect of a hockey-stick signal. Given that, why use such a funky method? Better yet, a completely open question: why was that off-center method picked instead of conventional Preisendorfer methods?
C. Finally, this matter is still much in debate. Steve had peer-reviewed replies to the comments accepted. You should read them and evaluate the suitability of his logic and points before engaging on this topic. You gotta read both sides…
[Response: One could go on debating this point ad nauseam, and despite the fact that it has been shown not to be important for the final reconstruction, there are apparently always new reasons why we have to keep revisiting it. Throw it out completely, it still makes no difference! So in terms of possible benefit compared to the costs, it does not seem worthwhile to continue. Instead, the scientists involved (which doesn't really include me) move on to testing new methods, incorporating more data and trying to reduce the error bars. This whole debate on the technicalities that don't matter is just a waste of time. There are much more interesting things to do. This field is not quantum mechanics or pure mathematics where there is a 'right' way to do it and everything else is wrong, there are only useful or not-so-useful approaches, and you just want the answer not to have to depend on the (relatively arbitrary) details. In this case it doesn't, so why continue? -gavin]
Please, let’s continue. You say that debate is welcomed in the policy for the blog. Don’t slam the door shut on the primary issue of controversy around. Let’s dig into the issue and the subissues.
You say that you’ve proven something. Then you say it doesn’t matter. Surely, if you’ve proven it, whether it matters is a separate question. Also, if a technicality is wrong, you should acknowledge that (regardless of whether you think its effect is minor).
[Response: Try reading what I said before; It's demonstrably irrelevant therefore it doesn't matter. 'Debate' over. Of course, there are historical precedents for longwinded irrelevant debates, 'counting angels on the heads of pins' for instance, but excuse me if I have better things to do. -gavin]
[Response: Absent a public apology regarding your remarks about my ethics, I will not be drawn into a personal discussion with you. Discussion regarding upcoming papers is best left to after they have appeared. -gavin]
Steve’s remarks were in the form of a question, not a statement. The question was based on a perceived difference between the implementation of RC’s postings policy and the stated RC posting policy.
You stated a reason for this perceived discrepancy which certainly makes sense. The filters catch certain posts and they must be reviewed. This takes time and causes posting delays. Based on the appearance of a number of posts on this thread, it appears that this delay has a finite duration. This certainly lends credence to your explanation.
What you redacted in #56 was scientific and on topic.
Steve (#42), I think what you’re saying in your works is that we have not broken through the “noise barrier” re GW. If that’s a somewhat correct assessment of your idea, then do you think we will break through in the future — say, in 50 or 100 years?
Comment by Lynn Vincentnathan — 31 Oct 2005 @ 11:30 AM
This depends on:
What el nino will do
What volcanoes will do
What the sun will do
Which emission scenario will materialise
How the sinks respond to the emissions
How the temperature responds to CO2
How the temperature responds to aerosols
Summary: we don’t know with sufficient accuracy yet.
It strikes me that the two papers, Zorita and von Storch (ZvS) and Huybers (H), treat two different micro-issues of McIntyre and McKitrick’s (MM) criticisms and also treat the macro-criticism of Mann, Bradley and Hughes’ (MBH) implementation of principal component analysis. These arguments are being randomly packed and unpacked here, sowing confusion.
It has been shown that the MBH implementation of PCA works: although it is not optimal, the problems it introduces on these particular data sets are insignificant (see H and ZvS for comments on the PCA of the specific data sets which have been challenged). The implementation MBH used appears to be widely accepted in the paleoclimate community. If one wished to make a contribution other than beating a political drum, one might look at why this non-Preisendorfer implementation was adopted. In particular, it would be useful to ask on what sort of data sets, if any, it provides a superior analysis, on what sort of data sets it fails, and how badly it fails. Further, one could explore whether there are superior implementations. That it is not Preisendorfer (1988) is not, by itself, an interesting fact to me. ZvS and H try to answer this question.
H carries out part of this program, but his paper is lacking in that it treats only a single data set. More convincing would have been the construction of multiple arbitrary data sets showing that his implementation was superior on all of them, or outlining where it worked best and where it was not an improvement. Along the way he uncovers some issues in the MM05 analysis.
Allow me to pause here for some cultural background for lurkers. As you have seen, the scientific literature is built to publish reanalyses and criticisms of published results. Typically this is done in further publications which start: We, X, have looked into Y, on which Z first published, and found results which show exactly the opposite. Z typically does not get to reply to that except in a few journals, such as Science/Nature, which have short letter columns. What Z does is further research, published in the next article, which goes: We did Y again and all criticisms made in this other strange paper are wrong because …..
After a while, a consensus forms on what the correct answer is, and the field moves on, although you still see X and Z going at it, but the consensus is accepted. Once this forms, X or Z will start getting referees reports saying hey, you got it wrong, move on, although they may find a friendly editor somewhere.
It is more serious when a journal publishes a comment on a particular paper. In that case Z gets to respond. I have seen very few cases where Z says X was right we were wrong. I have seen all sorts of strange justifications, and generally speaking everyone goes, yeah, yeah and figures that Z got it wrong. (We scientists are very polite). So the moral of the story is, when you read a criticism that is clear and a reply to that which is murky, go yeah, yeah and move on.
However, H did provide a useful parsing of the MM05 and MBH98 results with respect to the North American tree ring series. He explicitly stayed away from the question of whether the series can be used at all because of anomalous growth in the 20th century. This is really one for the dendrochronology community, which has been curiously quiet on the issue, and I would greatly appreciate some postings on this. My suspicion is that this is not the issue that MM make of it, for various reasons, starting with the fact that Malcolm Hughes is an expert in the area, and the analysis in MBH99. To summarize H’s result: his implementation is superior on the NoAm series; MM05 is next best, but not so different from MBH98 that it makes a difference.
ZvS generate a data set and use it to come to about the same conclusion wrt the various implementations of PCA.
I think that Gavin has it about right when he talks about this degenerating into a debate about very, very useful, vs. very useful. I take his point about math, but believe me there is similar noise in discussions about quantum mechanics. Then, of course there are the folks who simply don’t believe in quantum mechanics. http://www.crank.net/physics.html
It has been shown that the MBH implementation of PCA works, although it is not optimal….
Of course it works if your goal is to individually weight a specific PC (or PCs) so that one or several PCs dominate. The reason one would use PCA is to give more weight to certain proxies which are better than the rest. The result of such weighting is inconsequential to the overall reconstruction if the behavior of the highly weighted PCs is basically similar to that of most of the other PCs. However, if one or several highly weighted PCs are distinctly different from most of the others, then the resulting reconstruction will be most similar to the highly weighted PCs. The impact on the reconstruction under these conditions is certainly not inconsequential.
[Response: Ummm... this is getting just a bit boring. Fairly soon now we'll decide that going round the same old circles again and again is too dull. Anyway: as the von S stuff shows, the actual answer isn't terribly sensitive to how you do the PCA - William]
Re: #60, “What el nino will do
What volcanoes will do
What the sun will do
Which emission scenario will materialise
How the sinks respond to the emissions
How the temperature responds to CO2
How the temperature responds to aerosols”
These are all taken into account in the models! The results are given in the IPCC reports!
Re: #63 Yes, but not perfectly, and they contribute to the noise, which is why a formal detection and attribution analysis is a complicated thing to do. The detection and attribution ‘community’ appear to have concluded that the signal has now emerged from this noise, however.
I’d be interested to hear on what basis people felt there was too much noise and how much more warming would have to be observed for them to conclude that the signal was greater than the noise.
In terms of forcing uncertainties I understand that uncertainties in aerosol forcing are the largest factor when analysing the climate of the 20th century [and trying to use observations of changes in global mean temperature up to now to constrain what the future response to future increases in CO2 would be - over longer timescales the effect of aerosols gets much smaller].
However, I don’t see how this uncertainty can lead to uncertainty over the effect of GHGs.
Re #59: Lynn, I have not expressed any views about whether or not we have broken through a “noise barrier”. In the papers in controversy, I argued that no conclusions can be drawn as to 20th century uniqueness based on MBH98 for a variety of reasons, including their flawed PC method, the interaction between their PC method and flawed proxies (bristlecones) and the failure of the reconstruction to pass statistical cross-validation tests. I am dubious about other multiproxy papers as well, but have not published on them to date.
The answer about the uncertainty of aerosols is contained in the first graph of the RealClimate discussion about their influence.
GHG influence and aerosol sensitivity are in lockstep. If the current influence of aerosols is zero, then the sensitivity of GHGs is ~1 C to match the past temperature trend (and solar needs to be increased too). If the current influence is -1.5 W/m2, then the sensitivity for GHGs increases to 6 C…
Further, have a look at comment #14 at the same page.
Btw, the largest uncertainty of current models is in the cloud feedback.
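The lockstep between assumed aerosol forcing and implied GHG sensitivity described above can be illustrated with a toy zero-dimensional calculation. All the numbers below are round illustrative values, and ocean heat uptake is ignored, which is one reason the spread here is smaller than the ~1 C to ~6 C range quoted in the comment:

```python
# Toy energy-balance sketch of the aerosol/sensitivity trade-off.
# Illustrative round numbers only: observed 20th-century warming ~0.7 C,
# greenhouse-gas forcing ~2.6 W/m2, CO2-doubling forcing ~3.7 W/m2.
dT_obs = 0.7   # degC, observed warming (illustrative)
F_ghg = 2.6    # W/m2, GHG forcing (illustrative)
F_2x = 3.7     # W/m2, forcing from doubled CO2

for F_aer in (0.0, -0.5, -1.0, -1.5):
    net = F_ghg + F_aer
    # Equilibrium sensitivity implied by matching the observed warming
    sensitivity = dT_obs * F_2x / net
    print(f"aerosol forcing {F_aer:+.1f} W/m2 -> implied sensitivity {sensitivity:.1f} C")
```

The more cooling you attribute to aerosols, the smaller the net forcing that must explain the observed warming, and hence the larger the implied GHG sensitivity: the two assumptions cannot be varied independently.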
re #67. Thanks for your reply, Steve. I guess I’m not so interested in the past as in the future, even though we need to understand the past to help us understand the future.
What do you think about the future? If the 20th c. did not show any special signal of AGW vis-a-vis the more distant past (at least in regard the papers, data, and methods you critique), do you think in the next 100 years there will more likely than not be an AGW signal (i.e., will global warming become evident, at least to some extent)? What’s your best guess?
Comment by Lynn Vincentnathan — 31 Oct 2005 @ 4:54 PM
IMHO the strong alleged cooling influence of aerosols is a haunting legacy of Schneider…
Gavin, please unedit #56, you did have your apology, and it helps the discussion of the b-word.
Lynn, there is an essential point in looking at the past. Climate models need to correctly “predict” the past to have any projective power for what may happen in the future. Therefore, it is crucial to know past climate as accurately as possible. If the past was as found by MBH98/99, then there was little natural variation, models give GHGs/aerosols a high sensitivity, and they predict a warm future. If the past was more Moberg-like (with more natural variation, thus higher solar/volcanic sensitivity), then models will need to adjust to a lower GHG/aerosol sensitivity and the future will be at the lower side of the projections…
Anyway, within some 10 years from now, the real trend in climate will make it clear if the current warm period is mainly part of a natural variation or mainly GHG based, or a mix of both…
[Response:This is an incorrect premise. Earlier periods were not much affected by aerosols, but solar forcing, which is similarly uncertain. Any reasonable temperature variability falls within the results based on the current sensitivity combined with different values of solar forcing - thus these periods are not terribly useful in constraining climate sensitivity. Your second point was made (with slightly more validity) 10 years ago. The subsequent data has all fallen on the side of dominant GHGs... -gavin]
High climate sensitivity is always linked with speculative anthropogenic aerosol cooling.
Objective Estimation of the Probability Distribution for Climate Sensitivity; Andronova, N., and M. E. Schlesinger. 2001., J. Geophys. Res., 106, D19, 22,605-22,612
[Response: Only if you think that climate sensitivity is only determinable from the 20th century changes. It turns out that this does not provide much of a constraint because of the uncertainties in aerosols, solar, ozone etc. Better constraints come from the paleo-data. See the previous post on climate sensitivity and aerosols to see why the association you are promoting doesn't work. And by the way, the number you quote for the Dutch model is the transient climate response, not the equilibrium sensitivity as you well know (since I pointed it out before). -gavin]
RE # 73, Gavin’s response: What is the estimated hottest it got during the 5 runaway GW extinction events in the past (e.g., end-Permian), and what level of super-anoxia was there – oxygen depletion due to O2 reacting with that massive amount of CH4 from melting clathrates, producing CO2?
[Response: This is a new one on me... were there really 5 runaway GW extinction events? It sounds rather doubtful - William]
I’d like to know what could happen (highest warming, anoxia) within the range of upper-end possibility in what might turn out to be the 6th mass extinction event (if we don’t start greatly reducing GHGs). I understand Earth can’t become like Venus, and that GW would eventually plateau & the climate come back to viability, but what would the upper limit (constraint) be if everything that could go wrong went wrong (re all the warming forces)? I understand it’s pretty speculative, a matter of guesstimation.
I’ve been talking about this to my students, but (another question) if the worst were to happen, when could the anoxia start kicking in (I’ve been telling them hundreds of years from now, but I haven’t the faintest).
[Response: Ah... looking at that, I say that (1) it doesn't mean runaway (in the Venus sense) and (2) it looks like very early work - not something to rely on yet - William]
Comment by Lynn Vincentnathan — 1 Nov 2005 @ 11:15 AM
Re the response in #74, by “runaway” I’m referring to positive feedbacks outweighing negative feedbacks, at least until some limit is reached. Initial warming (from whatever cause) triggers increased GHG emissions (or additional warming from other feedbacks, e.g. lowered albedo), which leads to further warming, which leads to further GHG emissions, and so on, until it gets a lot hotter than the initial warming from the initial forcing. “Runaway,” I suppose, is an anthropocentric term. I think of human GHG emissions (which are mostly under our control) causing initial increased warming; this warming then causes further GHG emissions from nature (which we people have no control over, hence “runaway”), which causes further warming, and so on, until it gets to some constraint or limit, beyond which it would be impossible for Earth’s climate to go, given the various variables involved (which are unlike the variables for Venus).
It was that limit I was asking about. What would that limit of warming be? How hot could earth’s climate get if all possible factors conspired together to cause warming….
Comment by Lynn Vincentnathan — 1 Nov 2005 @ 1:15 PM
Re: comment to 73
Only if you think that climate sensitivity is only determinable from the 20th century changes. It turns out that this does not provide much of a constraint because of the uncertainties in aerosols, solar, ozone etc. Better constraints come from the paleo-data. See the previous post on climate sensitivity and aerosols to see why the association you are promoting doesn’t work.
This is quite astounding. If I weren’t sure of the opposite, I’d suspect that you’re a GW skeptic. So, the well-documented evidence of the fastest-ever GW, coincident with CO2 concentration growth, is not much of a constraint because of other uncertainties. OK, suppose so. But how could paleo-data provide a better constraint? How about uncertainties in paleo-data? How about sparsity of paleo-data? What do we really know about insolation in the distant past? My guess would be that, given the error bars, paleo-data shouldn’t provide better constraints.
[Response:Read the previous posts (here, here, and here) on this subject. The paleo-data I refer to is for the last glacial period. -gavin]
“von Storch and Zorita did not replicate the MBH98 methodology in key respects. In particular, their paper indicates that they did principal components on the correlation matrix of short-centered data, whereas MBH98 did singular value decomposition (SVD) on the short-centered data matrix itself. There’s a big difference. The VZ procedure simply does not generate hockey stick shaped PC1s in the way that the MBH98 procedure does and cannot be used to test the impact of the MBH98 methodology.”
Does your silence imply that you accept that VZ cannot be used to test the impact of the MBH98 methodology?
[Response: Since the VZ procedure manifestly *does* produce a hockey stick shaped result, the comment is incomprehensible - William]
Comment by nanny_govt_sucks — 2 Nov 2005 @ 12:21 AM
Re: 74, 75
What you are effectively looking at here is something along the lines of the mid-late Cretaceous period, with thermohaline driven ocean circulation at a much higher temperature (and more sluggish). The evidence suggests that an equatorial band became effectively uninhabitable by anything bigger than bacteria. Temperatures were around 6-7K hotter than today, on average, with a much flatter longitudinal temperature gradient.
A change to such conditions is, needless to say, extremely unlikely. Today’s ocean circulation patterns – in particular the isolation of Antarctica – effectively refrigerate the ocean, and although this process could conceivably weaken with large-scale, long-term AGW, it cannot completely fail; it’s driven by geography. Now, if someone decides to build a dam from the tip of S America to the Antarctic peninsula, and from Scotland to Iceland at the same time, then things would change. As for the idea of methane releases changing atmospheric O2 concentrations, the amounts required simply don’t exist; methane amounting to several percent of the volume of the atmosphere would have to be released. The idea of anoxic events causing kills is more to do with local breakdowns in shallow-water circulation, possibly due to higher temperatures. The idea is quite speculative; and always bear in mind that shallow-water life is massively overrepresented in the fossil record – a single, local fish-kill event can easily generate more fossils than a million years of life on land.
The Persian Gulf, eastern Med, and Red Sea (to name a few) are all ‘anoxic event’ candidates; a freakishly hot year could make them all do this at once, which would look like a big catastrophe in the fossil record. The Black Sea, of course, is anoxic already.
So as a summary – a global anoxic ocean will not happen. Local anoxic events may well happen. A 6-7K rise is probably the limit even if we burn the planet, although polar rises would be double this. Positive feedbacks are ultimately limited – once all the Arctic ice melts, for instance, the feedback stops, and ditto for Antarctica in extremis. Methane has a short residence time, meaning your emissions have to increase with temperature to give a positive feedback; hence a one-off event will not lead to a runaway.
RE #78, what you wrote makes sense. There are at least 3 factors that would make this era different from the others: 1. the continents have drifted into a configuration that makes such extreme GW & anoxia, and hence such great extinction, much less likely (I think that’s what you’re saying??); 2. I read that life is more resilient now to survive climate change than it was eons ago; and 3. humans, who have invented the things that are contributing to AGW, also have the smarts to reduce the negative side effects of their projects. Unlike having to adapt biologically through natural selection over many many generations, we people can change our culture within a lifetime, even within a short time, if we wish. Social (revitalization) movements can happen very quickly to change culture. What is needed for that to happen is the (correct) perception of serious problems that affect or will affect many people, a new vision for a better society (more humanly helpful, less environmentally harmful), leadership, and resource mobilization (funds, communications, etc). I think the internet is a great tool for spreading information.
And, even though you have alleviated my concern re an extreme extinction event, we are still facing plenty of problems & human deaths/harm with regular AGW. So whether it’s billions or millions, or even hundreds of thousands, who will eventually die from AGW, it behooves us to do the best we can to reverse course on this, without (I bow to contrarians’ concerns) shooting ourselves in the foot (losing our political freedoms & economic good things) in doing so.
Comment by Lynn Vincentnathan — 2 Nov 2005 @ 1:46 PM
Re #74 and 75: Lynn, you seem to be implying that unless negative feedbacks counter the positive feedbacks, we will necessarily get a runaway effect (until such time as the negative feedbacks overpower the positive ones). This is not correct. It is possible to have positive feedbacks that are sufficiently strong to amplify warming but not in a runaway manner.
As a simple example, suppose that each 1 deg increase in temperature causes water vapor to increase by an amount that would then be expected to produce an additional 0.5 deg of temperature increase. One might naively expect that this could lead to a runaway effect, since this 0.5 deg temperature increase will then lead to a further increase in water vapor and further temperature increase, and so on. However, a simple thought experiment shows that this is not so: if GHGs raise the temperature by 1 deg alone, the water vapor feedback will then kick it up another half a degree, then the feedback of the water vapor on this additional half degree of warming will kick it up another quarter of a degree, and so on. In this example, we have a convergent geometric series, so the water vapor effect amplifies the warming but the temperature does not grow without bound; in fact, in this simple example, the water vapor feedback ends up amplifying the warming due to GHGs alone by a factor of 2.
The above example may seem very contrived, but in fact I was able to show that such a geometric series giving amplification is exactly the sort of mathematical result one gets from a very simplistic, back-of-the-envelope model of the CO2 greenhouse effect with H2O feedback. In that model, I assumed only that the change in temperature depends logarithmically on concentration for both CO2 and H2O and that the water vapor responds to increases in temperature in such a way that the relative humidity remains constant.
In that model, you indeed get a geometric series. Whether that series is convergent (leading to amplification) or divergent (leading to a runaway effect) depends on the strength of the response of temperature to changes in H2O concentration. And, of course, if it is convergent, the amount of amplification you get depends again on this strength.
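The convergent-series argument above is easy to check numerically. Here is a minimal sketch; the feedback fraction f = 0.5 and the 1 deg C of direct warming are the illustrative values from the comment, not measured quantities:

```python
# Sum the feedback series: direct warming, plus f times each previous
# increment, as in the comment's 1 + 0.5 + 0.25 + ... example.
def amplified_warming(direct, f, rounds=60):
    """Total warming after `rounds` passes of the feedback loop."""
    total, increment = 0.0, direct
    for _ in range(rounds):
        total += increment
        increment *= f  # each feedback round is f times the last
    return total

# With f = 0.5 the series converges to direct / (1 - f) = 2.0 deg C:
# amplification by a factor of 2, with no runaway.
print(amplified_warming(1.0, 0.5))
```

For any f < 1 the total converges to direct/(1 − f); only f ≥ 1 gives the unbounded, truly “runaway” response.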
William, perhaps I can help clarify. The key words in nanny’s quote are “…in the way that MBH98 procedure does…”.
The question is NOT “does VZ produce hockey sticks?” (which your answer addresses), but rather “does VZ produce hockey sticks *in the same way that MBH98 does*?” A deeper analysis of VZ’s work is required to answer this question; I would look forward to reading any such analysis done by the contributors here.
Comment by Armand MacMurray — 2 Nov 2005 @ 4:06 PM
Andrew, I’m curious that you see a limit around 6-7 K. Even given that positive feedbacks are ultimately self-limiting, couldn’t continued doubling of CO2 in the atmosphere (assuming, for example, that we switch to a coal-based economy) lead, in the long run, to increases that are greater than that?
Gavin, indeed solar is/was the main driver of climate in the past millennium, as GHGs and aerosols probably didn’t vary much. Thus the decrease in temperature during the LIA is mainly solar driven (assuming that volcanoes were not far more active in the LIA as a whole).
[Response:Bad assumption. If you've read any of the studies that have actually been done in modeling the climate of the past 1000 years (e.g. Crowley, 2000, but many others since), you should be aware that volcanic forcing is the primary radiative forcing responsible for "LIA" cooling at the hemispheric or global-mean scale. You can find references to these studies here and in this review paper on "Climate Over Past Millennia" by Jones and Mann (2004). Solar forcing appears to play a more significant role in explaining the spatial patterns of temperature change. See e.g. this review paper (Schmidt et al, 2004), where the response of a climate model to estimated past changes in natural forcing due to solar irradiance variations and explosive volcanic eruptions, is shown to match the spatial pattern of reconstructed temperature changes during the "Little Ice Age" (which includes enhanced cooling in certain regions such as Europe) as well as the smaller hemispheric-mean changes. - Mike]
Looking at the difference in reconstructed temperature between the MBH98 (-0.3 C) LIA and the Moberg (-0.8 C) LIA, or the borehole (-1.0 C) LIA, the influence of solar is two to three times higher, depending on the chosen reconstruction.
A higher solar influence means a lower influence of CO2/aerosols to obtain a good fit to the surface data of the past 1.5 centuries. In that case, the projection for a CO2 doubling ends at the low(er) side of the IPCC range…
[Response: Again, your reasoning is incorrect because your assumption that solar forcing is the dominant forcing associated with LIA hemispheric-mean cooling is false. Careful, quantitative studies by Hegerl and others using paleoreconstructions and model simulations of the past millennium estimate sensitivites in the mid-range of the IPCC TAR. You can find references to these studies in the review paper by Jones and Mann (2004) linked to above. - Mike]
That aerosols have less influence than calculated can be seen in the temperature trends in Europe: there is little to no measurable difference in trend between non/less-polluted areas and the places where the models predict the largest change, due to the 56% reduction of SO2 emissions since 1975.
The claim that the GHG trend is now emerging from the background noise seems a little premature. Especially as current models don’t reflect the natural variations in heat content of the oceans: compare the curve of Fig. 1 in Levitus with the CO2 and aerosol (SO2 emissions are globally steady since 1975) trends for the same period. Also have a look at the frequency analyses by Barnett: Fig. S1 shows that the models don’t capture any periodic event between 10 and over 50 years. That includes the 11/22-year and longer (60+ year) solar and other natural cycles visible in many climate events. As a consequence, any secular trend in e.g. solar (constantly higher now than in the warm 1930-1940 period) may be underestimated too…
The University of Florida has made a summary of the Cretaceous period, in which they estimated a 4-10 times higher CO2 level than today and a global average temperature some 6(-12) C warmer than today, with most of the warming toward the poles.
Even with CO2 levels some 25 times higher than today, the global average temperature seems to have been limited to around 22 C, vs. 15 C today. Thus while there were times with far more CO2, it seems that there are negative feedback mechanisms at work which limit the upper temperature of the earth.
Gavin, I’ve read previous posts and I’m still confused.
Absent paleo data, the uncertainty in aerosol forcing must have increased the uncertainty in the climate sensitivity to CO2. I believe you acknowledged this in the “dimming” thread:
While it is true that, holding everything else equal, an increase in how much cooling was associated with aerosols would lead to an increase in the estimate of climate sensitivity, the error bars are too large for this to be much of a constraint.
I interpret the underlined part as saying that you cannot say how much worse the sensitivity estimate becomes. Right?
Then you say that using paleo data you can constrain it back to the same 1.5 to 4.5 deg C range. Again, I don’t understand how paleo data can help if you don’t even have a good grip on everything that I mentioned in the previous post. In addition, how do we know that climate sensitivity now and then is the same? Isn’t that a somewhat shaky assumption?
RE #80, I was even thinking of mentioning water vapor positive feedback as part of the “regular” climate package, and not the type of positive feedback I had in mind. I was thinking more of warming causing the release of further CO2 & CH4 from nature, which are themselves “forcings,” unlike water vapor, which is only a “feedback” (see http://www.realclimate.org/index.php?p=142) and not also a forcing.
Water vapor feedback doesn’t concern me as much as warming causing the release of further CO2 & CH4 from nature.
Question: Have these “positive feedbacks” which are also “forcings” (CO2 & CH4 emitted from nature due to the warming) been included in the models, or is there not enough accurate info on them to include them at this point?
Comment by Lynn Vincentnathan — 2 Nov 2005 @ 9:26 PM
Re 81 by Armand:
Ah, now I think this is getting somewhere! For McIntyre fans it is important to determine how the hockey stick is generated in the reconstructions. I suspect that for most, including RC, the fact that various methods produce the hockey stick is a validation of the original work (especially since no demonstrably better method contradicts the hockey stick, yet).
Gregor Mendel appears to have “cheated” (knowingly or unknowingly) in his work on pea genetics; yet his conclusions hold up. What is more important for population geneticists or for people interested in evolution? For people who want to say that evolution is crap, perhaps pointing to this early work (or to Java man or whatever) is the best strategy. For those interested in understanding evolution, however, reanalysing Mendel’s work can only yield so much.
I suggest that the same is true for people interested in the phenomenon of global warming, especially given the quality of the paleo data (bristlecones and such). Whereas for some people, the method used in a study 7 years ago is a potential “soft target” on which to focus, many others would rather focus on the best reconstruction and whether the original conclusion has held up. Thus, for the latter group, the relevant question is not as Armand stated.
I am curious about why only Eli has written regarding reasons the MBH98 method may have been used (why hasn’t Mann justified it?), but I’m more curious about whether or not recent temperatures are anomalous and why. Because the result is robust, it seems understanding subtle nuances of PCA methods is not necessary to satisfy that curiosity.
Re #89: The result is not robust. The various supposedly “independent” reconstructions are not in fact independent either in authorship or proxy selection. There are important defects in each such study individually with proxy quality and robustness with respect to outlier results.
[Response: At the moment, this looks like wild assertion / mud slinging. Given that the various reconstructions are the same on the important points, it seems that the major conclusions are robust. Asserting that everyone else is wrong and only you are right is implausible - William]
Yes, I suggested that one obvious stopper (I hesitate to call it a negative feedback) is that you lose positive feedbacks – for example, once all of the ice has melted, the ice-albedo positive feedback stops. Indeed, this alone is probably largely responsible for the limit on how hot things can get.
Of course, a 6-7K rise on any kind of human timescale would be catastrophic.
You can overmoderate, that’s also why yahoo climatesceptics doesn’t work.
I think this blog is not a proper format for discussions as users cannot spawn threads.
I do consider light moderation essential, that’s why I prefer ukweatherforum over sci.environment.
Re #91, yes, I read that the end-Permian extinction (when up to 95% of life died) happened in a 6 degree C warmer climate than now.
The more limited positive feedbacks, such as ice-albedo, would perhaps help put the climate at a somewhat higher temp, at which, say, ocean clathrates would start melting at a much higher rate & that would perhaps make the warming spiral up until all those were melted. I’m also aware that we have about 200 years’ worth of coal — so if all that were rapidly burnt up by voracious, energy-hungry Homo sapiens (don’t know if “sapiens” would then be an accurate descriptive term), then that would also help push up the average global temp to its maximum.
There are human positive feedbacks. It gets hotter, we buy ACs and run them longer, which is somewhat moderated by reducing our heating in winter. Except that, I think (not sure) the climate is expected to increase in variability. Is that right? The mean fairly steadily increases in a jagged fashion, but the range or variance or standard deviation from cold to hot also increases???
I have to thank you folks for educating me and those out there reading this blog who don’t participate.
Comment by Lynn Vincentnathan — 3 Nov 2005 @ 11:13 AM
Andrew, not disputing your assertion, just still trying to understand the explanation. I see that the change in albedo is finite, and that there is no reason to expect a “runaway” GHG effect, where the positive feedbacks themselves continue to send temperatures above the 6-7K increase.
But, even absent positive feedbacks, wouldn’t a continued redoubling of CO2 in the atmosphere potentially continue to drive the temperature higher? I’m not saying this is likely – that obviously depends on human behavior. I’m just wondering what the outcome would be if CO2 were produced at a much faster rate than currently forecast, and if this faster rate were to continue for a couple of hundred years.
For those upset about RC’s moderation, I think it’s great that real scientists, from among the top climate scientists in the world, are sharing their knowledge and most current understandings with laypersons. I see RC as a climate change course for non-majors. Sometimes we students whisper to each other in class, or bring in some non-physical-science considerations; sometimes guest professors participate & help teach. Sometimes the host-professors bring up important topics off the syllabus (like I bring up GW in my sociology & anthropology classes). Wow, it’s quite like other educational fora.
I want the RC hosts to correct us “students” when we’re wrong or unruly. This is a serious blog about an extremely serious topic. I know it stings a bit when they correct us (I’ve had a few wrist slaps myself), just as when teachers correct students in class or mark them off on tests. This blog isn’t so much for argumentation and debate, as it is for presenting scientific truth (scientifically accepted range of truths) as it currently stands, stochastic and subject to change as it may be. And this is not a place for scientists to meet & share knowledge so as to produce and advance the field. I think they do that by reading each others’ articles, attending conferences, and personal communications. This blog was created for us students.
It’s better to go to other blogs if what you want is all voices to be heard equally or explore the Zen of GW abatement — marklynas.org is a good one. If you want to learn about climate change, one of the most pressing issues of our time (perhaps of all time), then stay tuned here.
Comment by Lynn Vincentnathan — 3 Nov 2005 @ 12:55 PM
#94 – “But, even absent positive feedbacks, wouldn’t a continued redoubling of CO2 in the atmosphere potentially continue to drive the temperature higher? ”
Another aspect of “greenhouse” warming that no one wants to talk about is the saturation of the IR absorption effect of CO2. The relationship between CO2 concentration in the atmosphere and IR absorption is not linear. Actually it is logarithmic and we’re already on the flat part of the log curve, so additional CO2 concentrations will add little to the absorption of longwave IR. That means a smaller and smaller contribution to any warming from CO2, but don’t let the public hear this – they might not want to ban SUVs.
[Response: Stating something very clearly in the IPCC report would seem to be a funny way of keeping this 'secret'. By the way, logarithmic is never flat. -gavin]
Comment by nanny_govt_sucks — 3 Nov 2005 @ 1:34 PM
I know that the temperature response to CO2 is logarithmic, not linear. This is why I used the term “redoubling”. So, again, even though positive feedbacks will eventually shut off, isn’t there potential for an increase in temp that is greater than 6-7K, if fossil-fuel burning were to significantly exceed what is currently predicted for a protracted period? (Gavin, please feel free to field this one. I’ve been curious about what happens, theoretically, with the 2nd doubling of CO2, perhaps out somewhere in the next century, and find very little information about it. Thanks.)
Thanks, Thomas (#96), for the PP slides. It answers a lot of my questions, & I shared a few slides with my mythology class, along with a “What Would Noah Do?” email I received yesterday. We just started the flood myths today, and those sort of fit, since we included “big bang” with the creation myths. (Note: myths are considered true to those who hold them, the equivalent of science or history, and they speak and are relevant to the present, though they happened in the past.)
Comment by Lynn Vincentnathan — 3 Nov 2005 @ 5:07 PM
RE Gavin’s response in #98. So where are we on the logarithmic curve re IR absorption? Would the tangent line be more toward the flat side, or steep side? Or rounding the bend with a moderate slope?
Even if more toward the flat side, we’d still want to reduce GHGs, Nanny, because every little bit of harm hurts. But instead of banning SUVs, why not make them electric, or hybrid (I think some are), or hydrogen? Or move closer to work/school. Or run multiple errands. Or at least keep them tuned and tires properly inflated.
[Response: It's self-similar, there is no flat bit. Instead of the forcing increasing by +Y (W/m2) for every +X in CO2 (ppmv), it increases by +Z for every *(1+X) in CO2. I.e., the forcing increase from 1*CO2 to 2*CO2 is the same as from 2*CO2 to 4*CO2. Etc - William]
Comment by Lynn Vincentnathan — 3 Nov 2005 @ 5:50 PM
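William’s “self-similar” point can be illustrated with the widely used simplified fit for CO2 forcing, ΔF ≈ 5.35 ln(C/C0) W/m² (the Myhre et al. 1998 expression quoted in the IPCC reports). Treat this as an illustrative sketch, not the full radiative transfer calculation the models actually do:

```python
import math

def co2_forcing(c_ppmv, c0_ppmv=280.0):
    """Simplified CO2 radiative forcing in W/m^2 (Myhre et al. 1998 fit)."""
    return 5.35 * math.log(c_ppmv / c0_ppmv)

# Per added ppmv the curve flattens, but per *doubling* the increment is
# constant: 280 -> 560 ppmv adds the same forcing as 560 -> 1120 ppmv.
first = co2_forcing(560.0)                          # ~3.7 W/m^2
second = co2_forcing(1120.0) - co2_forcing(560.0)   # also ~3.7 W/m^2
print(round(first, 2), round(second, 2))
```

So each successive doubling contributes the same ~3.7 W/m² of forcing; the curve flattens per ppmv but never goes flat, exactly as the response says.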
#101 – “Even if more toward the flat side, we’d still want to reduce GHGs, Nanny, because every little bit of harm hurts. ”
Harm? What harm? The connection between GHGs and extreme weather has not been shown. Droughts are likely cyclical. Other than a slight rise in sea levels over the course of 100-200 years, I see virtually all positive aspects to warming. A warmer, wetter CO2-fertilized planet will be and has been a boon to plants and the animals (including humans) that feed on them. Longer growing seasons mean more farm production. Expanded fauna habitats mean enhanced biodiversity. Green is good. Cold, freezy planets are bad. More info:
Earth is becoming a greener greenhouse http://cliveg.bu.edu/greenergh/nontechsum.html
“Our results … indicate that the April to October average greenness level increased by about 8% in North America and 12% in Eurasia during the period 1981 to 1999.”
“the growing season is now about 12 ± 5 days longer in North America and 18 ± 4 days in Eurasia”
Greening of arctic Alaska, 1981-2001 http://www.agu.org/pubs/crossref/2003/2003GL018268.shtml
“Here we analyzed a time series of 21-yr satellite data for three bioclimate subzones in northern Alaska and confirmed a long-term trend of increase in vegetation greenness for the Alaskan tundra that has been detected globally for the northern latitudes.”
Comment by nanny_govt_sucks — 3 Nov 2005 @ 6:47 PM
Sorry, there is nothing mythical about the Paleocene – Eocene Thermal Maximum.
Sorry you misunderstood the anthropological concept “myth,” which refers to truth, under which natural history, history, and science would fall – that’s our truth, thus our myth. Unfortunately in common parlance “myth” means falsehood or fiction. Just like some people might use “theory” to mean something unproven (as in “it’s just a theory, it’s a myth”). Falsehood or fiction is the stuff of folktales, novels, science fiction….
I’ve been alerting my students to AGW and more recently to past extinctions from GW as reality, not fiction. For the first time in the 15 years I’ve been talking about AGW, they are beginning to take it seriously, even though I’m the only professor I know of on campus talking about it. So, if I can squeeze it in somewhere in my courses, I do. Too bad I’m not teaching science — that would be a good place to teach it….and I heard 3 years ago sci profs here were not teaching it; maybe that’s changed since.
Comment by Lynn Vincentnathan — 3 Nov 2005 @ 10:38 PM
“Overall, the future trends are likely to be further increases in stem mortality and recruitment rates, with mortality overtaking recruitment, leading to declines in stem density at an increasing number of sites. Simultaneously stand-level growth is expected to asymptote or decrease and biomass losses from mortality continue to rise. As a result stand biomass may well decrease in surviving old-growth forests well within the current century. While there is considerable uncertainty on the future trajectory of the drivers and the responses of tropical forests, the expected changes in the drivers plausibly predict that the current C sink contribution of mature tropical forests to buffering the rate of climate change is very likely to diminish and quite possibly reverse in the future (e.g. Cox et al. 2000).
“The drivers of change we document can be grouped into four categories: those caused by
“(i) decadal-scale natural climatic oscillations;
“(ii) fossil fuel emissions and resulting climate change;
“(iii) increasing industrialization in the tropics; and
“(iv) the further integration of forest products and land into expanding national and international market economies.
“To slow or halt widespread changes in remaining tropical forests will require large reductions in fossil fuel emissions, a different form of development in tropical countries than has occurred in temperate nations, and a minimum of carefully regulated markets. Under ‘business as usual’ conditions, rapid global changes will continue to alter the world’s remaining tropical forests with global consequences for biodiversity, climate and human welfare.”
Remember also that studies of this sort may not consider large-scale climatic shifts such as the Amazon basin switching from rain forest to savannah, which would involve a very large scale loss of carbon to the atmosphere (and would take place by means of severe droughts just like the one happening now).
If you sincerely want to learn more about this aspect of the science, use Google Scholar and note in particular the “cited by” link that allows you find the most recent work. Survey articles in the major pubs within the last two years or so are probably the best bet for getting the state of the science, although unfortunately the article titles don’t always make it obvious which articles are the surveys.
Abstract: “Drier summers cancel out the CO2 uptake enhancement induced by warmer springs
“An increase in photosynthetic activity of the northern hemisphere terrestrial vegetation, as derived from satellite observations, has been reported in previous studies. The amplitude of the seasonal cycle of the annually detrended atmospheric CO2 in the northern hemisphere (an indicator of biospheric activity) also increased during that period. We found, by analyzing the annually detrended CO2 record by season, that early summer (June) CO2 concentrations indeed decreased from 1985 to 1991, and they have continued to decrease from 1994 up to 2002. This decrease indicates accelerating springtime net CO2 uptake. However, the CO2 minimum concentration in late summer (an indicator of net growing-season uptake) showed no positive trend since 1994, indicating that lower net CO2 uptake during summer cancelled out the enhanced uptake during spring. Using a recent satellite normalized difference vegetation index data set and climate data, we show that this lower summer uptake is probably the result of hotter and drier summers in both mid and high latitudes, demonstrating that a warming climate does not necessarily lead to higher CO2 growing-season uptake, even in high-latitude ecosystems that are considered to be temperature limited.”
So, perhaps not quite an improved planet.
(RC team: I don’t recall and couldn’t locate an RC post on this subject, but it seems like a good candidate for one.)
William (response to my comment, which was posted, then commented on, then deleted…btw, huh?):
I’m not asking for an unmoderated board. I know that ad hominems and trolling are forbidden…here. I am asking to be allowed to make points and arguments on the hockeystick issue. To be allowed to debate and make converse arguments in keeping with the blog policy.
[Response: If your comment was deleted, then you stepped over the invisible line. Have another go, but more carefully... - William]
Warmer doesn’t mean everything will grow more. All ecosystems are adapted to specific environmental conditions. Changes in climate force ecosystems to adapt, mostly by shifting geographically, and if they are not able to shift, individual species and ecosystems become locally or globally extinct.
A good summary of the scientific literature on climate change and ecosystems is here http://www.pewclimate.org/docUploads/final%5FObsImpact%2Epdf
Climate change in Alaska is creating problems for the communities there. Melting permafrost is causing erosion and major structural damage to buildings, roads and oil drilling facilities. Warmer temperatures have caused trees to die and forest fires have become serious problems as a result.
Comment by Joseph O'Sullivan — 4 Nov 2005 @ 10:15 AM
Re #102 (on greening the planet by CO2)
The “greening” effect of GW depends on what time scale one is looking at.
If you look at the effect of raising temperature at the time scale of millions of years, then warming might indeed lead to increased biomass, extended tropical forests and higher biodiversity.
GW as it happens now, however, is an abrupt shock to which ecosystems and species are unlikely to be able to respond by adaptation. Moderate warming is likely to lead to moderate biodiversity loss, and excessive warming is likely to cause mass extinction. See for example a recent Nature article by Thomas et al., according to which “[...] 15–37% of a sample of 1,103 land plants and animals would eventually become extinct as a result of climate changes expected by 2050.”
In the absence of truly catastrophic warming, biodiversity is likely to return, but that will take 1-10 million years. So if you care about species richness, not about “greenness” in the sense of palm trees in your backyard, this is serious stuff.
The Beeb today points out that the rate of ecological change is faster than the rate of many species’ adaptation. Also, NPP as measured has a limited benefit to humans, as there is evidence that graminaceous crops are less nutritious in an increased-CO2 environment, and it is well known that rising temps decrease yields, as most row crops are at the upper limit of their heat tolerance [as the recent cool season/hi yields in the US and heat wave/lo yields in Europe attest].
We don’t know when or where ecosystems will flip. This, in itself, is enough reason for serious discussion of policy action – this requires energy, energy that is wasted on picking nits off of an old paper.
Mike, thanks for the comment. I have reread several of the studies mentioned, but it still seems to me that the effect of solar is underestimated in the models. As can be seen in Fig. 6 of the poster by Briffa et al., the impact of large volcanic eruptions is on average some 0.1 C over the whole period 1400-2000, with a few quieter periods (1490-1570 and 1750-1790). This is based on tree rings (but I wonder if the fact that there is less incoming sunlight after an eruption also has an impact on tree rings, besides temperature).
That means that the rest of the temperature decrease during the LIA was caused by less solar activity (and some from a CO2 feedback on the resulting ocean temperature decrease). For the MBH98 reconstruction, the residual LIA low is approx. -0.2 C; for Moberg, it is -0.7 C, a factor of 3.5…
As can be seen in Fig. 6 of what Gavin and you wrote, current models seem to overestimate the influence of volcanoes, and thus probably underestimate solar influences. In any case, the past fit of the models points to a higher impact of solar, depending on the chosen reconstruction, and consequently to a projection on the lower side of the IPCC range…
CO2 concentrations in the earth’s atmosphere hover around 300 ppm (parts per million), i.e. 0.0003 or 0.03%.
The human contribution to this is around 2%, or 2 parts in a hundred, or 0.02.
Can anyone explain to me how 0.000006 (0.0003*0.02) or 0.0006% is in any way a material number in the context of the earth’s atmosphere?
And how a reduction of the human contribution by, say, 10% of the total annual contribution, or 0.000006 (0.0006%), would make any meaningful difference to anything?
[Response: CO2 levels have been well above 300 ppmv for quite some time now. http://en.wikipedia.org/wiki/Image:Carbon_Dioxide_400kyr.png will provide you a nice graph and a link to the data sources. As you can pretty clearly see from that graph, and other measurements also show, humans have increased CO2 levels from pre-industrial (280 ish) to now (380 ish). Also covered by RC. So all your numbers are wrong. As to the basic science, try this - William]
This is caused by a decrease of cloud cover in winter (as opposed to an increase in summer) over the Arctic. Cloud cover seems to have the largest influence on climate, though it is itself a feedback on drivers like solar and GHGs and related/unrelated oscillations. In this case it looks like a negative feedback on Arctic (or global?) warming.
The full article by Wang and Key in Science mentions the cloud radiation changes as follows:
“The wintertime net cloud forcing, which is primarily longwave forcing, has decreased at a decadal rate of -4.50 W/m2 (decreasing warming effect) in response to the decreasing cloud amount and surface temperature” and
“During summer the large solar flux, increasing cloud amount, and decreasing albedo result in a trend toward increasing cloud cooling by -5.17 W/m2 per decade. There is also an increasing cloud cooling trend in the fall, largely a result of decreasing albedo, because there is only a weak trend in cloud amount. The annual trend in net cloud forcing is -3.17 W/m2, indicating an increasing cooling effect.”
Compare that to the change in radiation balance by GHGs, which is less than 1 W/m2 in the past decade(s)…
[Response: If your figures made sense, the decrease in Arctic sea ice wouldn't be happening - W]
Where have you been the last 50 years? :) The CO2 concentration is not hovering around 300 ppm but is around 380 ppm now, and rising 1.5-2 ppm per year. Accordingly, the human contribution to the current atmospheric CO2 concentration is about 36%, not 2%. To reassure yourself of this, use the Google search engine for ‘co2 atmospheric concentration’ and read the short RealClimate discussion on the topic.
The bottom line on the greenhouse effect is that without it the earth would be about 33 Celsius degrees cooler than it is now, and that CO2 plays a significant part in this difference. Small concentrations of CO2 are significant for the energy balance just like small concentrations of CFCs are significant for the ozone layer, or, to use an even more extreme example, small concentrations of crack in the bloodstream are significant for cognition. Large effects do not always require high concentrations!
In the absence of a better model you could assume the effect of CO2 concentration on temperature is linear. Then, as a first approximation, doubling the CO2 would double its current warming effect. If the current contribution of CO2 to the 33 K warming is 20%, doubling the original CO2 concentration would raise global temperature by 6.6 K. About the relative importance of CO2 and other gases, google for “climate forcings”.
In reality things are much more complex, of course, but I hope this simple line of thought helps you to understand the significance of CO2.
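As a hedged sketch of this linear line of thought, the arithmetic can be written out explicitly. The 33 K total greenhouse warming and the 20% CO2 share are the comment’s illustrative assumptions, not measured values:

```python
# Back-of-envelope linear model from the comment above.
# Assumed inputs (illustrative figures from the comment, not measurements):
total_greenhouse_warming_k = 33.0  # warming attributed to the greenhouse effect
co2_share = 0.20                   # assumed CO2 fraction of that warming

current_co2_warming_k = total_greenhouse_warming_k * co2_share

# Under a purely linear model, doubling CO2 doubles its contribution,
# i.e. adds the same amount again on top of the current effect.
extra_warming_from_doubling_k = current_co2_warming_k

print(extra_warming_from_doubling_k)  # 6.6
```

A later comment in this thread points out that the real dependence is logarithmic, so this linear figure should be read as an upper-bound illustration rather than a prediction.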
Here’s another fallacy from a comment on that same site.
“However, melting of the arctic ice sheet would have virtually no effect on sea levels since most of this ice essentially is already floating in water (remember Archimedes jumping out of his bath tub and running through the streets shouting Eureka!). The arctic danger to sea levels would be if the Greenland ice sheet started to melt in a major fashion, since that ice (like Antarctica) is over land.”
[Response: This looks like the usual confusion over words. The Arctic *sea ice* is indeed floating, and its melting would only affect sea level a tiny bit. There is no Arctic ice sheet, unless they have renamed the Greenland ice sheet - William]
This same commenter was easily convinced that “realclimate was partisan” mind you. This is the complete up-is-downism being used against sound science today. It’s very discouraging.
I’m not a physical scientist like you guys, just a fish biologist, but this sounds completely off base, like the rest of the propagandists’ arguments. Where would the water go if not adding to the volume of water in the connected oceans?
William, the influence of more CO2 is working everywhere, including (and even more at) the Arctic.
[Response: Repeating the same evidenceless assertion won't make it any more true. I asked you for some sources for it - do you have any? - William]
This should lead to more retention of IR radiation in winter, thus warming, but the trend is more cooling (-0.34 C/decade at >60N, -2.2 C/decade at >80N, see Science), due to less clouds in winter. This is more than compensated by a temperature increase in all other seasons, the yearly trend is +0.57 C/decade in the past two decades (but the trend is tempered by more clouds in summer).
The cooling in winter is large enough to refreeze most of the ice that was lost in summer, but not all: from 5 million km2 in 1940 to 7 million km2 in 2005. As summer ice loss is increasing more rapidly than winter refreezing, the overall trend for all months is negative.
Albedo changes are ambiguous in this case: less albedo means more warming in summer, but also more cooling in winter.
[Response: Why more cooling in winter? - William]
But again the yearly trend is negative in albedo and heat balance.
Thus while clouds have an impressive influence on Arctic ice extent – compared to GHGs – the overall balance remains negative, probably as a result of increased albedo, which in turn may be caused by increased advection of heat from lower latitudes.
All together a challenge for current GCMs, which have problems with cloud feedbacks…
[Response: You are (as so many skeptics seem to be) obsessed by the idea that almost anything other than CO2 must be causing changes... As to the challenge: indeed: GCMs simulate the retreat of Arctic ice - William]
#116 – Ferdinand’s theories don’t match reality. It is very cloudy now in darkness, so there is no cloud albedo to worry about. The clouds now do the opposite of what they do during summer: they keep a lid on heat trying to escape to space, so October was very warm in the Arctic (I won’t be surprised if it is the #1 warmest on record for Arctic temperatures). These early winter clouds are caused mainly by open water, and of course cyclonic activity; it is not uncommon to see a cloudy “High” pressure system from space, mainly because of open water interacting with cold air. It is customary for the clouds to clear between late December and the beginning of February, the time when the atmosphere gets really cold up here; fog and clouds return in full force, usually by mid-April (spring ice break-up). Arctic summers are usually cloudy, and that was correctly stated.
Apparently envirotruth is another industry-funded shill site:
The National Centre for Public Policy Research (NCPPR) is a US conservative think tank which advocates “free market solutions to today’s environmental challenges” on the basis that “private owners are the best stewards of the environment.” The NCPPR has set up a website, EnviroTruth.org, which has the expressed aim of “injecting badly needed truth into the debate about our environment.”… However, the NCPPR, and specifically the EnviroTruth website, have received funding from oil company ExxonMobil, which must appreciate the research denying climate change. ExxonMobil has also given funding to sources used by Envirotruth: the Science and Environmental Policy Project (SEPP), and Dr Fred Singer, as “an active contributor in the battle against the ‘politicization’ of science” through the Hoover Institution and Atlas Economic Research.
In the absence of a better model you could assume the effect of CO2 concentration on temperature is linear. Then, as a first approximation, doubling the CO2 would double its current warming effect. If the current contribution of CO2 to the 33 K warming is 20%, doubling the original CO2 concentration would raise global temperature by 6.6 K. About the relative importance of CO2 and other gases, google for “climate forcings”.
Moderators, where is your comment? Ever since Arrhenius we have known the relationship is logarithmic, as recently confirmed by the simplified expression:
dF = alpha * ln([CO2]/[CO2]orig), where alpha is 5.35 W/m2 (Myhre et al.) http://www.grida.no/climate/ipcc_tar/wg1/222.htm
The same law that tells we have a warming of 33K by greenhouse gases (Stefan-Boltzmann) also tells us that a doubling of CO2 leads to less than 1 K.
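A minimal numerical sketch of the simplified logarithmic expression quoted above. The 280 and 560 ppmv values are just an illustrative pre-industrial doubling, not figures from the comment:

```python
import math

ALPHA = 5.35  # W/m^2, coefficient from the Myhre et al. expression quoted above

def co2_forcing(c_ppmv, c0_ppmv):
    """Radiative forcing change (W/m^2) for a CO2 change from c0 to c."""
    return ALPHA * math.log(c_ppmv / c0_ppmv)

# Doubling CO2, e.g. from an assumed pre-industrial 280 ppmv to 560 ppmv:
f_doubling = co2_forcing(560.0, 280.0)

# The logarithm means each successive doubling adds the same forcing,
# which is why a linear extrapolation overstates the effect of large increases.
print(round(f_doubling, 2))  # 3.71
```

Note that this gives a forcing in W/m2, not a temperature; converting it to a warming requires a climate sensitivity, which is exactly what the rest of the thread is arguing about.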
- From the calculation in Modtran (unfortunately only for the subpolar regions, not the poles): winter difference 330-375 ppmv CO2 (roughly 1970-2005): 0.32 W/m2, summer difference: 0.49 W/m2. A substantial change, but almost an order of magnitude less than what the change in cloud cover introduces.
- Sorry, it was in fall, not in winter. From Science:
“There is also an increasing cloud cooling trend in the fall, largely a result of decreasing albedo, because there is only a weak trend in cloud amount.”
I don’t know how that works, maybe they refer to the fact that clouds have more effect over open sea than over ice for SW reflection?
For winter, there is a decrease in albedo too (for the sun-lit parts 60-76N), but the effect that clouds have may be masked by the fact that there is a decreasing trend in cloud cover…
- I am sceptical about computer models in general (own experience) and about the applied magnitude of the different actors (including feedbacks), not about the basic effect of (human-induced) greenhouse gases themselves. That the GCMs can simulate the Arctic ice retreat is nice, IF they also simulate the observed change in cloud cover (and the resulting difference in summer/winter trends). If not, then they have just been lucky, and are not reliable for any future projection…
Wayne, you are right about the role that clouds have, but the Science article mentioned before says that the observed trends in cloud cover show a decadal increase of 3.1% and 1.5% in spring and summer, little trend in autumn, and a decrease of 5.7% in winter. The spring and summer increase leads to less warming than without cloud cover changes, and the winter decrease leads to lower temperatures and more refreezing. Fall cloud cover may retain more (summer) heat, but on the other hand reflects more (decreasing) sunlight. I have no idea where the balance is going if fall cloud cover is larger this year…
#125 Ferdinand, from experience the yearly seasonal increase in clouds (in mid-April) was phenomenal, and cloud coverage persisted from that time till the end of October. What happened in 2005 was equally interesting: an apparent net decrease in March/April clouds played a contributing role in warming things up, while fall of 2005 was mostly cloudy, but a little less than 2003-2004. Things are not quite ‘normal’ this year.
Stephen, it is not about absolute figures, but about the difference in trends between summer and winter. Until 1945, there was no trend in summer or winter ice cover (as far as the data are reliable in the pre-satellite era), and the average difference between summer and winter ice cover is some 4.5 million km2 over the whole period 1900-1945. After a small increase and a sudden downward shift in 1953, the summer ice trend 1955-2005 is -2 million km2, while the winter ice trend is -0.5 million km2. In 2004/2005, this leads to more summer melting and more winter refreezing, totalling some 7 million km2. Thus while there is more summer ice melting, in winter most of the Arctic summer melt refreezes. Of course, the larger summer ice melting lowers the yearly averages, even if all the water were to refreeze in winter.
The interesting part is that both summer warming and winter refreezing are largely influenced by cloud cover changes, which need more investigation (probably AO induced now, but what will happen if there is a shift in AO index?).
There is indeed such a mechanism, the carbonate-silicate weathering cycle first recognized by Walker et al. in a famous paper in 1981. Hart (1978) had suggested that continuously habitable zones around Solar-type stars were extremely narrow and thus habitable planets might be rare. Walker et al. pointed out that when CO2 and temperature go up, so does evaporation, thus weathering, thus CO2 deposition in the ocean, so it acts as a brake. When CO2 goes down and the Earth cools, ice covers land and evaporation decreases, so less weathering takes place and volcanism builds CO2 back up again. Applying this, Kasting (1993) was able to show that habitable zones around Solar-type stars were actually quite wide. Earth-type planets far from their primary star (not ridiculously far, but say out to 1.7 times Earth’s distance for a primary with 1.0 x Solar luminosity; Forget and Pierrehumbert 1997) will not have runaway glaciation as in the old energy balance models of Budyko (1969) and Sellers (1969). They will be warm because of lots of CO2 in their atmospheres. (Humans won’t be able to breathe such air, by the way: the high CO2 will induce narcosis. But life that evolved there will do just fine.)
But this doesn’t mean that such natural feedbacks will constrain an artificially rapid rise in CO2 like we’re seeing now. The difference in time scale is crucial. By overusing fossil fuels we can bring the temperature way up fairly quickly. In the long run things will stabilize, but as Keynes said, people don’t live in the long run.
As the unidentified commenter mentioned in comment #118, I’ve finally got a handle on how to communicate with you. I mistakenly used “arctic ice sheet” instead of “arctic sea ice”; I think I can clear up that matter with Mark.
Let me get to the point of this query. I posted several questions on the comment thread of the GM Corner post in question, and Mark suggested I contact you folks directly. I hope this is the appropriate thread for it.
1) Do other climate models which predict future temperature changes use the same proxy data set and methods that you used in the MBH model? Or to put it another way, are there other truly independent published calculations of future global temperatures using different data sets and/or methodology? If so, this would make the whole MBH versus M&M controversy of much less importance than skeptics attach to it — skeptics seem to assert that the scientific consensus in favor of a genuine warming trend of historical concern rests squarely upon the validity of the MBH calculations.
(If you discussed this on previous postings that included specific citations, feel free to refer me to those rather than wasting your time rehashing the same points here.)
2) What are the signal to noise ratios regarding CO2 (and CH4 and other GHGs) versus other inputs? That is, how sensitive are climate models to detecting the magnitude of global temperature change in response to GHGs compared to the variability of other sources of warming (e.g. solar cycles, air-sea interactions, albedo changes)?
3) What are the relative magnitudes of anthropogenic GWG effects versus the error bars?
Again, if you have previously dealt with these questions, please point the way.
[Response: You seem to be a little confused. MBH is not a computer model in the sense you mean at all, it was a methodology for reconstructing temperatures in the past. All of the attribution studies to explain the 20th Century warming, and to extrapolate how warm it is likely to get are based on GCM models. (see here or here for more info). If MBH disappeared, very little about the current consensus would change (see here). The relative magnitudes of the effects of well-mixed greenhouse gases (CO2, CH4, CFCs, N2O) are very large compared to intrinsic variability, and all other forcing factors (except aerosols, and volcanos) - at least in the global mean temperature diagnostic (it's not so clear regionally). You can assess that for yourself by looking at some recent modelling studies such as described here. -gavin]
I’m a layperson, who has read the relevant articles and responses in this debate extensively, and I still don’t quite understand why tree-ring time series that differ significantly from a mean centered on the 20th century should be more significant than others (hopefully I’ve stated that coherently.) Why is the mean for the 20th century more relevant than a mean averaged over the entire period? Is this a part of the stepwise analysis–that is, 20th century data is more accurate than data prior to that period? Is the idea, then, that series which more faithfully replicate the other 20th century indicators are then taken to be more accurate for past periods than other series?
Re#118: “Where would the water go if not adding to the volume of water in the connected oceans?”
In regard to the ocean-based ice: put some ice in a glass and fill the rest of the glass with water until it’s close to full. Keep an eye on the water level as the ice melts.
When the ice melts, the volume of water it “adds” to the surrounding water is counteracted by the volume of ice that is lost during the melting process.
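The glass-of-ice experiment follows from Archimedes’ principle: a floating body displaces its own mass of water, so the meltwater exactly fills the volume the ice was displacing. A minimal sketch, using round fresh-water densities as a simplification (real sea ice in salt water is slightly more complicated):

```python
# Archimedes check for the ice-in-a-glass experiment described above.
RHO_WATER = 1000.0  # kg/m^3, fresh water (illustrative simplification)
RHO_ICE = 917.0     # kg/m^3, typical ice density

ice_volume_m3 = 1.0
ice_mass_kg = RHO_ICE * ice_volume_m3

# While floating, the ice displaces a volume of water equal to its own mass:
displaced_volume_m3 = ice_mass_kg / RHO_WATER

# Once melted, the same mass occupies exactly that volume as liquid water:
meltwater_volume_m3 = ice_mass_kg / RHO_WATER

net_volume_change_m3 = meltwater_volume_m3 - displaced_volume_m3
print(net_volume_change_m3)  # 0.0
```

The simplification is the reason the moderator above says melting sea ice would affect sea level only “a tiny bit” rather than not at all: real sea ice is fresher than the salt water it floats in.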
The wording on that site may be poor, but it is true that the melting of ice currently floating in the ocean would have little to no effect on sea levels, while the melting of land-based ice such as that in Greenland and Antarctica could have a major impact.
Comment by Michael Jankowski — 17 Nov 2005 @ 7:46 PM