I was wondering if anyone could help me understand the mechanism through which the AMO (Atlantic Multidecadal Oscillation) could affect global temperatures. A recent paper has found that it is contributing to the current warming (termed the IMP by the paper).
I know there are a lot of climatologists out there who are not so big on the AMO, but there seems to be plenty of evidence that it is a dominant driver well beyond the North Atlantic.
If the models were shown to be deficient in some specific area and to need significant rework, what impact would you see that having on the thousands of papers that have relied on them to this point, and on climate science in general?
[Response: And if the moon were made of green cheese, what impact would that have on space science in general and on the astronauts who walked on it? Please don’t play games. – gavin]
Robert @2 — Thanks for the interesting link. Is there a published paper for which you can provide the citation?
I’m certainly an amateur at this. It seems that the AMO is an index of MOC rate variation (amongst other things). So the AMO gives some guide to the interaction between the deep ocean and the rest of the climate system.
Comment by David B. Benson — 21 Oct 2010 @ 7:49 PM
I am interested in citations for the seminal works in radiative transport through the atmosphere, as well as a link to any documentation on the most widely used transport models.
I misspelled my first name in the name box: it should read "Harold". How do I fix this?
Comment by Harold Pierce Jr — 21 Oct 2010 @ 8:24 PM
Gavin says “And if the moon were made of green cheese, what impact would that have on space science in general and on the astronauts who walked on it? Please don’t play games. – gavin”
Whether the recent discovery of a possible deficiency in the models turns out to be significant or not, this is a legitimate question to be asked and answered. Think of it as disaster mitigation for science. We do it for business; I don’t see why science should be exempt, now that science has tied itself more closely to results-based funding than ever.
[Response: Coyly hinting at some super-secret deficiency you think you’ve discovered or read about, but not actually saying what it is, is just playing games. If you want to talk about something specific, do so. – gavin]
I was sent a link to an article (see below) that makes me wonder how "bad" is "bad"? Do we really know if we are headed for "catastrophe"? How do we know?
How do we know things are “urgent” and we must put limits on fossil fuel use? How urgent is urgent?
When you read it, focus on how bad things could get (i.e., catastrophe) and not on the other junk in the article. The remarks about Climategate or the hockey stick are pretty irrelevant to what I am concerned about.
If we can’t convince people of how serious things are, they won’t support any kind of regulation. Is there any proof we are headed for "disasters"? Suppose we do implement some form of regulation… will that really put a dent in global warming?
I am sorry to sound so down but that article made me ask these kinds of questions.
Because water vapor trends are critical to the issue of positive feedbacks, the Dessler et al paper indicating that four out of five reanalyses show positive trends with warming is an important step in providing a realistic perspective. The one deviant, the NCEP/NCAR reanalysis, was the one effort that utilized only radiosonde data without availing itself of the newer and more accurate satellite-based methods. The downward trend in that dataset has been analyzed by Yang et al, with evidence indicating that each downward step followed a change in instrumentation that diminished what were previously upward biases. In that sense, some of the mystery surrounding the spurious trend appears to be resolved. The link is: NCEP Reanalysis
The AMO and its cousin the PDO (Pacific Decadal Oscillation) have shown good correlations with the sinusoidal-type oscillations in the 20th-century temperature profiles. I will try to describe them in a nutshell. Both oceanic indices are termed positive when warmer water flows poleward, and negative when the warmer water flows towards the equator. The indices become more positive (negative) as the water becomes warmer towards the poles (equator). As warmer water flows poleward, it warms the winds that sweep over the ocean, and warms the nearby land regions. During the periods of greatest positive indices, the land masses show higher temperatures (relative to recent negative indices). These indices tend to remain in one particular mode for extended time periods (decades). The PDO affects ENSO events; positive PDOs lead to greater El Niños and lesser La Niñas, and negative PDOs lead to the opposite. Positive AMO values lead to greater melting of Arctic sea ice. Some oceanographers claim that these cycles are responsible for most of the observed warming. Prof. Latif has postulated that the switch to negative AMO indices will lead to 2-3 decades of cooling, followed by warming. I do not have the link handy.
Climatologists tend to dismiss these explanations, as they minimize the CO2 effect on temperature.
The unanswered question is what causes these indices to change. Some speculate that they are tied to solar variations.
I hope you folks don’t find this one too far off the usual topics on RC, but I found it to be of great interest. On Oct 7th the following news was released.
Spire Semiconductor LLC has claimed achievement of the world efficiency record for a concentrator photovoltaic (PV) solar cell.
The Hudson, NH-based company said its cell was measured by the U.S. Dept of Energy’s Natl Renewable Energy Lab to have a peak efficiency of 42.3% at 406 suns AM1.5D, 25C (42.2% at 500 suns).
Spire Semi said this came about after it won a $3.7 million subcontract from the agency in early 2009 to develop high-efficiency, triple-junction gallium arsenide (GaAs) solar cells. A typical rooftop solar installation today has an efficiency of about 12%.
‘In less than 18 months, we were able to validate and incorporate our new concept into a production-ready cell design with world-record efficiency,’ said Ed Gagnon, general manager of Spire.
Gagnon said the higher efficiency cell platform is now commercially available to concentrator systems providers. This advance will help ‘to move solar energy ever closer to the goal of grid parity.’
The above news release seems to me to be big news. I’ve not read a thing about it in the MSM, for reasons I can’t understand. Does anyone have any additional information or comment on the above?
You always have to be careful saying this, because there are a lot of people waiting to distort it, but a lot of researchers do feel that some recent warming in the North Atlantic in particular has a lot to do with internal variability. There’s a lot of research into understanding the high-frequency and multi-decadal departures of the Northern Hemisphere mean temperature from a uniform warming trend. This is particularly important for the attribution of hurricane trends and also for decadal prediction. Note that there’s really no evidence for the long-term warming trend having any significant component from internal variability.
Try running a multivariate regression between Temperature (Y) with CO2 and the AMO (x-variables).
The predicted values versus the actual temperature values are pretty shocking and obviously flawed somewhere…
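For anyone who wants to try the regression Robert suggests, here is a minimal sketch. The data below are synthetic stand-ins (an accelerating CO2 ramp and a ~65-year sine for the AMO), purely to show the mechanics, not real records:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 130  # illustrative: roughly annual values, 1880-2009

# Synthetic predictors (stand-ins, NOT real observational records):
co2 = 290 + 100 * (np.arange(n) / n) ** 2          # ppm, accelerating rise
amo = 0.2 * np.sin(2 * np.pi * np.arange(n) / 65)  # ~65-yr oscillation, deg C

# Synthetic "temperature" built from both, plus noise
temp = -0.1 + 0.008 * (co2 - 290) + 0.8 * amo + rng.normal(0, 0.08, n)

# Multiple regression: T = b0 + b1*CO2 + b2*AMO
X = np.column_stack([np.ones(n), co2, amo])
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
predicted = X @ coef

r2 = 1 - np.sum((temp - predicted) ** 2) / np.sum((temp - temp.mean()) ** 2)
print(f"b_CO2={coef[1]:.4f}, b_AMO={coef[2]:.3f}, R^2={r2:.3f}")
```

One caveat worth keeping in mind with the real series: since the AMO index is itself derived from (detrended) temperatures, regressing temperature on it risks a degree of circularity, so a high R² by itself proves little about causation.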
My understanding is that the AMO has an impact on global temperatures in its own right. The paper cited above (it is peer-reviewed; the link to the journal is elsewhere) seems to say that the IMP, or AMO variability, is a dominant signal in current warmth on a global scale.
Indicates a predominantly positive AMO during the MWP, negative during the LIA and currently positive.
I am certainly not a skeptic, but I am interested in how much the AMO is influencing global temperatures. It would be interesting to get some feedback on the mechanisms. My theory is that it has a somewhat solar origin over long time scales. Any ideas?
Gavin says “If you want to talk about something specific, do so. – gavin”
It looks like my being specific was just moderated into oblivion. Sooner or later we’re likely going to have to deal with the point I raised, and I think it would be better to do it in a planned, controlled fashion rather than in damage-control mode.
[Response: Spare us. Models are used because they work, not because they are some pure deified output of our reasoning. No supposed deficiency takes away from the already demonstrated skill – how could it? Can they be better? Sure, but your imaginings of some huge looming crisis are simply fantasy. – gavin]
The fifth Mojave Desert giant solar project (600+MW from Stirling engines) has been approved by the Feds. The next CA State hearing will be Oct 28. Here are some nice photos from one of the organizations that opposes it.
Comment by Septic Matthew — 22 Oct 2010 @ 12:42 AM
(Comments by TimTheToolMan)
For my introductory survey on sea level rise, I’m about to draft a few pages on possible impacts on Chinese coastal cities. I’ve run through all the Google entries on this subject I can find but want to know if any list members can further enlighten me on it. If so, please do so off-list at email@example.com
[Response: I know what he is referring to, I’m ‘Gavin’ on that thread. But what I object to are coy hints of some looming issue that everyone is running scared from. That is simply spin for the unwary. This is not as exciting as the authors claim, and the response to their earlier claims is full of very good reasons why. – gavin]
It’s become a popular contrarian argument to propose that EVERYTHING that is measured, researched, and evident relies on modeling, which is ironic, since in reality it’s very much the other way around…?!
It’s difficult sometimes for me to understand how not EVERYONE has learned to recognize classic logical fallacies, media bias and corruption, and simply unreasonable reasoning. I guess what some say about our education system’s failures is true after all… of course, they also say "you can lead a horse to water…" The difference there is, of course, that if you stop pushing, the horse will soon realize it’s thirsty, stop being stubbornly proud, and take a drink…
Robert – I’m only skimming this paper, but I’m not getting out of it what you are, and I also don’t see the utility in your multiple-regression approach. Keep in mind that the AMO refers to detrended SST anomalies. This is hard to do because there’s not really an easy way to decompose the signal and the residual (you need a model that provides a control with no anthropogenic forcing). But you should read this post from RC from several years ago on this issue of the AMO, since I’m not too well-acquainted with this stuff (maybe Dr. Mann can chime in): http://www.realclimate.org/index.php/archives/2006/09/tropical-ssts-natural-variations-or-global-warming/
Gavin says – “Can they be better? Sure, but your imaginings of some huge looming crisis is simply fantasy. – gavin”
This is why I’ve explicitly tried to disassociate this discussion from any work on models.
So am I to assume that you don’t think it would be a worthwhile discussion, because you believe that the models will always be "valid" no matter what is discovered about them or the earth’s climate processes in the future?
And consequently, any papers that use today’s models and come to conclusions based on the models’ results are equally going to remain valid into the future?
[Response: That is a ridiculous false dilemma, implying that if I think that models have been skillful, they therefore must be perfect and, presumably, incapable of improvement. What is wrong with ‘yes, models have been shown to be skillful, and yes, they can be improved, and will be in the future’? Obviously, this means that some results from today might be changed, but as I stated, where models have already shown skill, that doesn’t go away. And where models support conclusions from data, that isn’t going away either. So climate sensitivity is still around 3 deg C. Sorry. – gavin]
Part of the reaction toward you is based on your history, so don’t act surprised, but your question makes absolutely no sense. We have come to learn that the world we live in is governed by the laws of physics. A large component of climate models is built on these basic physical laws (conservation of mass, momentum, energy, etc). This provides us with great diagnostic and predictive power which has repeatedly proven to be incredibly useful. Perfection is not the goal.
We can understand key features of the atmosphere (including the climate and the sensible weather) just from these really basic principles. For example, if you have more divergence of fluid in a column of atmosphere at the top than at the bottom, then you have to move air upward in line with pure physical intuition and conservation of mass. For synoptic-scale meteorology, I can then tell you this rising air will probably mean a cloudy day over you. If the air is sinking, the skies will be clear.
If you create a horizontal temperature gradient (say between the equator and pole) then through basic manipulation of momentum and hydrostatic equations we can deduce that the wind speed must increase with height in the atmosphere. This happens, manifested as jet features. You can see it if you rotate a tank of fluid no bigger than your table (MIT has excellent experiments in rotating tank lab online). If you pour dye into the tank you will see a circulation that looks like a Hadley cell. Turn up the rotation and you’ll see that energy is transferred by eddies and waves instead of a nice overturning cell, as what occurs in the mid-latitudes.
Operational satellite meteorology relies heavily on radiative transfer, and on the theory and relationships used in models being useful. Spectral features and lapse-rate effects are how we see into the atmosphere and distinguish between the surface and the top of a cloud. Stellar astrophysicists use these same principles to find out information about stars.
Why am I rambling about all of this and throwing factoids around? There are many, many examples of simple physical lines of argument that give incredible insight into how atmospheres work. I encourage you to take some quantitative-level classes in climate or meteorology to really convince yourself (and fascinate yourself) with how much the physics works. GCMs take this physics, countless pieces of it, and provide us with invaluable information. That models are all wrong is just not possible; it’s a hypothesis, but one easily demonstrated to be false. That said, that models are imperfect is also easy to demonstrate. The goal of the scientists now is to show whether these deficiencies matter (which certainly depends on the question of interest; the radiative forcing for CO2, for example, has nothing to do with a small deficit in model precipitation over western Wisconsin). Furthermore, what matters scientifically may not be interesting enough to matter for policy.
If we fix a cloud parametrization and get better droplet formation or something, then does Joe down the street really care? If we improve hurricane track forecasts though, people will probably care. For climate change, there’s lots more (regional precip changes for example) that people care about which can be improved, but what do you think is going to change about CO2’s role in temperature change? Do you think some magic stabilizer which has never worked in Earth’s history, that models don’t develop, that no observations have ever seen, etc is going to suddenly turn on and save us? That’s a bit more far-fetched than the moon made of cheese I think.
Gavin, on a “perhaps” related topic to #15 above, I have been looking at the eqns in the CAM3 description related to moisture and convection to get a better handle on it. I am not a modeller, but my question is pretty basic. Do most of the current models use similar procedures for entrainment/detrainment in convective air masses as used in CAM3?
Since nobody at RealClimate seems to have much faith in or enthusiasm for the potential of the Atmospheric Vortex Engine as a technology for mitigating or arresting climate change, in spite of the existence of natural "models" for it in nature, perhaps someone here might suggest the name of a researcher, company, or "angel" who might have both the interest and the budget (~$100,000) to actually build a captive waterspout about 50 feet in diameter, for which I have developed a simple design.
I trust that if one could be built that exhibits impressive results, I could get one or more of you on board, along with myself and Don Cooper of Australia, as official endorsers of this potentially "messiah" technology, as one that at least merits further developmental efforts – if not by "our" government, then by the one that is currently eating our breakfast, lunch and dinner (you know, the one that now emits as much greenhouse gas as we currently do).
I heard a rumor that a paper was on its way explaining why "Potential Dependence of Global Warming on the Residence Time (RT) in the Atmosphere of Anthropogenically Sourced Carbon Dioxide" by Robert H. Essenhigh (http://pubs.acs.org/doi/full/10.1021/ef800581r#references) is wrong. However, that was some time ago; does anyone have news on that?
Tim the Toolman,
OK, Tim, ‘splain me this: Where on Earth did you guys get the idea that if somehow the models were found to be flawed, the climate crisis would go away? Do you think temperatures would somehow miraculously return to pre-industrial levels, that the glaciers would come back, that the pH of the oceans would be restored, and that all the extinct species would come back?
There is an astounding amount of evidence that global temperatures have risen rapidly in the past 40 years. It is happening. There is overwhelming evidence that CO2 is a greenhouse gas, and that the climate’s sensitivity to it is more than 2 degrees per doubling (or do you think Arrhenius used a computer model?). Indeed, climate models are instrumental in placing an upper limit on climate sensitivity! Without this upper limit, there would be even more reason to slam the brakes on CO2 now.
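The "per doubling" framing, by the way, reflects the roughly logarithmic dependence of CO2 forcing on concentration. A back-of-the-envelope sketch, using the standard Myhre et al. (1998) simplified expression for the forcing; the 3 C sensitivity here is an assumed input, not something derived:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Myhre et al. (1998) simplified expression, in W/m^2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def equilibrium_warming(c_ppm, sensitivity_per_doubling=3.0, c0_ppm=280.0):
    """Equilibrium (not transient) warming in deg C for an ASSUMED sensitivity."""
    doublings = math.log(c_ppm / c0_ppm) / math.log(2.0)
    return sensitivity_per_doubling * doublings

print(round(co2_forcing(560), 2))          # forcing for a doubling: ~3.71 W/m^2
print(round(equilibrium_warming(560), 1))  # 3.0 C, by construction
print(round(equilibrium_warming(390), 2))  # ~2010 concentration: ~1.43 C
```

Note this gives equilibrium warming; the transient response at any given date is smaller because of ocean thermal inertia.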
Now, there is no evidence that climate models are seriously wrong. None. However, I’ll never understand why the denialist contingent thinks their being wrong works in their favor. Uncertainty cuts both ways, and the blade is a whole helluva lot sharper on the high-sensitivity side than on the low.
Tim the Tool said:
“If the models were to be shown to be specifically deficient in some area and need significant rework what impact would you see that having on the thousands of papers that have relied on them to this point and of climate science in general?”
This seems like a leading question, and you know the answer you want.
The science would continue in whatever direction, like science always does.
However you seem to be searching for a result that satisfies your views, which isn’t exactly unusual in business.
I question whether you are interested in the science or are more interested in outcomes.
It sure appears as if Dessler pretty much demolished Lindzen. However, I would like to hear if somebody could elaborate more on the uncertainties in the four feedback "components" in the classic sensitivity equation he showed (water vapour, lapse rate, ice albedo and clouds). Dessler showed nicely how water vapour was positive and gave a very nice demonstration (the best I have ever seen) of how Lindzen is highly unlikely to be right about the cloud feedback (80% likely to be positive, most probable value 0.15, uncertainty interval -0.04 to +0.34). This gives pretty much the sensitivity interval of around 1.7-4.5C.
However, I would like, just for the record, to see some uncertainty intervals around the values he showed for water vapour (+0.6), lapse rate (-0.3) and ice albedo (0.1). If anyone can point me to such material, I would much appreciate it.
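The bookkeeping behind such a sensitivity equation can be sketched with the standard linear feedback decomposition. The feedback values below are illustrative AR4-era central estimates in W/m²/K, not the exact figures from Dessler's talk, which I don't have to hand:

```python
# Equilibrium sensitivity from the standard linear feedback decomposition:
#   S = F_2x / (lambda_Planck - sum(feedback terms))
# All feedback values below are ASSUMED illustrative central estimates.
F_2X = 3.7            # forcing for doubled CO2, W/m^2
LAMBDA_PLANCK = 3.2   # Planck (no-feedback) response, W/m^2/K

feedbacks = {
    "water_vapour": 1.8,
    "lapse_rate": -0.8,
    "ice_albedo": 0.3,
    "clouds": 0.7,
}

def sensitivity(fb):
    return F_2X / (LAMBDA_PLANCK - sum(fb.values()))

print(f"no feedbacks:   {F_2X / LAMBDA_PLANCK:.2f} C")
print(f"with feedbacks: {sensitivity(feedbacks):.2f} C")

# Because the feedbacks sit in the denominator, uncertainty in any one term
# moves the total nonlinearly -- e.g. a cloud feedback of 0.15 instead of 0.7:
low_clouds = dict(feedbacks, clouds=0.15)
print(f"cloud feedback 0.15: {sensitivity(low_clouds):.2f} C")
```

This denominator structure is why the uncertainty interval on sensitivity is skewed toward the high side: the same ± spread in a feedback term widens the upper bound more than the lower one.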
Comment by Christoffer Bugge Harder — 22 Oct 2010 @ 5:59 AM
When discussing the AMO you first have to keep in mind what the AMO is: it is simply the detrended SSTs of the North Atlantic. Therefore it represents a priori the evolution of temperature itself, and not any physical process which influences temperature. If you detrend global temperature you get pretty much the same pattern as the AMO. This pattern of global temperature can be explained by a combination of different influencing factors over time (increasing solar activity plus moderately rising greenhouse gases until 1940; cooling through a strong increase of aerosols, the fading of the solar activity increase, and a still-moderate GHG rise until 1970; a strong increase of GHG emissions and a reduction or inversion of the aerosol increase and the related cooling after 1970). There is no reason which a priori suggests that the temperature evolution in the North Atlantic should be significantly different from the global one. If there is a high correlation between temperature in the North Atlantic (NA) and global temperature, this can be due to a reaction of both geographical entities to the same external influence (which seems not unlikely…), due to an influence of the NA on the global scale, or due to an influence of the global evolution on the NA. There is a paper (Elsner 2007: myweb.fsu.edu/jelsner/PDF/Research/Elsner2007.pdf) suggesting that an influence of global temperature on NA temperature is more likely than the other way round.
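Since the AMO index is, as noted, just detrended North Atlantic SST, the computation itself is simple. A sketch on synthetic data (the series, the trend, and the 65-year period are illustrative assumptions, not observations):

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1880, 2011)
n = len(years)

# Synthetic annual-mean North Atlantic SST anomaly (illustrative only):
# a linear warming trend plus a ~65-yr oscillation plus noise.
trend = 0.007 * (years - years[0])
oscillation = 0.2 * np.sin(2 * np.pi * (years - 1880) / 65)
na_sst = trend + oscillation + rng.normal(0, 0.05, n)

# AMO index: remove the linear trend from the area-mean SST...
fit = np.polyval(np.polyfit(years, na_sst, 1), years)
amo = na_sst - fit

# ...and apply a multi-year running mean, as is commonly done
kernel = np.ones(11) / 11
amo_smooth = np.convolve(amo, kernel, mode="same")

print(f"raw SST trend: {np.polyfit(years, na_sst, 1)[0] * 100:.2f} C/century")
print(f"AMO index mean: {amo.mean():.4f} (detrended, so ~0)")
```

The sketch also illustrates the ambiguity: any nonlinear forced warming that a linear detrend fails to remove ends up inside the "AMO", which is why some definitions subtract the global-mean SST instead of a linear trend.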
Nevertheless, there is some evidence that the AMO, i.e. in fact NA SSTs, might be influenced by variations in the Meridional Overturning Circulation (MOC) of the ocean (Delworth and Mann 2000, Vellinga and Wu 2004, Knight et al. 2005, Latif et al. 2004). Climate models produce natural variations of Atlantic SSTs due to MOC variations on a timescale similar to the AMO’s, but with a wide range of frequencies (50-130 years), compared to the AMO frequency of 50-70 years. The subtraction of the greenhouse warming effect calculated by climate models from global temperature exhibits a geographical pattern which is similar to the one attributed to the AMO (Kravtsov and Spannagle, 2008). Furthermore, Baines and Folland (2007) have shown that a number of signals over the globe can be attributed to North Atlantic cooling (and thus NA SST variation). Thus it seems likely that there is some signal related to MOC variations, which seems to have an influence on natural variations of the global temperature.
However, these natural variations linked to MOC mainly explain residual natural variability which is superposed on greenhouse warming and do not in any way represent an alternative explanation of ongoing global warming. Furthermore, there is no observational evidence that the observed AMO (and recent NA SST increase) is actually linked to real MOC variations, since we don’t have any useful observations of recent decadal MOC behaviour so far. The suggested link is only based on a rough similarity of frequencies of two signals without the possibility to position the modelled variation on a real time scale.
In view of all this, I won’t put much stock in any predictions related to the AMO. There are fewer than two AMO cycles in the instrumental period (which makes the frequency-band estimation very uncertain), and reconstructions (e.g. Delworth and Mann 2000, Gray et al. 2004) show frequency variations (40-150 years) as substantial as the models’ (50-130 years). If we don’t know whether the next AMO cycle starts after 40 or after 150 years, I won’t bet on a cooling during the next decades.
I read the article to which you linked. I think it describes the position of those who believe that CO2 is partly responsible for the observed warming, but that other factors were also involved. Warren Meyer divided global warming into two theories (although I think this was more to make his point than because the theories are separate). The first is that the additional CO2 in the atmosphere will lead to warming due to the chemical and physical properties of the gas. This is generally believed by all (except for a few deniers), and often referred to as the "consensus." The second part is that feedbacks dominate the climate system such that the increased CO2 will lead to catastrophic warming. This is where the skeptics and alarmists differ. In fact, there is a wide range of opinions among both skeptics and alarmists as to the actual feedback effects. I have seen many people err in assuming that the first "consensus" can be transferred to the second. I have also seen the most fervent believers in global warming refer to those who subscribe to theory #1, but not #2, as deniers, when in fact they believe that CO2 has contributed to the observed warming, but not that it will accelerate to create a catastrophe. From my encounters, skeptics are usually better informed scientifically than either the deniers or the alarmists.
This is good reading for those who do not understand that there is a difference between those who "deny" global warming and those who refer to themselves as "skeptics."
[Response: Sorry, but this is simply an attempt to reframe the ‘center’ of the debate. The mid-range of climate sensitivity is around 3 deg C – it might be a little less, it might be a little more – but this is what you get if you actually look at how climate has changed in the past. It also happens to be the mid-point of what the climate models suggest – though you are welcome not to pay attention to that if you want. The idea that 1.2 deg C is the midpoint and only ‘alarmists’ think it is bigger is nonsense. A sensitivity that low makes it impossible to explain ice ages, hothouse climates, etc. So no, this is not the difference that defines ‘skeptics’. People who are in denial that there is plentiful evidence for net positive feedback are just the same as those who deny the evidence for rising GHGs, or the existence of the greenhouse effect, or the existence of internal variability, etc. etc. – gavin]
Dan H. (11): Actually, no oceanographers claim that the AMO/PDO cause ‘most of the observed warming’. I would suggest that you read Latif before quoting him. Essentially all oceanographers believe that the AMO/PDO are important decadal modulations of the global warming trend. The magnitude of the temperature change caused by these modulations is on the order of 0.1 deg C, versus a trend of about 0.8 deg C.
Since it is clear that you don’t actually have contact with climatologists, why do you claim that they tend to dismiss these oscillations?
Suggestion for future posts: regional climate change. Pick some example and go deeper about risks, likely consequences, uncertainties. Amazon, southern Europe, Africa, US would all be interesting subjects.
And thanks for the time and attention you dedicate to this forum.
[Response: Good suggestion on a very important topic and thank you for the compliment; it is appreciated.–Jim]
…but that article made me ask these kinds of questions.
I don’t see how it could. The article itself is just a morass of denier tripe. I don’t see how you can separate out the "catastrophe" talk from the rest of it, because it’s all tangled together into a mess of misinformation.
The basic point seems to be (1) even if AGW is true, “they” are exaggerating how large the temperature change will be and (2) “they” are exaggerating what the effects will be by attributing every disaster to AGW.
We’ve all seen this tired stuff before and it doesn’t even merit any further discussion. In a nutshell:
(1) No, the evidence for climate sensitivity (2˚C-3˚C per doubling CO2) is very strong, and a lot of different lines of reasoning all lead to the same conclusion, while “hoped for” denial lines of reasoning have all been repeatedly debunked. The claim that warming will be limited to 1˚C because the positive feedbacks will not materialize is unsubstantiated wishful thinking which is quickly swallowed by the willfully, eagerly ignorant. You can’t help those people. If they were told that the Flying Spaghetti Monster would never allow the planet to warm more than 1˚C, they’d abandon Christianity in a blink, at least as far as AGW goes.
(2) The attribution of disasters to climate change, or rather the lack of clear, justifiable attribution of noteworthy events to climate change, is a recent, common denial tactic (and that’s all it is, a tactic).
First, everyone who understands the problem knows that we are talking about the impacts 30 to 100 years down the line, with the understanding that if we don’t get the problem reasonably under control now, it will be too late for the next generation.
Second, no one with any credibility is saying that warming to date or in the near future is going to create category 6 hurricanes or melt all of the ice in the Arctic year round. Exaggerations like that are sometimes made by the media or deniers themselves, by quoting the end points of ranges from scientific statements, but this is a combination of media hyperbole and the willingness of deniers to highlight those end points to try to ridicule the science (and it works with the willfully ignorant).
Third, many of the scientific references to attribution to current events are merely studies. They are just scientists who are saying “this is interesting, I wonder if this is caused by warming, and if so, is there any scientific way to prove it?” The fact that someone initiates such a study does not mean that such attribution exists or is provable. It’s just useful science to perform at this point in time.
So what you have is a falsehood (that warming will be held to 1˚C), a distraction (pointing to the extremes in ranges, or media hype, or new studies interested in determining attribution), and a misdirection (trying to convince people that a long term problem should be ignored because they don’t see the effects yet).
Of course this is all capped with the last denial meme, that the proposed solution is to destroy the global economy, which is absolutely not the case. All we need is moderation, and to get started in a meaningful way on the problem now. The only way it will destroy anything is if we wait twenty years to do anything at all about it.
So, as I understand it, the AMO, PDO, and ENSO are all just ways that the ocean redistributes its heat. Thus they influence the weather in land masses near them. Global warming, on the other hand, is the increase in the TOTAL heat content of the oceans and the land masses. The increase in the TOTAL heat content of the oceans may affect the strength and periods of the ocean’s heat-redistribution networks.
TimTheToolMan’s fantasy ties into a discussion I had recently about scientific upheavals.
For Tim’s benefit, here is the bottom line:
Yes, sometimes the textbooks are rewritten. But this is very rare, and when it happens, we don’t throw out everything that came before. Classical mechanics may be "wrong" now that we know about relativity. But classical mechanics still serves us well – Newton is taught every day in schools, and classical equations are constantly used by engineers for real-world problems.
So what if a fundamental problem is found with climate science? No scientist can rule this out with 100% certainty, because that’s not how science works. But if a problem is found, then we will need new explanations for many, many questions that are already neatly answered by our current integrated, self-consistent understanding of climate. Questions such as what causes ice ages, why the planet is not an icy snowball, or where all the carbon we burn disappears to will suddenly have no answer. Are you really so arrogant that you think that climate science got absolutely everything wrong?
Can climate science be overturned like this? No. Any new revolution will be like Relativity, building on what came before. And most actual changes will be small, and within existing uncertainties.
It’s like a jigsaw. There may be missing pieces still, but we can already see the big picture. Therefore, we can be absolutely confident that most of the pieces are in the right place.
#9–David, I’m afraid I can’t give much credence to an article that says:
Further, few skeptics deny that man is probably contributing to higher CO2 levels through his burning of fossil fuels, though remember we are talking about a maximum total change in atmospheric CO2 concentration due to man of about 0.01% over the last 100 years.
Any reasonable person would think that 0.01% referred to CO2, not total atmospheric content! For “atmospheric CO2 concentration” the correct (i.e., “non-misleading”) number would be 30%+.
But as to how we know we have problems:
1) Climate sensitivity has been much-studied by multiple methodologies–modeling, paleoclimate analysis–and keeps coming out at about 3C, plus or minus 1.5. Some basics:
2) This summer saw weather disasters that cost tens of thousands of lives and tens of billions of US dollars. We can’t definitely attribute the specific events to AGW, but they are just what we may expect to see more of, according to multiple studies. And it remains a real possibility that these events might not have happened without the effects of AGW. So it’s possible, but unprovable, that we just experienced a couple of “climate catastrophes.”
3) Coincidentally, we just yesterday were taking note of this study on another RC thread:
Look at the Drought Index figures carefully. You will note that the forecasted values in many instances are considerably more severe than those experienced in the infamous “Dust Bowl” years, commonly called the “Dirty Thirties.” Those years triggered a massive internal migration in the US, reflected in works such as “The Grapes Of Wrath,” “Of Mice And Men,” and a good deal of the work of Woody Guthrie. (NB-A great-uncle and great-aunt of mine were among those “droughted out” of Saskatchewan–after about 20 years of back-breaking labor to establish their homestead. It must have been heart-breaking to have to return East and beg family assistance.)
This article puts the number of US citizens displaced at 2.5 million:
Oh, I should have mentioned the Gwynn Dyer book “Climate Wars,” which considers security implications of CC. Another denialist tactic is to carefully avoid thinking about second-order consequences (or “knock-on effects,” in the British idiom the linked article uses.) But the “real world” has no such compunctions about its actual workings.
I reviewed “Climate Wars” here, in a piece which will give the gist:
“[I]f you want to discuss something you read in the media, saw in a press release or just wanted to ask about, this is the time.” Apologies if this has been covered at RC already and I missed it, but I’d be grateful to learn RC’s take on that Wikipedia climate-science stuff that the Wall Street Journal was crowing about the other day in an editorial headlined “WikiPropaganda.” The editors praised Wikipedia for what the editors characterized as editorially disempowering energetic opponents of climate skeptics. I haven’t seen coverage elsewhere, so I can’t even tell what actually happened. Setting aside the WSJ editors’ views, I wonder if RC scientists think that Wikipedia has acted properly, sensibly and fairly. Thanks.
Comment by Steven T. Corneliussen — 22 Oct 2010 @ 8:26 AM
I do not know where you got the idea that someone thought the midrange for climate sensitivity was 1.2! Neither my post nor David’s article made that claim. The midrange of the models is 3, but with a rather wide range, more so than just “a little.”
I fail to see how the article attempted to “reframe the center of the debate.” It was an attempt to define the group referred to as “skeptics” as somewhere between the deniers and the alarmists. The idea that anyone who thinks the climate sensitivity is less than 3 is a denier is akin to calling everyone who is not a liberal democrat a “teapartier.” There are many staunch believers in global warming who claim that the climate sensitivity is between 2.3 and 3. But you are free to call such people “deniers.”
btw, how does climate sensitivity relate in any way to the ice ages?
[Response: The temperature of the planet at equilibrium is expected to be a function of the drivers of climate change (CO2, sure, but also other GHGs, ice sheet extent, solar input, aerosols etc.). So at any past equilibrium (such as the last glacial maximum) the drivers and temperature are related by the climate sensitivity – see Köhler et al. (2010) for an up-to-date calculation. The bottom line is that it is impossible to explain how cold the ice ages were if sensitivity is small. – gavin]
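For readers who want the arithmetic behind that response, here is a back-of-envelope version with round placeholder numbers of the sort discussed in the LGM literature (these are illustrative values, not the Köhler et al. results):

```python
# Infer climate sensitivity from the Last Glacial Maximum (LGM).
# The forcing and cooling values below are rough, illustrative round
# numbers, not results from any specific study.
F_LGM = -8.0      # W m^-2: ice sheets, lower GHGs, dust (illustrative total)
DT_LGM = -5.5     # K: global-mean cooling at the LGM (illustrative)
F_2XCO2 = 3.7     # W m^-2: canonical forcing for a CO2 doubling

# Sensitivity per doubling = (temperature change per unit forcing) * doubling forcing
sensitivity = (DT_LGM / F_LGM) * F_2XCO2
print(round(sensitivity, 1))
```

Even with generous wiggle room on both inputs, the answer lands in the 2–3 K per doubling range; getting an ice age out of a sensitivity near 1 K would require forcings far larger than anything identified.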
My wife and I have spent the last 25 years converting (actually, letting it convert itself) our 10 acres of open field to native deciduous forest. I’m just curious about how much total carbon we have “captured” to this point and how much our little forest captures each year. Can anyone point me to some links that will help me do these calculations? Thanks.
[Response: Jeff, email me some more details and I’ll see what I can do.–Jim]
I’ve seen some of the future-climate predictions showing some areas around the world will be getting drier, while others may be getting more rain fall. I’m wondering if anything can be said about average wind speed. With more internal energy in the environment, due to higher temperatures, is it possible that average wind speed around the world, or in a given area, may be going up as well?
This has a practical implication for those of us who live in a relatively low-wind area. The economics of putting up a windmill in my area of Ohio are borderline, but still cheaper than solar panels (at this time). If I thought average wind speed were going to be a little higher in the future (it wouldn’t take much), it might be enough for me to do it now.
When you consider the reality that’s staring us in the face – 3 degrees C, not 1 degree – and the consequences we are already experiencing from the 1 degree we already have, then yeah, I think we’re looking at a dyed-in-the-wool climate disaster. That kind of warming will have a huge impact on the polar ice mass (both, not just the Arctic), further rise of sea levels, rainfall pattern effects, etc. Now add in the ocean acidification and the effect it’s having on the shell calcification of critters that are at the base of the food chain, and 100 other things you can find on this site and elsewhere. Add in a healthy dose of human nature and consider the things that we’ve fought wars over during our entire written history – land, resources, and food – and you have the potential for some catastrophic effects on society. That’s not even going into the long-term effects on the millions of other species with which we share the planet.
We don’t “know” this, we can only make our best estimates, and try to deal with it. As far as taking action to curb the progress of global warming, I think we can, I just don’t think we will. But, that’s only my opinion, of course, and hopefully I am wrong.
So as I understand it the AMO, PDO, ENSO are all just ways that the ocean re-distributes its heat.
I would rephrase that to say that they are patterns of the periodic redistribution of heat in the ocean, not actual ways to redistribute it. They aren’t mechanisms, they’re temperature observations, or rather, repeating patterns detected in those observations.
And that’s the first two mistakes in any effort to attribute warming to these measurements.
First, without a mechanism, without understanding what’s going on, all one is doing is correlation without causation. I could just as easily argue that warming is caused by the sky being blue, because the sky has been blue since I can remember, and the world has been warming in that same time frame. [And as a proper skeptic, I have no evidence that the sky was actually blue before I was born, so you can’t fool me with tree ring proxy studies which falsely demonstrate the color of the sky prior to 1960.]
Second, as their names imply, they are oscillations. They have an up phase and a down phase. An oscillation in and of itself is not going to get you anywhere. It would be like getting on the swing in the backyard as a way to get to work and beat the traffic.
I am glad to see that you are familiar with some of the great literature about the 30s. I am not sure how UCAR came up with their future drought index, but it appears to contradict the idea that water vapor and precipitation will increase due to warming. According to the graph, it looks like >90% of the planet will experience drought conditions by 2100. This seems rather implausible, especially considering that warmer times on this planet have consistently led to wetter conditions. Personally, I would not put much faith in these types of predictions. Also, to what weather disasters are you referring that cost the U.S. tens of thousands of lives? Are you referring to the Russian heat wave, Pakistani floods, and South American freeze? The 2.5 million displaced U.S. citizens amounted to about 20% of the population of the affected states (Texas north to the Dakotas). Many studies have not shown an increase in droughts during the recent warming period, but rather that drought conditions are affected largely by ENSO conditions. This may tie into the posts about PDO and AMO.
I agree that Warren Meyer was misleading with his assertion that CO2 only rose 0.01%. While accurate in that the atmospheric concentration increased from roughly 0.03 to 0.04% (a 0.01% increase), many uninformed readers would assume that he meant a percentage increase.
[Response: A little bit of learning goes a long way here, and the IPCC report is your friend as far as that is concerned. If you read the relevant section of the AR4 report (chapter 10.3), you will find that projected increases in continental drought (i.e. decreases in soil moisture) are far more widespread than changes in precipitation alone might suggest (see figure 10.12 of that section). This is primarily because warmer soil and vegetation evaporate/transpire greater amounts of moisture into the overlying atmosphere, leading to drying. Remember, drought isn’t purely a function of the flux of water from the atmosphere to land (i.e. precipitation) but the difference between that flux, and the flux of water from the land into the atmosphere. That imbalance can easily become negative (i.e. drought) due to warming alone, irrespective of changes in precipitation. -mike]
No, a projected increase in atmospheric water vapour and precipitation does not contradict a projected increase in droughts. Here’s the start of the paper, which is freely downloadable (follow the links from Kevin’s comment):
“Drought is a recurring extreme climate event over land characterized by below-normal precipitation over a period of months to years. Drought is a temporary dry period, in contrast to the permanent aridity in arid areas. Drought occurs over most parts of the world, even in wet and humid regions. This is because drought is defined as a dry spell relative to its local normal condition.”
Increased temperatures will increase the “atmospheric demand” for water vapour, increasing the amount of evaporation (this will be particularly marked over land, as land will heat up more than the ocean). This increase could outweigh an increase in precipitation. Moreover, despite what you say about earth history, the period of rapid warming since 1950 appears in fact to have coincided with a reduction in precipitation over large parts of the globe’s land surface.
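The point about atmospheric demand can be sketched in a few lines. This is a toy surface water balance with made-up numbers (the baseline fluxes and the 3%/K and 7%/K scalings below are purely illustrative, not observations or model output): precipitation rises with warming, yet P − E still flips from surplus to deficit.

```python
# Toy precipitation-minus-evaporation balance. All numbers are illustrative
# placeholders, not observations or model output.
P0 = 1000.0   # mm/yr, hypothetical baseline precipitation
E0 = 920.0    # mm/yr, hypothetical baseline evaporation

def surface_water_balance(warming_K):
    """P - E for a given warming, in mm/yr (positive = surplus)."""
    P = P0 * (1 + 0.03 * warming_K)   # precipitation up ~3% per K (assumed)
    E = E0 * (1 + 0.07 * warming_K)   # evaporative demand up ~7% per K (assumed)
    return P - E

print(surface_water_balance(0))   # surplus at baseline
print(surface_water_balance(3))   # deficit despite more rain
```

Even though precipitation increases ~9% in the 3 K case, the faster growth in evaporative demand turns the balance negative, which is the sense in which “more rain” and “more drought” can coexist.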
I am currently teaching an Honors-level class “Global Warming: Myths and Facts”. The students are an impressive group and apparently did not know the topic or title of the course until they went to buy their books.
Some were EXTREMELY SKEPTICAL the first day of class. Once they delved into the literature, their skepticism shifted from the science to the deniosphere.
Together, we are reading 4 books…
Kolbert’s “Field Notes from a Catastrophe”
Archer’s “The Long Thaw”
Hoggan’s “Climate Cover-Up”
Dyer’s “Climate Wars”
(I figured that this would give them 4 very different looks at the topic)
…and they each present an article every week from a peer-reviewed journal related to the course. I told them that I would give anyone an instant A if they could find a peer-reviewed article that refuted the science of ACC (anthropogenic climate change) that hadn’t itself also been refuted. I also told them not to believe a word that I tell them and to look in the literature for themselves. It’s working like a charm.
They’re each also doing their own literature reviews on a variety of topics. Boy are they productive!
We haven’t started “Climate Wars” yet, but I just wanted to let the RC-heros know that Archer’s book is so-far the hands-down favorite and I plan to use it again in several courses in the future. Perhaps I can write a review some time.
Those who haven’t read “The Long Thaw” should do so at their earliest convenience. It lays out the science in a very digestible and clear way.
Gavin and Nick,
I fully understand the concept of droughts and their predictions for the 21st century. I was countering the UCAR study which predicted ~3/4 of the globe would experience severe drought conditions in 2090-2099. Also, the rapid warming since 1980 (not 1950) has not coincided with reduced precipitation over large parts of the globe. The recent (~30 years) droughts pale in comparison with those earlier in the century and in prior centuries. As I mentioned earlier, scientific studies have found that drought conditions are more closely tied to ENSO measurements than atmospheric temperatures. If global warming were to affect ENSO conditions, then I will concede my point. Also, I do not need your condescending attitude about learning, Gavin.
1st, TimTheToolMan says:
“If the models were to be shown to be specifically deficient in some area and need significant rework”
If the models are shown to be specifically deficient in some area and need significant rework, I’m sure Gavin will put a post on RealClimate about it. In the meantime, we have to use what we have and make rational decisions.
2nd, there are elections in the US, and citizens should do their part by voting, putting up signs, commenting in the media on the good and bad guys, dragging their neighbors to vote…. The world needs you!
To Magnus (#25) – The claim that excess CO2 remains in the atmosphere for only a few years is one of those enduring myths that persist in the blogosphere despite the very substantial evidence that it is false. Essenhigh’s paper would never have made it into a reputable climatology, geophysics, or general science journal.
It’s important to distinguish between CO2 “residence time” (RT) and the time required for elevated CO2 levels to decline toward baseline levels. Individual CO2 molecules (including C14 tracers) exchange rapidly between the atmosphere and oceanic or terrestrial reservoirs, with an atmospheric RT of about 5 years, as Essenhigh states, but the exit of CO2 molecules from the atmosphere is almost completely balanced over such a short interval by entrance into the atmosphere from one of the other sources. With anthropogenic emissions, the imbalance is even in the other direction – i.e., net increase into the atmosphere.
If one looks at the actual intervals needed for an atmospheric excess to subside, their duration is orders of magnitude longer, and can’t be characterized by a single “half-life” because the decay is composed of multiple exponential or quasi-exponential functions. On a short time-scale (a decade or two), CO2 equilibrates with the upper mixed layer of the ocean, in the sense that an equilibrium is approached involving the dissolving of CO2 gas as a function of its partial pressure, conversion to H2CO3, and equilibration between bicarbonate (HCO3-), carbonate (CO3–), and H+ ions. However, equilibration over the entire depths of the ocean requires centuries. This is not the end of the story. The above equilibria respond to added CO2 via a reduction in carbonate (to take up the extra hydrogen ions), and this buffering capacity of carbonate is restored through the dissolution of CaCO3 in marine sediments to provide the needed carbonate ions. This process consumes millennia. The final stage – restoration of CaCO3 stores (the form that completes the carbon cycle via subduction zones into the Earth’s interior) – requires the “weathering” of terrestrial silicate rocks by H2CO3 in rainwater, so that the resulting bicarbonate can flow to the sea. This process can be simulated in laboratory experiments and assessed via changes in rocks over millions of years of geologic time. Its decay curve involves hundreds of thousands of years, leading to the assertion (for this part of the cycle) that, essentially, “CO2 is forever”.
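That multi-timescale decay can be sketched as a sum of exponentials. The fractions and timescales below are rough illustrative values chosen only to show the shape of the curve (real carbon-cycle impulse-response functions, e.g. Bern-model fits, differ in detail):

```python
import math

# Illustrative impulse response for an atmospheric CO2 pulse: a sum of
# exponentials with very different timescales (ocean mixed layer, deep
# ocean, CaCO3 dissolution, silicate weathering). The coefficients are
# made-up round numbers for illustration, not a fitted carbon-cycle model.
fractions = [0.2, 0.3, 0.3, 0.2]            # share of the pulse per process
timescales = [10.0, 400.0, 5000.0, 2e5]     # e-folding times in years

def airborne_fraction(t_years):
    """Fraction of an initial CO2 pulse still airborne after t years."""
    return sum(f * math.exp(-t_years / tau)
               for f, tau in zip(fractions, timescales))

for t in (10, 100, 1000, 10_000, 100_000):
    print(t, round(airborne_fraction(t), 3))
```

The key behavior: the curve drops quickly at first (the short timescales) but a substantial tail persists for tens of millennia, which is why no single “residence time” or “half-life” describes it.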
A useful link is to David Archer’s discussion at CO2 Lifetime
David Archer’s ‘Global Warming: Understanding the Forecast’ is also worth a look as is ‘Climate Change: A Multidisciplinary Approach’ by William James Burroughs. I hope to soon see an updated edition which could cover the GCR issue in more detail although the current edition does point out flaws in Svensmark’s assessment.
“I do not know where you got the idea that someone thought the midrange for climate sensitivity was 1.2! Neither my post nor David’s article made that claim.”
And yet, from the article (really, the crux of the article…), in the sixth paragraph:
But one degree due to the all the CO2 emissions we might see over the next century is hardly a catastrophe.
And then in the very next paragraph (emphasis mine):
This second theory is the source of most of the predicted warming – not greenhouse gas theory per se but the notion that the Earth’s climate (unlike nearly every other natural system) is dominated by positive feedbacks. This is the main proposition that skeptics doubt, and it is by far the weakest part of the alarmist case. One can argue whether the one degree of warming from CO2 is “settled science” (I think that is a crazy term to apply to any science this young), but the three, five, eight degrees from feedback are not at all settled. In fact, they are not even very well supported.
Of course, the eight is a completely made up figure, and the five is near the high end of a range of several dozen projections, so he’s purposely exaggerating things to make his case seem more plausible. He should have just said “3.”
But there’s no question that the article says “one” and ridicules anything higher than that, and you are defending the article, whether or not you actually understood or remembered what was written.
btw, how does climate sensitivity relate in any way to the ice ages?
Gavin didn’t answer this in any detail, so very briefly, the known forcings which are believed to initiate the ice ages are themselves relatively small, or at least, insufficient to cause the observed temperature changes. This means that there must be positive feedbacks in the system to amplify those forcings.
Basically, the argument that climate sensitivity is low and that the climate of the earth is very stable requires that the history of the planet have a steady, unchanging climate… something that is clearly not the case. You can’t just say “well, yeah, but that was then, the thing is, the climate is really, really stable now.”
Dan H. said: “there is a difference between those who “deny” global warming, and those who refer to themselves as “skeptics.””
Can you spot the problem? You should be able to!
The problem is that nearly all people who deny global warming will refer to themselves as sceptics (or skeptics, if they prefer American English). Yes, there are some sceptics who do not deny global warming, and there are some deniers who do not call themselves sceptics – but the overlap is enormous.
Regarding the stilling paper fresh out in Nature Geoscience (Vautard et al., http://www.nature.com/ngeo/journal/vaop/ncurrent/full/ngeo979.html), I was wondering if anyone would care to comment on the implications for water vapour in the lower atmosphere. I’m thinking lower wind speed over terrestrial ecosystems -> less evapotranspiration -> less water in the air above the canopy -> implications for the radiative forcing? As it was suggested that the reduction in wind speed could partly be attributed to an increased surface roughness following the increased net primary production from CO2 fertilization, I was wondering whether mechanisms like this are included in the GCMs at all, or if this is on a different scale. In case they are included, are the models predicting reduced wind speed over land? And finally, connected to my thoughts on water vapour above the canopy: would a possible downward revision of modelled net primary production, following the inclusion of a nitrogen (and phosphorus) constraint as suggested by Wang and Houlton last year (http://www.agu.org/journals/ABS/2009/2009GL041009.shtml), have any significance at all for the surface roughness and the amount of water vapour above the canopy, as modelled from evapotranspiration and depending on wind speed?
“Try running a multivariate regression between Temperature (Y) with CO2 and the AMO (x-variables).
The predicted values versus the actual temperature values are pretty shocking and obviously flawed somewhere…”
The principal flaw is in the approach above. Temperature is the accumulation of heat from the radiative forcing of CO2 and other things. In other words, temperature is the integral of the forcing, not a linear function of the forcings themselves. Therefore a simple regression is not going to work, even setting aside other issues, like the omission of confounding forcings from aerosols.
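A minimal sketch of this point, using a one-box energy-balance model with round illustrative parameter values (not tuned to observations): temperature relaxes toward the forcing/feedback balance through the ocean’s heat capacity, so it lags and integrates the forcing rather than tracking it instantaneously.

```python
# One-box energy balance: dT/dt = (F - lambda*T) / C, integrated with
# simple Euler steps. Parameter values are round illustrative numbers.
LAMBDA = 1.2   # W m^-2 K^-1, feedback parameter (illustrative)
C = 8.0        # W yr m^-2 K^-1, effective ocean heat capacity (illustrative)

def one_box(forcing_series, dt=1.0):
    """Temperature response (K) to a yearly forcing series (W m^-2)."""
    T, out = 0.0, []
    for F in forcing_series:
        T += dt * (F - LAMBDA * T) / C
        out.append(T)
    return out

# Step forcing: 3.7 W/m^2 switched on at year 0 (roughly a CO2 doubling).
F = [3.7] * 100
T = one_box(F)
# T creeps toward the equilibrium value 3.7/LAMBDA over decades; a
# regression of T on the instantaneous F would see a constant forcing
# paired with a steadily rising temperature.
```

This is why regressing temperature directly against forcings (or indices like the AMO) misattributes the lagged, accumulated response.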
It’s frustrating to refer back to a previous comment by number (e.g. Magnus at #25) only to find that new comments have been interpolated and the original number has changed (currently Magnus is at #27). I’m not sure why comments need to be inserted out of order, but perhaps they should be identified separately (e.g., 20A, 20B, etc., or simply numbered according the order in which they were actually posted).
Probably a bit OT even for an “unforced variations” thread, but what the heck. Here’s a little anecdote that illustrates the utter insanity that permeates the global-warming denial movement.
Last month, I went out to Denver (from Southern California) to visit relatives there. While I was visiting family in Denver, a heat-wave struck Southern California and downtown LA experienced an all-time high of 113F.
A couple of days later, when my mom was waiting in a supermarket checkout line, the man behind her mentioned how nice and warm it had been in Denver (mid-to-high 80s at the end of September). My mom replied that her son from Southern California was in town and was fortunate to miss the 110+ F heat-wave there.
Well, the man just went off on her — shouting and ranting about the “global-warming fraud”, Al Gore, the UN, blah, blah, blah. He continued to rant and yell at her as she pushed her grocery cart out of the market! It was just nuts.
And that’s why Ken Buck will likely become Colorado’s next junior senator.
What about the weak early solar paradox, or whatever it’s called? Lindzen mentioned this. Apparently, the sun was something like 25% weaker, but the earth was not frozen, and therefore climate sensitivity is weak. What’s the story on that?
…scientific studies have found that drought conditions are more closely tied to ENSO measurements than atmospheric temperatures. If global warming were to affect ENSO conditions, then I will concede my point.
I think you are guilty of a rather bizarre over-application of Occam’s Razor in expecting a single system (ENSO) to be a totally dominant factor in water distribution around the planet. It’s not a very skeptical, or open-minded, approach.
I’m certainly no expert, but Hadley Cells offer an excellent example. In a nutshell (and please, I’ll eagerly accept correction from my betters) this is a constant and rather stable circulation of air rising in the (moist) tropics (near the equator), moving poleward, and descending further from the equator.
What this does is to move moisture from the tropics to the temperate zones, and at the same time to deprive the subtropics of moisture. This is why most of the world’s deserts and otherwise arid regions exist in those latitudes.
Climate change is expected to expand the size of the Hadley Cells, and I think the mechanics of how that might happen are fairly intuitive (if not, see the links at the end). It can clearly be understood that were this to occur, the deserts of the world (i.e. the arid regions in the hearts of the cells) would expand.
A quick google search finds a wealth of information of all sorts, from Wikipedia to research papers:
The Forbes article did not say that the total climate sensitivity was 1, but rather that 1 degree of warming can be directly attributed to the CO2 molecules themselves. This value is usually quoted as between 1.0 and 1.2, and referred to as the no-feedback response (see Gavin’s other article on feedbacks for a further explanation). Warren Meyer then goes on to emphasize that the difference between the skeptics and alarmists is the warming attributed to feedbacks, and mentions that these values are less well supported. I am defending the article only in the sense that Warren Meyer is highlighting the differences between skeptics and alarmists. You may choose to draw the line between the two elsewhere, but I feel that he has made a fairly good portrayal of the differences. Btw, I have seen claims of up to 10C/doubling of CO2, so his figures, while seemingly tossed about randomly, are not completely made up. You may want to reconsider your statement about understanding or remembering.
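For what it’s worth, the ~1.0–1.2 K no-feedback number can be recovered from a simple blackbody linearization. A sketch, assuming the canonical 3.7 W/m² doubling forcing and a 255 K effective emission temperature:

```python
# Linearize the Stefan-Boltzmann law, j = sigma*T^4, around Earth's
# effective emission temperature; the derivative 4*sigma*T^3 is the
# "Planck response" in W m^-2 per K of warming.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T_EFF = 255.0     # effective emission temperature of Earth, K
F_2XCO2 = 3.7     # canonical forcing for doubled CO2, W m^-2

planck_response = 4 * SIGMA * T_EFF**3      # roughly 3.8 W m^-2 K^-1
dT_no_feedback = F_2XCO2 / planck_response  # warming needed to re-balance
print(round(dT_no_feedback, 2))             # close to 1 K, before feedbacks
```

Everything beyond this ~1 K (water vapour, ice-albedo, clouds, lapse rate) is the feedback question the thread is arguing about.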
Just because the “known” forcings of ice ages are relatively small does not necessitate the need for large positive feedbacks. How many “unknown” forcings could be responsible? I agree that the Earth’s climate has never been “stable.” The relative stability during this recent interglacial has been interspersed with many warm and cool, wet and dry periods. As one example, research has shown North Africa to have been both significantly wetter and drier than at present since the end of the last ice age.
“Vegetation plays an unexpectedly large role in cleansing the atmosphere, a new study finds.
“Our results show that plants can actually adjust their metabolism and increase their uptake of atmospheric chemicals as a response to various types of stress,” says Chhandak Basu of the University of Northern Colorado, a co-author.
“This complex metabolic process within plants has the side effect of cleansing our atmosphere.”
“Once they understood the extent to which plants absorb oVOCs, the research team fed the information into a computer model that simulates chemicals in the atmosphere worldwide.”
“The results indicated that, on a global level, plants are taking in 36 percent more oVOCs than had previously been accounted for in studies of atmospheric chemistry.
Additionally, since plants are directly removing the oVOCs, fewer of the compounds are evolving into aerosols.”
“This really transforms our understanding of some fundamental processes taking place in our atmosphere,” Karl says.”
What do you suppose it does to the plants?
More needs to be said about water and trees on a massive scale, and the need to balance long standing environmental positions with relatively new challenges from CO2. I also want to discuss more about salt water, which I tend to think fails to meet the cost requirements for a truly large scale project, and it also brings limits on what can be grown.
However, respecting Gavin’s attempts to keep order in discussions, I hope others would move the forest discussion to the latest open thread, ‘unforced variations 3′.
I got into the water and trees topic as a thin thread, since it was what I called a feedback with humans as part of the loop.
I also jumped into water and trees to try to suggest that it would be better to not get too upset over the Cuccinelli thing; and instead get on with discussing real possible solutions.
These seemed to be timely discussions, and I appreciate RC attempts to provide a more open thread post. That was a good way to handle things.
It most certainly did, and I quoted the passages. How stupid do you think people are?
I am defending the article only in the sense that Warren Meyer is highlighting the differences between skeptics and alarmists.
And there is, laughably, no real distinction, as evidenced by the content of the article, which claims there is a difference, and yet goes on to itemize almost every ridiculous denial canard in history. Your inability to see this is part of your own denial. Don’t expect others to buy into it.
I feel that he has made a fairly good portrayal of the differences
You and all of the deniers, and yet the rest of us find it to be utterly ridiculous.
Btw, I have seen claims of up to 10C/doubling of CO2, so his figures, while seemingly tossed about randomly, are not completely made up. You may want to reconsider your statement about understanding or remembering.
I’m sure that you can find someone somewhere who said 100C. First of all, you and he are choosing end points in ranges. Secondly, you can find someone somewhere who said anything. So what? It’s also dependent on how much CO2. Are you talking about a scenario where we burn every ounce of fossil fuels available in the earth?
Don’t throw around “I heard…” That’s nonsense. Provide a citation to such a statement made by a climate scientist, as their median prediction, and show that it was well received by the climate science community. Anything less than that is you just making stuff up to suit the needs of your current position in an argument. It’s a waste of time.
How many “unknown” forcing could be responsible
Another constant denial tactic. “We don’t know” or “we can’t know” or “but what if it’s the Flying Spaghetti Monster that’s behind it all, how can you be sure?”
The information has been presented to you. Study it and learn instead of just making things up and hoping something sticks.
Prof. Latif’s report is not a lie. His report in 2008 and presentation in 2009 point to cooling between now and sometime in the 2030s, similar to that observed from the 1950s to about 1980. I am not talking about some global chill, but a slight decrease in global temperatures. He is not the only one; several others are using AMO/PDO data and arriving at similar results. Your link also mentions that Prof. Latif believes that global warming is occurring, but that up to half of the observed warming was due to these oceanic oscillations. This is a belief shared by many oceanographers, but not necessarily by other climate scientists.
I know this is not a blog for the masses, but could I humbly make a request for a *just slightly* more approachable editorial style?
An example: take a look at the “Solar Spectral Stumper” article below. The title is a teaser: “there’s a confusing thing here.” The first sentence is also: “this is puzzling.” Along with scepticism: “big if.” Then identification of “the case in question.” Then a look at what they did that does not reveal what they found that was so puzzling. Then a description of what they did. And so on. It isn’t until the third paragraph that the “stumper” is explained, and by then it doesn’t make any sense because I’ve given up having the particular background to make heads or tails of it. But I suspect heads and tails both could have been made for me by the writer.
Maybe try to come out and explain in lay-person’s language in the first sentence what is so puzzling and why it may be so important.
…because I thought what you were saying was that there was a distinction between deniers and skeptics, which there is not… they’re the same animal.
As far as a distinction between deniers and people who intelligently learn and understand the information, and use rational reasoning to arrive at logical conclusions, rather than latching onto any piece of rotten, half-floating driftwood they can to avoid being forced to face an unpleasant reality… well, that’s just silly. No one needs a lesson in that. It’s pretty obvious to everyone.
re: epistemology of climate modeling, that looks like an interesting collection with something for everyone.
I am of course curious whether you recognize Andrew Lacis and yourself as doing “postmodern, playful, superficial, extreme” science, as per Sundberg’s sociological take on Snowball Earth simulations.
[T]oday’s ensembles are not designed to sample representational uncertainty in a thorough or strategic way. Today’s multi-model ensembles are ensembles of opportunity, while perturbed-physics ensembles take no account of structural uncertainty. And in both multi-model and perturbed-physics studies, initial condition uncertainty is often given only cursory treatment.
[Response: Wendy generally knows what she is talking about, though I would quibble that ‘initial condition’ uncertainty isn’t assessed properly. That is done better than perturbed physics uncertainty for instance. Her point that the ensembles are ensembles of opportunity, not design is completely correct. – gavin]
No one said his paper was a lie. What they said was that the report about the contents of his paper, and what he said in his presentation, was a lie. The report actually says, in the extract:
Using this method, and by considering both internal natural climate variations and projected future anthropogenic forcing, we make the following forecast: over the next decade, the current Atlantic meridional overturning circulation will weaken to its long-term mean; moreover, North Atlantic SST and European and North American surface temperatures will cool slightly, whereas tropical Pacific SST will remain almost unchanged. Our results suggest that global surface temperature may not increase over the next decade, as natural climate variations in the North Atlantic and tropical Pacific temporarily offset the projected anthropogenic warming.
How can you repeatedly get all of this stuff so very wrong, and not realize that you are listening to and believing the wrong sources, and that you need to be more skeptical, open your eyes, and learn a whole lot more without trusting the Watts and the Novas and the Meyers of the world?
Re the reply by “Rocco” in 57 to my Wikipedia-related query in 43: Thanks for the link. But isn’t that a discussion by William Connolley himself, the former RC scientist who is at the center of the Wikipedia squabble in question? And isn’t it written only for people who have followed the controversy closely, and who know the background, context and lingo? Let me re-phrase my original question: The WSJ opinion editors have been crowing about something involving Wikipedia and a former RC scientist. When the WSJ editors crow, I get suspicious. In this case, it seems important, if you agree that Wikipedia’s handling of science is important. So I’m hoping for two things:
* for someone (like a New York Times reporter, or for that matter a WSJ reporter, since they don’t work for the WSJ opinion editors) to explain in simple terms what it’s all about, and
* for RC scientists to say what RC’s take is.
[Response: Wikipedia is a world unto itself, and the decisions it makes in administering itself are frequently very difficult to make sense of. As far as I can perceive, there was a lot of infighting among editors on a set of climate change articles (usually these arise because the contrarians keep trying to add unsourced and/or irrelevant material), and normal modes of resolution did not appear to work, and so they have banned everyone involved for a short period. A ‘scorched earth’ policy if you like. This has affected very sensible people (like William) and nutters alike. I don’t think this is a very sensible way to run things, but it’s not my encyclopedia. If this means anything, it means that more informed people need to get involved in Wikipedia and learn its ways so that the standards can be maintained. What the WSJ would like is for the science descriptions to be diluted with standard skeptic tripe so that people will not be easily able to check that the WSJ opinion writers are making things up (which they do with regularity). – gavin]
Comment by Steven T. Corneliussen — 22 Oct 2010 @ 2:26 PM
A new citizen science climate initiative worth a look:
Just to clarify my question, I don’t think there is any prediction from AGW concerning tropical cyclones: water will be warmer, air will be warmer, there will be more total moisture in the air, and the air-land difference will be greater creating greater winds. If total rainfall is less over land, then it may also be less over water. I have not yet found a clear exposition of what, for example, GCMs may predict.
Regarding the AMO and trends, one point of the paper is that the contribution of internal multidecadal variability to trends in globally averaged SST depends on the length of the period used to compute the trend. If the period is short, say 10 years, internal variability can dominate the forced trend. However, if the averaging period is long enough, say 60 years, the contribution of internal variability to the trend is small. This means that internal variability cannot account for the century-long warming trend observed in spatially averaged SST.
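The averaging-length point can be illustrated with a toy simulation, independent of any particular paper. All the numbers here — a 0.015 °C/yr forced rate, an AR(1) coefficient of 0.6, 0.1 °C noise — are my illustrative assumptions, not values from the study being discussed:

```python
import random
import statistics

random.seed(0)

def fitted_slope(ys):
    """Ordinary least-squares slope of ys against 0..n-1."""
    n = len(ys)
    t_mean = (n - 1) / 2
    y_mean = sum(ys) / n
    cov = sum((i - t_mean) * (y - y_mean) for i, y in enumerate(ys))
    var = sum((i - t_mean) ** 2 for i in range(n))
    return cov / var

def trend_spread(n_years, forced_rate=0.015, n_sims=1000):
    """Std. dev. of fitted trends (deg C/yr) for a forced warming plus
    AR(1) 'internal variability', over windows of n_years."""
    slopes = []
    for _ in range(n_sims):
        noise, series = 0.0, []
        for i in range(n_years):
            # red noise standing in for internal multidecadal variability
            noise = 0.6 * noise + random.gauss(0, 0.1)
            series.append(forced_rate * i + noise)
        slopes.append(fitted_slope(series))
    return statistics.stdev(slopes)

print(trend_spread(10))   # spread comparable to the forced rate itself
print(trend_spread(60))   # spread far smaller than the forced rate
```

Over 10-year windows the trend estimate is dominated by the noise; over 60-year windows the forced trend stands well clear of it, which is the paper’s qualitative point.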
Septic Matthew asks:”Just to clarify my question, I don’t think there is any prediction from AGW concerning tropical cyclones: water will be warmer, air will be warmer, there will be more total moisture in the air, and the air-land difference will be greater creating greater winds. If total rainfall is less over land, then it may also be less over water. I have not yet found a clear exposition of what, for example, GCMs may predict.”
I don’t think GCMs predict hurricanes well because of their local nature, but warmer waters are predicted, and hence more major hurricanes (category 3-5). Not more hurricanes. The Accumulated Cyclone Energy is so dominated by noise that I doubt that there is a way to derive an empirical trend with much confidence.
[edit – please keep comments about other posters to a minimum. Facts and cites are far more effective.]
After all Gavin has written about climate sensitivity, I fail to see how so many people keep getting it wrong. The direct, black-body response due to a doubling of CO2 has been reported as 1.0–1.2°C. The Forbes article to which David linked said 1. This is not the same as the climate sensitivity value with feedbacks included, which has a wide range of values depending on individual responses. The IPCC truncated the modelled climate sensitivity values at 10°C for the AR4 report, granted there were very few above 8°C (section 10.5).
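The distinction between the two numbers is often summarized with the feedback-factor relation ΔT = ΔT₀ / (1 − f), where ΔT₀ is the no-feedback response and f is the net feedback factor. A minimal sketch, using commonly quoted illustrative values (the specific f = 0.6 is my assumption chosen to reproduce a ~3°C sensitivity, not a figure from the IPCC report):

```python
def sensitivity_with_feedback(dT0, f):
    """Equilibrium warming per CO2 doubling, given the no-feedback
    warming dT0 (deg C) and a net feedback factor f (f < 1)."""
    if f >= 1:
        raise ValueError("f >= 1 implies a runaway response")
    return dT0 / (1 - f)

print(sensitivity_with_feedback(1.2, 0.0))  # no feedbacks: ~1.2 C
print(sensitivity_with_feedback(1.2, 0.6))  # net positive feedbacks: ~3 C
```

The point being argued in the thread is exactly this: quoting the 1–1.2°C no-feedback number as if it were the full sensitivity silently sets f = 0.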
You can call deniers and skeptics the same thing if you like. [edit – OT]
In your second link, you seem to think I got Prof. Latif’s conclusions wrong. I suggest you re-read what he said, wrote, and presented. He is not talking about the start of an ice age or anything, just a slight cooling for the next decade or two, before warming resumes.
If I believe someone has posted an incorrect statement, I will research it thoroughly and re-read the post to ensure that I have not misunderstood it or am in error. I suggest you do the same; otherwise you just appear silly and ignorant.
@425 JB: Dr. Al would almost certainly say you don’t understand exponents if you think cars are sustainable. They are not. Your statement they can be indicates you are not thinking whole system, nor on the appropriate scale.
First, you have to consider full life cycle for each and every component of every car and every component of every system it takes to get that car from ores to the driveway, then it’s full lifetime.
Second, scale. 7 billion people, headed for 9 billion, on a planet already overused by 50%, and all of them striving to live at the same level as the highest-consuming nation, the US. Have you read anything from Heinberg? http://www.youtube.com/watch?v=ybRz91eimTg
@429 Jim B.: No need to till. Tilling also kills off soil biota and the billions of interactions going on that are actually responsible for growing the seeds you plant. Just say no, and save yourself tons of time and energy.
Also, no need for a tractor at all in a world of small scales, but maybe sharing one in a community for fallen trees, etc., might be good. If you can make it sustainable.
@434 Ray: Actually, you don’t necessarily need anyone to water trees, you need good design that brings the water to the trees and/or stores what water falls in the soil. Trees are their own great funnels, so you typically only need to help out a bit with good design, including how you plant. The design will depend a great deal on the individual site, but swales, key line designs, etc., will keep water where it falls or move it where you need it. Look at some of the work done with earthworks for managing water.
@435 Jim’s comment: You raise a good point about geo-engineering, but some scale of land/soil/water management can be used to make the deserts sustainable for a given N population of not very large size. You also make me realize we should do some backcasting and inductive/deductive work and try to identify some of those “destructive” changes we should not try to stop. An interesting twist I’d not thought of before.
@447 RodB: You’ve been pretty well answered by the others, but let me say a couple small, possibly important things:
1. You are correct that organic is not necessarily environmentally friendly. You will see me speak only of regenerative agriculture, evidenced somewhat by the Rodale study I posted and, I believe, Jim quoted from. The US gov definition of organic is a joke.
2. Regenerative ag, as stated above, sequesters carbon in the soil. That coupled with the loss of soil carbon in non-regenerative ag, it’s a huge swing. If ag is *only* organic, it’s still an improvement over chem ag.
3. Soil with 1% organic matter will hold something like 37 liters or gallons per square yard; 5% increases that to something like 137. Think on what that means for water conservation and how little watering you’d need to do with a combination of healthy soil and good mulching.
4. A regenerative garden will be very biologically diverse because all elements interact, all elements are supported by at least two other elements, all elements have at least two functions and co-planting is a key principle. Our small garden (perhaps 30 x 40) had over 40 different varieties and/or species in it.
Those of you who are basing your assumptions on what you have seen over the last 7–10 decades have some re-learning to do. The marriage of old wisdom to new research and technology has shown us how to grow food with limited effort, zero chem inputs, and greater diversity and resilience, and it also helps point the way to solutions to our climate and energy problems, including how we arrange ourselves on this planet.
[moved to the open thread–everyone continue this discussion there please. Thanks. Jim]
Look, it’s no shame to be misled by sloppy press reports; you don’t need to defend them.
Latif did not “postulat[e] that the switch to negative AMO indices will lead to 2-3 decades of cooling”, as you said in #11, in his 2009 WCC presentation that I assume you are referring to. I did my homework on this at the time. In the WCC talk, he did seem to “point to cooling between now and sometimes in the 2030s” — as a scenario to illustrate the kind of surprises decadal fluctuations around a warming trend might throw up. A rather unlikely scenario, I think, but he was arguing the case for improving our understanding of the decadal scale. If by the “report in 2008″ you mean the Keenlyside et al. letter in Nature, it did not predict cooling — just a brief warming pause ending about now and a slower warming until the 2020s than some other simulations.
Latif never said decades (this is the Internet lie that refuses to die), he said “next several years,” which I take to mean anywhere from 2 to 5 (very far from 30). And do not get your info on this from either WUWT, or news sites like The Daily Mail or Times Online. They all very deliberately misquoted him, or took his statements out of context.
The Forbes article to which David linked said 1. This is not the same as the climate sensitivity value with feedbacks included…
You are exactly right in saying the direct effect of CO2 is 1C, and the cumulative effect of positive feedbacks, i.e. total climate sensitivity, is estimated to be around 3C.
But the question is what Meyers actually said. I’ll include his quotes once again.
But one degree due to the all the CO2 emissions we might see over the next century is hardly a catastrophe.
and more importantly (again, emphasis mine):
This second theory is the source of most of the predicted warming – not greenhouse gas theory per se but the notion that the Earth’s climate (unlike nearly every other natural system) is dominated by positive feedbacks. This is the main proposition that skeptics doubt, and it is by far the weakest part of the alarmist case. One can argue whether the one degree of warming from CO2 is “settled science” (I think that is a crazy term to apply to any science this young), but the three, five, eight degrees from feedback are not at all settled. In fact, they are not even very well supported.
He is very clearly refuting the impact of positive feedbacks on climate sensitivity. He is saying that CO2 will at worst directly cause 1C of warming, and that’s all, and therefore there is nothing to worry about.
There’s no way around it. It’s right there. You can’t make his words mean something else.
…for climate sensitivity was 1.2! Neither my post nor David’s article made that claim.
is therefore irrefutably false. Meyers did make that claim as the foundation of his entire piece, and as such that piece is complete drivel.
There are certain key words and phrases which people familiar with the blogs have come to recognize as indicative of someone not really serious about discussing the issues. One is insisting on describing scientists as being concerned with “catastrophic global warming.” Another is making sweeping categorizations between “skeptics and alarmists.” Another is taking a collection of thousands of people who have worked for decades and tossing them into one group trying to “dismiss other causes,” which is indicative of a conspiracy. Yet another is over-reliance on what one professor somewhere said, without placing that opinion in the context of the uncertainties of a very young field (decadal prediction, large uncertainties) and the differing perspectives of other publishing scientists.
I’d suggest that if you want to be taken seriously, you begin by learning how to properly frame the “debate”: what *serious* people are actually disagreeing on, what they are arguing for, etc. There is a certain standard for intelligent conversation, irrespective of the science that you feel to be correct. Then we can get to the technical details.
I’m talking about what would happen if the models/“our understanding of the atmosphere” were discovered to have a relatively major flaw, and this is quite different from acknowledging that the models go through incremental improvements.
In my opinion this possibility represents a significant risk to a lot of our science to this point. A lot of science these days depends on those models having skill.
And so I guess it comes down to what you consider “skill” to mean and in what situation a model shows skill. My opinion is that if a climate process is modelled fundamentally incorrectly, then that model actually has little to no skill for many or most questions it’s used to answer, particularly those of “projection.”
The model’s “skill” reduces to a curve fit skill.
Now I understand the usefulness of models, and using them is a “damned if you do and damned if you don’t” situation, but I think that their use has become too pervasive, with little acknowledgement of the dangers of producing a result that is unverifiable.
[Response: GCMs are not curve fits. They have successfully predicted, the cooling after Pinatubo (before it happened) (and the change in water vapour, radiation, circulation), that the satellite data sets had to be incorrect, that ice age ocean temperature estimates were wrong, that climate would warm at about the rate it has from the mid 1980s, that the ozone hole would impact southern hemisphere winds, that teleconnections from ENSO would affect rainfall around the sub-tropics, etc, etc. etc. How do any of these achievements disappear if an adjustment is made now to the models? And doesn’t it seem a little unlikely that all of the myriad examples of models correctly matching some aspect of the climate are simply coincidental – even after they have been verified? And that weather forecast models work? You are asking about hypotheticals that just have nothing to do with reality. – gavin]
Comment by David B. Benson — 22 Oct 2010 @ 5:13 PM
Something called the paleoclimatic record tells us as much, if not more, than the models do alone. Hansen tells us that the models are interesting tools, but the paleoclimatic record offers more evidence that backs up the models and bears out what Gavin is telling you.
More needs to be said about water and trees on a massive scale, and the need to balance long standing environmental positions with relatively new challenges from CO2.
I’ll have to do some re-reading, but trees should help hold water in the soils (and in the trees themselves), because they tend to funnel rainfall down to their root systems, where it can soak into the ground. The root systems help hold soil in place, and the falling leaves, if deciduous, should be building humus, thus both holding more water and providing mulch to reduce evaporation. Trees also have interesting micro-climate and wind effects.
There are, however, large differences due to types of trees, latitudes, etc. Overall, though, trees should be a negative feedback to AGW.
One of the best things about trees is that when burned they release only the CO2 they took up while growing, and we can count what we need to burn and grow to balance, making wood one of the few truly sustainable fuels we have… though probably not for 9 billion people.
Some things don’t really need to be over-analyzed, do they? Basically, more trees is good. If a bunch of those trees are fruit and nut trees built into edible forests, so much the better. If a bunch more are fast-growing trees we can coppice for fuel, building materials, what have you, even more so. And if yet more are planted to restore old growth types of forests, well, we’ll have made some kind of impact on AGW while moving a goodly distance towards a sustainably designed society.
[Response: I agree with much of what you’re saying, but be very careful with conclusions on the topic of global effects of forests on the climate system. Hopefully we can do a post on the topic because people are clearly interested. In the mean time, everyone interested should definitely read this paper by Gordon Bonan, which has already been cited 168 times in ~ 2+ years.–Jim]
The temperature of the north Pacific is critical to pepper crops in Texas and winter visitors to Florida through the following mechanism.
The west to east flow of the Arctic currents melts the western ice, creates a large high pressure in the western Arctic and generates westerlies.
Recent failure of this system has resulted in two highs being generated over Hudson Bay and the Barents Sea north of Norway. These two highs cause a blocking action to the jet stream, resulting in a loop to the jet stream extending all the way to the Gulf Coast.
Weather maps of the spring of 2010 are still around showing frigid air from the Yukon to Florida. Winter of 2008/2009 generated 22 degree nights during the flowering of the pepper crop in southern Texas. Some beautiful pictures of snow in southern Mississippi were not appreciated as much.
This process, fluctuating water temperature in the north Pacific/north Atlantic, suggests one should also look SW of the Barents Sea.
Two stationary Rossby waves caused the destruction of the Russian wheat crop via excessive heat and the flooding of the Indus Valley via abnormal condensation of the monsoon moisture. Various organizations report as many as 20 million people dispossessed by the flooding. The Russians have stopped all exports of wheat and, due to the drying of the soil, are waiting to see if winter wheat can be planted. When the decision to halt exports was announced, the price of wheat went from $5.50 to $8.60 a bushel.
Other anomalies in this area include last winter’s 8 foot snowfall across Tibet and northern China resulting in a large loss of livestock and rail closures in China.
You seem to consistently miss some crucial points, most likely because of fundamental misunderstandings of how science is done, but also because of the general non sequitur nature of your reasoning.
What if Gavin turned out to be an alien from another planet? That would not overturn our understanding of how to build a road.
Skill is the ability to be successful. The models are skillful; the physics matches up reasonably with that skill; the quantified addition of industrially produced GHGs fits the amounts measured in the atmosphere; the sensitivity is proving to be roughly correct (though it may prove too low, even at 3°C, once slow feedbacks kick in); and the effects on a planetary scale are also matching up with all of the above.
When you add that all up, the error bars get smaller. Think about the odds of so many disciplines all coming to the same general conclusion from separate analyses and data observations. The odds that they would all point in the same direction by chance are actually pretty slim.
I’d bet that if you did the statistical analysis from that perspective, the likelihood that humans are responsible for the warming would weigh in at higher than 95%.
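The intuition behind that bet can be sketched with a deliberately crude back-of-envelope calculation. The 50/50 assumption below is mine, purely for illustration — real lines of evidence are neither fully independent nor coin-flips, so this is a caricature, not an attribution analysis:

```python
def chance_all_point_to_warming(k, p=0.5):
    """Probability that k independent indicators all happen to point
    toward warming, if each did so with probability p absent any
    common cause. Illustrative assumption only."""
    return p ** k

# With ten nominally independent indicators and even odds for each,
# the chance they all line up by coincidence is about 0.1%:
print(chance_all_point_to_warming(10))
```

The geometric shrinkage is the point: every additional independent line of evidence that agrees makes the “it’s all coincidence” position exponentially harder to hold.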
But I’m a common sense kinda guy. When looking at the preponderance of the evidence, I just don’t see a lot of flaws.
And when people such as yourself talk about some mysterious flaw, I have to wonder why we are not giving PhDs to Cabbage Patch dolls.
Take a look at the scientific method and try to figure out how skill is achieved. Then you might realize the basic flaw in your argument. The skill is robust. Your argument is weak, and this global warming event is human caused.
I don’t particularly want to get into an argument about what models can and can’t do, but as you’ve listed some examples, we may as well look at a couple of them.
Gavin says : “They have successfully predicted, the cooling after Pinatubo (before it happened) (and the change in water vapour, radiation, circulation), ”
And I would expect them to do so in some cases. Adding a forcing that doesn’t impact too severely on any “wrong bit” mightn’t cause the model to go too far wrong. And just to be clear, I’m not coyly implying any particular problem in this discussion, I’m talking generally.
You mention a couple of other successful feats of the models and they seem to be circular arguments to me. What else verified the satellite corrections needed to be made? Who was around measuring the ocean temperatures during the ice ages?
And then there are the hindcasting predictions you mention but aren’t the models built and tuned to most accurately model known and measured events?
I don’t want to dis the models too badly. As a tool they’re great. It’s just that they’ve now BECOME the science, and that is at best too risky a foundation to build climate science upon, and at worst potentially just plain wrong.
[Response: No, your last sentence is very wrong. The models, as in any science, are just attempts to synthesize and integrate what’s known about a system, and their purposes vary depending on the nature of the specific model in question. Climate science is way bigger than just modeling. Very fundamental–Jim]
One thing that should be borne in mind is that young trees take up vast amounts of water. Melbourne (Australia) is suffering from lack of water into its water storages because replanting, to replace logging, in watershed areas is taking out water that would otherwise flow or percolate through soils into creeks. This of course has been exacerbated by undertaking these activities during a drought.
If precipitation is expected to be more unreliable in the future, careful thought, much more careful thought, will have to be given to placement and rates of planting of trees in all sorts of places.
I’m curious what RC thinks of some statistical developments (hier. models) in paleoclimate. I have a hard time imagining the mean reconstructions will change much, but would it be a worthwhile investment to investigate its fidelity against Reg-EM (or whatever is popular now), instead of (say) focusing on regional reconstructions of the climate?
@Jim : “No, your last sentence is very wrong. The models, as in any science, are just attempts to synthesize and integrate what’s known about a system, and their purposes vary depending on the nature of the specific model in question.”
Well, I disagree; there are a number of papers that have been produced that perform “experiments” wholly within models.
[Response: Of course! That’s one of the main things you do with models. It doesn’t contradict what I said, nor prove your point about their singular importance in climate science.–Jim]
Re 68 aphillips – There was likely generally more CO2 and CH4 (due to less O2) earlier in Earth’s history. (I’m not familiar with the specifics, but knowing some basic principles: aside from the negative feedback of chemical weathering, which would tend to increase CO2 in response to cooler conditions over long time periods, and aside from the effects of evolution and of climate and topography on organic C burial, more CO2 could have been sustained by increased geologic outgassing.) There have been some very cold periods in the Proterozoic eon.
Re 102 TimTheToolMan – see RC’s two “FAQ” posts on climate models. At least generally, they are not tuned to fit climate changes; they are tuned to fit an average climate for some time period. There is some basic underlying physics that constrains the possibilities even if the tuning is wrong, and some parameterizations can be based on observations of processes occurring on shorter time scales (as I recall, weather).
TimTheToolMan said: “How about you ’splain to me what that has to do with a discussion centred on the risk associated with using model results to build our science upon?”
There is absolutely nothing other than models upon which to base science.
All science is models. Indeed, all human understanding is models. Hawking does a brilliant job explaining this in “The Grand Design” with Leonard Mlodinow (Kindle location 411). The following is a brief snippet; Hawking discusses this at some length. Models are in no way unique to climate science.
“There is no picture- or theory-independent concept of reality. Instead we will adopt a view that we will call model-dependent realism: the idea that a physical theory or world picture is a model (generally of a mathematical nature) and a set of rules that connect the elements of the model to observations. This provides a framework with which to interpret modern science. “
Comment by John E. Pearson — 22 Oct 2010 @ 9:46 PM
A new denialist argument I heard the other day floored me: that “water vapour was dropping”, which is contrary to the little I know about climate science.
What I’m more interested in is **why** the Paltridge 2009 paper shows what it does. It’s partially covered on that post:
“So why does Paltridge 2009 show decreasing humidity? The authors of Paltridge 2009 themselves point out the well-documented problems with radiosonde humidity observations in the upper troposphere. Comparisons of Paltridge 2009 with satellite measurements (NASA’s Atmospheric Infrared Sounder – AIRS) find the Paltridge 2009 reanalysis has large biases in specific humidity in the tropical upper troposphere. Additionally, Paltridge 2009 doesn’t show any large increase in specific humidity during the 1998 El Niño. Direct measurements indicate the tropical atmosphere does indeed moisten during El Niño events and such moistening is seen in the other reanalyses.”
Was there a post on RC that I may have missed on this topic, either the radiosonde issue or the dataset in question? Has there been any follow-up literature to this paper that goes into detail on the dataset?
@Jim : “It doesn’t contradict what I said, nor prove your point about their singular importance in climate science.–Jim”
You’re right of course, and my previous comment wasn’t really responding to your point. So I agree that not all (climate) science relies on model results. However, conversely, how many papers produced these days rely on models? The answer (which I don’t think would be disputed) is: a lot!
So my main point stands: if something were ever found to be seriously wrong with the models, or if our understanding of the climate that has been built into them were to change significantly, then a lot of science either collapses or at least becomes questionable.
The answer to this is not simple. I’m not offering a solution, simply putting the idea up for discussion.
Unfortunately it seems most people here can’t grasp the idea of risk and the concepts of risk mitigation. Then there are others who simply don’t believe it’s possible for the models to be “wrong” in the manner I’m describing.
[Response: Tim, all systems in which manipulative experiments are impossible rely on a symbiotic interplay of models and observations. Yes, in any system if you find a serious flaw then it’s of course going to weaken the power/skill of your model and send you back to the computer. The question is, what’s the likelihood of that happening, given the sum total of current knowledge. And to conclude that the science necessarily “collapses” or “at least becomes questionable” is not warranted without the specific details of what exactly you got wrong.–Jim]
Jim : ” Yes, in any system if you find a serious flaw then it’s of course going to weaken the power/skill of your model and send you back to the computer.”
Yes, but the risk multiplies when you take a previous step that is unverified but “works in the model” and build on it for the next step. If and when the model fails, it would potentially “take with it” all the steps back to the original. Such is the risk of doing unverifiable science.
[Response: Whoa–big misunderstanding right there. I didn’t say unverifiable, I said non-experimental. There’s an enormous difference. You do understand that, right?]
Again, I need to stress that this is in the very nature of using models, and there is no simple solution, but it needs to be addressed in some manner, because it doesn’t seem to be at the moment.
[Response: Very hard to follow exactly what you don’t get. Your last clause is way wrong–how and why models are used has had a huge amount of attention paid to it.–Jim]
Jim : “[Response: I know that–and it’s a misinterpretation of what I said. Do I agree that model results are unverifiable? Of course not! You don’t mean to tell me….do you?-Jim]”
No, what you said had nothing to do with my reply. I didn’t misinterpret you at all. I had in fact taken it for granted that model results were unverifiable, but clearly you disagree.
Frankly, I find that idea bizarre when you yourself essentially said they couldn’t be measured. I mean, you can’t measure the result of the idea that CO2 doubles, therefore the result can’t possibly be verified.
[Response: Tim this is sort of a classic example of why there’s so much confusion among the public. I said no such thing about measuring anything–you misunderstood what I said, because you appear to have certain preconceptions about what models are, and do. Since you go back and forth between generic and specific uses of the term “model” it’s impossible to follow at what level of generality you are talking–but overall you seem to be saying that all modeling is a flawed endeavor. The idea that model results are inherently “unverifiable” is just flat wrong (and that not even accounting for the difference between verification and validation, two very different things–you are actually referring to validation I assume). Maybe some others can enlighten you better than I.–Jim]
“Finding something wrong in a useful model isn’t a problem, it’s a development.”
…because my point has obviously been lost. The problem isn’t a “something wrong” that means the model needs a “corrective tweak”; the problem is that something major is found that breaks it, and much of the science that relied on it along the way.
To Jim, ” overall you seem to be saying that all modeling is a flawed endeavor.”
Of course. All GCMs are flawed. But that doesn’t mean they aren’t useful. There is an enormous difference between a flaw that is a not-quite-perfect simplification vs a flaw that is a fundamentally mis-modelled process.
Possibly I mean that models can’t be validated rather than verified, although I probably mean both. But if I confused you by having my terms mixed, then I apologise. At the end of the day, it’s semantics, because you appear to understand my point.
[Response: No, those particular terms aren’t the problem–I knew what you likely meant. The point is that unless you are evaluating a purely mathematical construct uninformed by anything in the real world, the idea that models cannot be validated is wrong–otherwise what would be the point in developing them? Yes, it’s of course implicit that you will want to validate only certain key concepts, not every detail. A key point here relates to the types of evidence that really “count” toward the overall validation–and that requires an intimate understanding of the model’s structure/function in relation to the suite of available observational data. That’s why Gavin fired off a list of things that GCM’s have predicted ~ right. What’s the chance of that happening if basic physics are not represented correctly? That, without including the longer term (paleo) data that specifically informs your earlier questioning about evidence related to the effect of CO2 changes–Jim]
We aren’t being told the truth by the standard media. I am getting nervous. Food will be a cause of war if there are any bombs or people left by that time. Dr. Dai’s article should be immediately pointed out to the Senate. We are in a really tangled mess. Fossil fuels are at the heart of both the wars in “Pipelineistan” and GW. Getting off of fossil fuels ends both tangled webs, except for grudges.
If Homo “Sap” survives this century, it will be by the skin of his teeth.
Comment by Edward Greisch — 23 Oct 2010 @ 12:22 AM
Jim : ” The point is that unless you are evaluating a purely mathematical construct uninformed by anything in the real world, the idea that models cannot be validated is wrong”
Well I guess that’s the point of this discussion. There is no point in continuing if you are going to disagree that there is any possibility that the models are significantly flawed.
Jim writes : “A key point here relates to the types of evidence that really “count” toward the overall validation–and that requires an intimate understanding of the model’s structure/function in relation to the suite of available observational data.”
Surely it also relates to the fact that the model is correct in the first place. It doesn’t matter two hoots if everyone agrees that process X is modelled correctly if process X doesn’t apply in nature.
In my opinion there is a very real possibility that the models are seriously flawed simply because the climate processes are so complex and many of them are not yet well understood.
[Response: Hopeless. You don’t get it and don’t want to. On to other conversations we thankfully go. My mistake everyone, I thought this guy was serious.–Jim]
They measured this radiation as it bounced back to earth from the moon, so the direction of this radiation was: sun-earth-moon-earth. Follow the green line (for CO2) in fig. 6, bottom. Note that it already starts at 1.2 µm, then one peak at 1.4 µm, then various peaks at 1.6 µm and 3 big peaks at 2 µm. You see all these peaks again in fig. 6, top.
This paper shows that there is absorption by CO2 between 0.21 and 0.19 µm (close to 202 nm): http://www.nat.vu.nl/en/sec/atom/Publications/pdf/DUV-CO2.pdf
There are other papers, which I can look for again, showing that CO2 also absorbs between 0.18 and 0.135 µm and between 0.125 and 0.12 µm.
We already know from the normal IR spectra (chemistry!) that CO2 has big absorption between 4 and 5 µm.
So, to sum it up: we know that CO2 has absorption in the 14-15 µm range, causing some warming (by re-radiating earthshine), but as shown above it also has a number of absorptions in the 0-5 µm range, causing cooling (by re-radiating sunshine). Clearly, this cooling happens at all levels where the sunshine hits the carbon dioxide, same as the earthshine. The way from the bottom to the top is the same as from the top to the bottom.
So, my question is: how much cooling and how much warming is caused by the CO2? How was the experiment done to determine this?
All things being equal as to weather and water, I am trying to establish the net effect of the radiative warming and radiative cooling of CO2. Bear in mind that the warming carries on 24/7 and cooling for 12 hours per day.
I hope therefore to see some results in W/m2/m3 air with —ppm CO2/24 hours, both for the warming and the cooling.
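A rough sense of scale for the warming-vs-cooling question can be had from the Planck function alone: CO2’s near-IR bands sit far out on the tail of the solar spectrum, while the 15 µm band sits near the peak of the terrestrial one. A back-of-envelope Python sketch follows; the band windows are illustrative choices of mine, and since actual absorption within each window is only partial, these fractions are generous upper bounds:

```python
import math

H, C, K = 6.626e-34, 2.998e8, 1.381e-23  # Planck, light speed, Boltzmann (SI)
SIGMA = 5.670e-8                          # Stefan-Boltzmann constant

def planck(lam_m, T):
    """Planck spectral radiance, W m^-2 sr^-1 m^-1."""
    return (2*H*C**2 / lam_m**5) / math.expm1(H*C / (lam_m*K*T))

def band_fraction(lo_um, hi_um, T, n=1000):
    """Fraction of total blackbody emission between lo and hi (trapezoid rule)."""
    a, b = lo_um*1e-6, hi_um*1e-6
    step = (b - a) / n
    s = 0.5*(planck(a, T) + planck(b, T)) + sum(planck(a + i*step, T) for i in range(1, n))
    return s*step / (SIGMA*T**4/math.pi)  # analytic total radiance of a blackbody

# Approximate near-IR CO2 band windows (µm) -- rough, for illustration only
nir_bands = [(1.35, 1.45), (1.55, 1.65), (1.9, 2.1), (2.6, 2.8), (4.2, 4.4)]
sun = sum(band_fraction(lo, hi, 5778) for lo, hi in nir_bands)  # solar spectrum
earth = band_fraction(13.0, 17.0, 288)                          # terrestrial spectrum

print(f"fraction of sunshine in CO2 near-IR windows: {sun:.3f}")
print(f"fraction of earthshine in the 13-17 um band: {earth:.3f}")
```

Even these generous shortwave fractions come out several times smaller than the longwave one; an actual answer in W/m2 requires line-by-line radiative transfer with measured line strengths, which is what tools built on the HITRAN database do.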
I read “Atmospheric CO2: Principal Control Knob Governing Earth’s Temperature”
In your experiment you are zeroing out all the non-condensable gases and all the aerosols.
Why all the aerosols?
The aerosols have a GH effect, OK, but they also have an effect on the clouds (is this the reason for the sharp rise in cloud cover?),
but what is the effect of this rise on the cloud feedback and, finally, on the temperature?
Can you read the article on my blog and the replies?
Thanks for the response, I know that the paper is crap… I have written more about it in Swedish. However, the “skeptics” here in Sweden had a call-out in newspapers about new articles that proved the IPCC wrong… this is the only one of those that has not been refuted in a scientific journal (Lindzen was another).
I understand that it seems like a waste of time for top researchers to refute these kinds of silly papers, but it is the easiest way to stop the “sceptics” from reaching the media (at least in Sweden and other parts of Europe)… and also to later show the journalists: look, that was what I told you, are you going to trust them again?
Re Septic Matthew #16, Mojave Desert: Nice pics, indeed, but the map at the end isn’t very useful. The desert has an area of 25 000 sq miles (65 000 sq km). What percentage of that does the proposed power plant occupy? How is this an environmental disaster greater than the extra CO2 we would pump into the air if we didn’t build the extra solar capacity?
I think possibly the fact that TimTheToolMan doesn’t grasp is that ALL models are “wrong” (as Hank says above). As a matter of necessity. They wouldn’t be useful if they were full scale replicas of the real world, because we wouldn’t be able to run them or do anything useful with them.
All models make some simplifications, they have boundaries beyond which they don’t operate. Models (model realisations, at least) must be finite.
And scientists know about these limitations. It’s fundamental to how models are used and the results interpreted. Knowledge of these limitations is absolutely key to getting useful, solid results from models in spite of their limitations.
So it’s no wonder scientists have little patience with people who say “but what if the models are wrong?”
Saw Gavin quoted briefly in the Scientific American article on climate change. (Interestingly this issue had more science writers on the feature pages than usual, as opposed to working scientists who usually fill it). Anyhow, I’d be curious for any expansion on your thoughts of Judith Curry’s work with the “Skeptic” blogs.
Seems to me that while in principle what she says is right, her problem with the IPCC stems from not getting that the IPCC reports aren’t really science documents, in that the whole thing had to be massaged to make many governments happy. The IPCC reports are pretty good, but again, speaking as an outsider and non-scientist, it seems there were a few too many cooks in that kitchen from the get-go (which is, to my mind, one reason there were as many errors as there were; I am an editor, and when 30 people edit the same thing – even when they have expertise in the subject matter, or perhaps especially when they do – the result isn’t always pretty). But maybe I am completely wrong, so if so, tell me.
Very good Didactylos,
I agree with you for once. Models, by definition, are a simplification of the whole. The models are only as good as our understanding of what is being modelled, and future models will change with greater understanding.
Not everyone’s climate models are the same. Hence, we get a bell-type curve of climate sensitivity with a hump between 2 and 4 (roughly). Several models fall on the tails of this curve. Obviously, not all these models can be correct, and the sensitivity could change as conditions change. Individually, models arrive at a single value, but the collective is often displayed as a curve. Of course the curve can be influenced by the choice of models to include, but that is all part of the science.
“TTT: Why can’t you specifically state what you think is flawed about specific models?”
Because I’m not trying to point to a particular flaw.
The discussion I’ve been wanting to have surrounds the implication to climate science if a significant flaw is found and the risk to science that the increasing reliance on models has created. It seems that discussion can’t move beyond the start gate on this.
Perhaps it’s my fault for not driving it more, but I was really interested in what other people thought without me influencing things too much.
Well, TTT, suppose the GCMs are “flawed” (or blind to a yet undiscovered factor) so as to dramatically underestimate the amount of AGW, and that the observed result under non-mitigation will be to eliminate most life on earth in an unexpectedly short time. What do you propose? You see, most “the models are flawed” proponents that we encounter assume that the models overstate the AGW threat. If you know models as well as you appear to claim, then you know that underestimation is probably as likely as overestimation.

Following that line appears to lead to this: (i) the GCM consensus tells us to ditch fossil fuels in an accelerated fashion; and (ii) an AGW underestimation tells us to ditch fossil fuels in an accelerated fashion. Augment those with the reality that fossil fuel supplies are finite, dwindling, and increasingly expensive. That would seem to lead to one bright conclusion: we should begin replacing fossil fuels with renewables in haste, and use part of the probable carbon budget to fuel that build-out. We’re going to have to do it eventually anyway, so why succumb to the carbonites’ delay-for-profit propaganda now, and why wait until fossil fuels are prohibitively expensive?

As much as the kooks like comfort, things ain’t as rosy under the current carbon regime as they would like to believe. For those who think the GCMs’ output will be nullified by the coming of the Rapture, or the arrival of aliens bearing “To Serve Man” books and bestowing fusion technology upon us, I suggest stopping their “Cheap Domestic Beer and Mystery Science Theater 3000″ diet.
…the idea that they might be seriously flawed is irrelevant and unnecessary.
I’m with you. They just don’t get it.
I mean, what if some day someone discovers that the human race doesn’t even exist? Why do they discount that possibility as if it’s irrelevant? If the human race doesn’t exist, there’s no way we could be causing climate change. It rips the A right out of AGW.
And yet people ignore this obvious fact. The fanatical alarmist believers want to take moderate but substantive steps to reduce CO2 emissions and forestall what appears will be a civilization altering event in the next fifty to one hundred years.
Paleoclimate studies, atmospheric physics, direct temperature observations, myriad indirect physical and biological observations resulting from temperature changes, and dozens of different, independent, complex climate models all agree on and point towards that same outcome.
But if humans don’t exist, none of that matters!
If I don’t exist, why should I be asked to change my behavior? Why should I in any way ameliorate the situation by impinging on an almost magical, fairy tale, unnecessarily pampered lifestyle that I enjoy, albeit almost certainly at the horrific expense of my children and grandchildren?
People just don’t get it. It’s so obvious, it’s as plain as the nose on their non-existent faces.
Dan H.: Models are just one source of estimates for climate sensitivity. If you look at this figure of climate sensitivity, you will see that the best estimates from multiple disparate lines of exploration all cluster around the 3 degree value.
What is the probability that all these completely different approaches have unrelated errors that all act in the same direction? Nope. It’s not likely. As research progresses, I expect to see the uncertainty narrow.
We must act on the basis that the best estimate is likely, and that lower or higher values are possible. We must not act on the hope that the likely value is the lowest and everything else is wrong. That would be foolish, and terrible risk management.
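To illustrate the point about unrelated errors: if the different lines of evidence really are independent, combining them shrinks the uncertainty below that of any single method. A quick Python sketch with invented numbers (plausible-looking stand-ins, not the actual published estimates):

```python
import math

# Illustrative (made-up) sensitivity estimates in degrees C: (best, 1-sigma)
estimates = {
    "instrumental": (3.0, 1.2),
    "paleo (LGM)":  (2.8, 1.0),
    "volcanic":     (3.1, 1.5),
    "GCM ensemble": (3.2, 0.9),
}

# Inverse-variance weighting is valid only if the errors are independent,
# which is exactly the point -- unrelated methods are unlikely to share
# a bias pushing them all the same way.
w = {k: 1.0/s**2 for k, (_, s) in estimates.items()}
wsum = sum(w.values())
combined_mean = sum(w[k]*estimates[k][0] for k in estimates) / wsum
combined_sigma = math.sqrt(1.0 / wsum)

print(f"combined: {combined_mean:.2f} +/- {combined_sigma:.2f} C")
```

The combined spread comes out tighter than the best individual method, which is why convergence of disparate approaches carries weight.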
TimTheToolMan said: “The discussion I’ve been wanting to have surrounds the implication to climate science if a significant flaw is found”
So why have you studiously ignored my comments (and those from others) that address this very question? Could it be that you don’t like that answer, and want to cling to a fantasy of huge but unsuspected errors?
That’s a nice conspiracy theory, but it just doesn’t fly in scientific circles.
Please go back and read my post at #40, Chris Colose at #24 and Ray Ladbury at #29.
aphillips 68: What about the weak early solar paradox, or whatever it’s called? Lindzen mentioned this. Apparently, the sun was something like 25% weaker, but the earth was not frozen, and therefore climate sensitivity is weak. What’s the story on that?
BPL: The “Faint Young Sun Paradox,” first explicitly stated by Carl Sagan in 1972. Yes, the sun was about 30% dimmer 4.5 billion years ago, so early Earth should have been frozen over. But geological data (e.g. 4.4 GY old zircons) indicate open water on the surface.
The paradox is resolved if greenhouse gases were much higher in the earlier atmosphere.
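The arithmetic behind the paradox is a one-line energy balance, T_eq = [S(1-albedo)/(4 sigma)]^(1/4). A small sketch, assuming (as a simplification) the present-day albedo in both epochs:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # present-day solar constant, W m^-2
ALBEDO = 0.3      # present planetary albedo (assumed unchanged, a simplification)

def t_equilibrium(solar_constant, albedo=ALBEDO):
    # zero-dimensional energy balance: T_eq = [S(1-a)/(4*sigma)]**0.25
    return (solar_constant*(1 - albedo)/(4*SIGMA))**0.25

t_now = t_equilibrium(S0)          # ~255 K even today, before greenhouse warming
t_faint = t_equilibrium(0.70*S0)   # ~30% dimmer young sun

print(f"effective temperature now:     {t_now:.0f} K")
print(f"effective temperature, 4.5 Ga: {t_faint:.0f} K (well below freezing)")
```

Both numbers sit below 273 K, which is why the open-water evidence demands a much stronger early greenhouse rather than a tweak to the sun.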
TTTM, significant flaws don’t particularly surprise or bother anyone.
Models aren’t expected to be all that good to be useful — they’re gross approximations.
Our atmosphere isn’t like a “slab model” — like a stack of pancakes, each one a little fluffier than the one below it. But the models work.
Say we find some mysterious unknown physics is causing the warming we see.
That would be a flaw. Well, but the models have been working, so we then look for a second mysterious unknown that has been cancelling out the warming we know comes from CO2. Or a handful of others.
You find one flaw in a model, the model keeps working, so the odds are you will find another flaw that more or less counterbalanced the first flaw.
You find a ten cents difference in your bank balance and a thousand dollar error in a check deposited, you have to keep looking.
You might find one countervailing error to explain the $9,999.90 difference. More likely you’ll find smaller errors in various directions before you get your checkbook model of your bank balance to be an exact match.
But your checkbook model was useful even with the errors.
You’ve got “overthrow the founder and the whole thing collapses” as your model of reality.
That model isn’t useful — for science.
It works for religion. It works for politics. It works in the schoolyard.
It doesn’t work in science. Science works with flawed approximate models because that’s what people can make. We don’t duplicate reality.
All models are flawed. Some models are useful.
Look that up. Read about the guy who said it.
…the implication to climate science if a significant flaw is found and the risk to science that the increasing reliance on models has created.
Ditching my previous sarcasm, the problems that everyone has with this are:
“…the implication to climate science…” are virtually none. Models are one small part of the science, and are themselves built upon myriad other aspects of science (physics, chemistry, biology, etc.). No one is going to find a flaw that suddenly, dramatically overturns the model results. It’s a denial fantasy, and ridiculous to even consider discussing.
“…if a significant flaw is found…” which is so unlikely as to be insignificant. The fact that you have no specifics, and are just thinking of a completely intangible hypothetical, is the first clue. This is a huge if, which you offer with no ideas or error bars whatsoever. There’s no reason to consider it.
“…increasing reliance on models…” is a silly denial meme with no foundation in fact. You’ll pick it up from WUWT and other denial-echo-chamber “sources” where it’s taken as incontrovertible truth only because everyone there repeats it over and over. Do a google scholar search on climate papers and look at how many there are, and how few of those are based on models. The number and complexity of interwoven disciplines and diverse areas of study in climate science is mind boggling, and to claim something like a “reliance on models” is absurd.
If you stop trying to convince everyone that you know something they’re too silly to realize, and instead invest time in actually studying the science and the huge volume of information available on the subject, you’ll be a lot better off.
Tin Man 117: the problem is that something major is found that breaks it. And much of the science that relied on it along the way.
BPL: Except that nothing of the sort has been found. And if it were found, you’d have to explain how the models have “wrongly” gotten so many predictions right. All the big, glaring flaws just happened to cancel out? What are the chances?
“if a significant flaw is found and the risk to science that the increasing reliance on models has created”
Your faulty assumption is that somehow those silly scientists are just blindly modeling their poor little hearts out without thinking about the direction their epistemological proclivities might be taking them. If you don’t get from the responses so far that this isn’t the case, perhaps you could spell out what you would consider to be convincing evidence that they actually know what they’re doing. Then you could be provided with an answer that you could either verify for yourself (given sufficient energy on your part) or you could prove once and for all to all and sundry what fools they’ve been for not bowing to the wisdom of the wing-nut-o-sphere. No goal post moving allowed.
At least it would serve as a point of conversation. Otherwise it looks like you’re just aimlessly messing about, fishing for something that doesn’t exist, but hoping you’ll get lucky.
henry P 120: I am trying to establish the net effect of the radiative warming and radiative cooling of CO2
BPL: Pick up a copy of John T. Houghton’s “The Physics of Atmospheres,” read through it, and work the problems. Then do the same with Grant W. Petty’s “A First Course in Atmospheric Radiation.” That should give you a pretty solid foundation in this.
Tin Man 131: The discussion I’ve been wanting to have surrounds the implication to climate science if a significant flaw is found and the risk to science that the increasing reliance on models has created. It seems that discussion can’t move beyond the start gate on this.
BPL: Why do you think that is? Sheer meanness on the part of the posters here? Or do you think there might be something flawed about “The discussion [you’ve] been wanting to have” itself?
Example: You go to a convention of Egyptologists and tell them, “I’d really like to have a serious discussion on how it would impact what we know about ancient Egypt, if it turned out that aliens built the pyramids.”
TTT: “Because I’m not trying to point to a particular flaw.
The discussion I’ve been wanting to have surrounds the implication to climate science if a significant flaw is found and the risk to science that the increasing reliance on models has created. It seems that discussion can’t move beyond the start gate on this.”
How can we discuss “a significant flaw” without any qualification as to what that flaw might be, what effects it would have or what portions of “the science” it would affect? Beyond, that is, saying “well, scientists will look at the affected portions, and figure out the implications”–which is more or less what someone already said upthread.
Comment by Rattus Norvegicus — 23 Oct 2010 @ 10:56 AM
Tim@131, The problem with your question is that it is entirely too broad. The implications of “a problem” depend entirely on exactly WHAT the flaw is. There is far more likelihood that a mistake would raise sensitivity estimates than that it would lower them.
The thing is that we know that the planet is warming well outside of anything we’ve seen in the record. We know ice is melting. We know it is placing strain on ecosystems. Without models, we are flying blind, and the only responsible thing to do would be to put the brakes on HARD.
What you seem to fail to comprehend is that climate models have been astoundingly successful in terms of both explanatory and predictive power. Because these models are based on physics rather than a mere statistical fit, it would be very hard to achieve such agreement erroneously.
Now, again, why do you think a failure of the models would be of benefit to the denialist side?
As to your statement: “Unfortunately it seems most people here can’t grasp the idea of risk and concepts of risk mitigation.”
Not only is this a totally arrogant statement, I’m at least somewhat confident that it is wrong. I think many here are examining risk mitigation in their considerations pertaining to climate change. . . or were you referring to something else other than risk mitigation?
Another fatal flaw in your reasoning follows:
Let’s say you are right and all the models, maths, physics are completely wrong, or even mostly wrong. That leaves a problem. It would mean that the earth is warming and we don’t have a clue why. . .
. . .unless of course you have an alternative theory? Maybe a large alien ship cloaked above our atmosphere with a large infrared gun heating our planet?
Please do enlighten us as to how the planet has started warming with no apparent explanation, if all our models are wrong.
Please help me out on this one. I’m desperate for your quantifiable answer that explains this.
Someone has to pick up the baton here. If a serious flaw was discovered, obviously many of the model results would have to be scratched. It is possible that the models which currently make up the tail ends (either higher or lower) could then become the norm. Alternatively, even these could be flawed in that they contained the error, but with a lesser contribution.
Just because the model would be flawed, does not mean that we would have to throw out all we know. In the pyramid case, they were still used as tombs. The political implication would probably be much greater than the scientific, and science would simply discard them and move on. Politically, it could be a death sentence, due to the heavy reliance on the models.
We will see if people respond with enthusiasm or meanness.
[Response: This is fantasy land. The probability that something was so wrong that everything would be tossed is infinitesimally small, and combined with your fervent desire to have all understanding about climate change risks be related to models (it isn’t), you have a recipe for unhinged speculation that is completely pointless. – gavin]
TTTM. Just had another look at your questions and everyone’s responses.
I think you’re showing signs of the “house of cards” problem, total collapse if something is found to be wrong. But science isn’t a house of cards, it’s a jigsaw puzzle. Someone can pick up a piece and say look, this will only fit if those red (or yellow or blue) pieces belong to the car in the background not the flowers in the foreground. If they’re right, a handful of pieces have to be re-fitted into the right places. The edges, the trees, the boats on the water are all absolutely fine. The picture as a whole won’t change even if some features become more or less prominent.
The house of cards approach would mean that the whole thing has to be done all over again. The jigsaw approach just means that we have to re-examine the parts that look a bit odd.
To Henryp (120) – Atmospheric greenhouse gases absorb and emit in the near-infrared and UV, warming the atmosphere but reducing the energy transmitted to the ground. The effect is small compared with the “greenhouse” forcing due to absorption of terrestrial radiation. More importantly, it is dominated by water vapor, with CO2 contributing a much smaller percentage. A useful quantitative source is the Kiehl/Trenberth paper, which includes the now famous energy budget diagram, at Earth’s Energy Budget
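For scale, the dominance of the longwave side of the budget can be caricatured with a one-layer “gray” atmosphere that is transparent to sunlight but absorbs a fraction eps of surface infrared. The eps value below is tuned to reproduce the observed ~288 K surface temperature, not independently measured, so treat this strictly as a sketch:

```python
SIGMA = 5.67e-8            # Stefan-Boltzmann constant, W m^-2 K^-4
S0, ALBEDO = 1361.0, 0.3   # solar constant and planetary albedo

def surface_temp(eps):
    # One-layer gray atmosphere, transparent to sunlight, absorbing a
    # fraction eps of surface infrared.  Balancing the surface and the
    # layer gives T_s^4 = 2 * T_eff^4 / (2 - eps).
    t_eff4 = S0*(1 - ALBEDO)/(4*SIGMA)
    return (2*t_eff4/(2 - eps))**0.25

print(f"no longwave absorption (eps=0): {surface_temp(0.0):.0f} K")
print(f"eps = 0.78 (tuned, not measured): {surface_temp(0.78):.0f} K")
```

With eps = 0 the model reproduces the ~255 K effective temperature; switching on longwave absorption alone recovers the ~33 K greenhouse warming, while shortwave absorption by GHGs never enters this leading-order picture.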
Henry@Barton Paul Levenson (on question 120):
I am looking for answers to two simple questions, and you advise me to buy two books. If you have the answers in the right dimensions, why don’t you give them? If you admit that you cannot provide me with the answers, then you have to feel very bad (if you are an expert on CO2).
( I am just a simple chemist living in South Africa trying to figure out what is wrong with my carbon footprint).
I think you don’t understand the difference between the confidence in the detectable signals, which are robust in nature and verifiable by multiple methods or disciplines, and the noise, where the models are clearly not yet able to discern all the minor signals among the various processes involved.
Your what-if scenario is a dream, a fantasy, a red herring. You have no scientific basis for it; it is a dead parrot. It’s not sleeping, it’s not pining for the fjords, it’s a dead parrot. It’s bleedin’ demised, expired, no longer alive. You can wave it in the air. You can throw it so as to simulate flight, but you can have no scientific basis for pretending it’s alive.
“the implication to climate science if a significant flaw is found and the risk to science that the increasing reliance on models has created.”
Then you adjust the models and go forward. What else were you expecting?
As for the risk, it’s a moot point. There is no magical “increasing reliance”. Climate science is not an experimental field, so they’ve always had to use models, whether it was a computer simulation or some guy with pencil and paper doing calculations to try and figure out what happens. Even a mathematical equation describing the behaviour of a block of wood sliding down an incline is a model in the broadest sense. You can’t say there’s an increasing reliance on something that has always been in use.
henryp, (@151 or so) your actual question earlier was:
“So, my question is: how much cooling and how much warming is caused by the CO2? How was the experiment done to determine this?”
Perhaps Barton’s response is a clue that the answer isn’t as simple as you imagine? Folks often ask for “the paper” that proves AGW, or that answers this or that question on the topic. Many times it turns out that there’s a whole lot more past work that has gone into the topic than any of us amateurs could have imagined.
Now, I don’t have the answer to your question; and, since I don’t even “play an expert on TV” I don’t feel bad about that. But it sounds as if you’re interested in results at a particular location, not some kind of global average, and I do know that the contribution of CO2 is going to vary a lot based upon location, time of day, seasonal variations, and the weather conditions (especially cloudiness.)
Personally, I think Barton’s response was potentially pretty helpful–and I’m certain it was intended as such.
By the way, the data you were using echoes what Svante Arrhenius used way back in 1896, funnily enough:
Global Warming Science And The Dawn Of Flight: Svante Arrhenius
That is not an adequate response. Water and air will both be warmer, upper atmosphere cooler, and the land/water difference will be greater in summer so the winds (e.g. from the Sahara to the N. Atlantic) will be greater and more dust-laden. It is not clear that any prediction follows from all that.
If the number of cyclones stays the same on average but the number of high-intensity cyclones increases, then the average ACE will increase, and the change in average ACE will be as detectable as the change in average temperature.
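Accumulated Cyclone Energy (ACE) is defined as 10^-4 times the sum of squared six-hourly maximum sustained winds (in knots) over each storm’s life at tropical-storm strength or above (>= 35 kt). A small sketch with invented wind records shows how average ACE rises even with the storm count held fixed:

```python
def ace(max_winds_kt):
    # ACE: 1e-4 * sum of squared 6-hourly max sustained winds (knots),
    # counted only while at tropical-storm strength (>= 35 kt)
    return 1e-4 * sum(v**2 for v in max_winds_kt if v >= 35)

# Two hypothetical seasons with the SAME number of storms; in the second,
# one storm reaches major-hurricane intensity (all wind values invented)
season_a = [[40, 55, 60, 50, 40], [35, 45, 50, 40]]
season_b = [[40, 55, 60, 50, 40], [35, 70, 110, 120, 100, 60]]

mean_a = sum(ace(s) for s in season_a) / len(season_a)
mean_b = sum(ace(s) for s in season_b) / len(season_b)
print(f"mean ACE, season A: {mean_a:.2f}")
print(f"mean ACE, season B: {mean_b:.2f}")
```

Because the wind enters squared (and intense storms also tend to last longer), a shift toward high-intensity cyclones moves the ACE average even when counts are flat.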
After hurricane Katrina, and again a few days ago from a Swiss re-insurer, there were warnings of increased frequency of major cyclones. The warning from the insurer was that New Orleans should improve its land use, including major reforestation. Like much good advice, that is good advice whether the frequency of major hurricanes does or does not increase.
If someone has a comprehensive and well written exposition of how increased temps are supposed to lead to increased/unchanged/decreased cyclone intensities, I would like to have the link. Nothing I have found on my own to date includes all the factors.
Readers here must have seen (or I can retrieve a link) the report that land-use changes have resulted in reductions (10% – 15%) in the average speeds of recorded winds around the world. Is that not a surprise?
“Cryo-Hydrologic (CH) warming is proposed as a potential mechanism for rapid thermal response of glaciers and ice sheets to climate warming. We present a simple parameterization to incorporate CH warming in thermal models of ice sheets using a dual-continuum concept, which treats ice and the cryo-hydrologic system (CHS) as overlapping continua with heat exchange between them. The presence of liquid water in the CHS due to surface melt leads to warming of the ice. The magnitude and time-scale of CH warming is controlled by the average spacing between elements of the CHS, which is often of the order of just 10’s of meters. The corresponding time-scale of thermal response is of the order of years-decades, in contrast to conventional estimates of thermal response time-scales based on vertical conduction through ice (∼102–3 m thick), which are of the order of centuries to millennia. We show that CH warming is already occurring along the west coast of Greenland. Increased temperatures resulting from CH warming will reduce ice viscosity and thus contribute to faster ice flow.”
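The time-scale contrast quoted in that abstract follows from the standard diffusion scaling, tau ~ L^2 / kappa. A quick sketch (the thermal diffusivity of ice is approximate, and the length scales are illustrative):

```python
KAPPA_ICE = 1.1e-6        # thermal diffusivity of ice, m^2/s (approximate)
SECONDS_PER_YEAR = 3.156e7

def conduction_timescale_yr(length_m):
    # characteristic heat-diffusion time: tau ~ L^2 / kappa
    return length_m**2 / KAPPA_ICE / SECONDS_PER_YEAR

# spacing between cryo-hydrologic elements: tens of meters
print(f"CH spacing,  30 m: {conduction_timescale_yr(30):.0f} years")
# vertical conduction through the ice column: hundreds of meters
print(f"ice column, 100 m: {conduction_timescale_yr(100):.0f} years")
print(f"ice column, 500 m: {conduction_timescale_yr(500):.0f} years")
```

The quadratic dependence on length is the whole story: shrink the conduction path from hundreds of meters to tens of meters and the thermal response time drops from centuries or millennia to years or decades, as the abstract states.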
The Gordon Bonan paper is interesting, and it might seem to support the argument here made at realclimate by a commenter against forestation in deserts. That argument asserted that the albedo change caused by forests would cancel heat balance benefits of the CO2 sequestration.
Bonan’s statement refers to Northern Boreal forests which would be replaced by snow, were they to be destroyed. Thus, the albedo of snow would be compared with that of forests and the conclusion, “The climate forcing from increased albedo may offset the forcing from carbon emission so that boreal deforestation cools climate (8)–” can stand.
The commenter does not show awareness of the integrative effect of forests, since the amount of CO2 sequestered in the forest increases with time and albedo change is a roughly one time change.
Bonan may understand this since he is discussing fully formed forests that might be mowed down at one time, so the CO2 dumped into the atmosphere is a one time event and so is the albedo change. However, Bonan is talking about snow, not dirt.
For the massive forest project I am discussing, the albedo change would be roughly a one time event as the trees first shade the ground. From that point on, the addition of forest mass is an increment each year that sequesters an increment of CO2.
[Response: No. A change in albedo affects the radiative forcing as long as it remains in the changed state. It’s not a “one time event”. As for the statement on “dirt”, the albedo of much of the arid lands is pretty high, due to low vegetative cover, low albedo of seasonally dry grasslands, and/or low organic matter in the soil. Such areas will have large albedo changes if forested, especially to conifers. You also have to consider the effects of increased evaporation (cooling of the land surface but who knows what effects due to water vapor and clouds). Bonan’s take home point is that these conversions ain’t so simple to evaluate as might first appear.–Jim]
I wonder if Bonan was including the loss of the underground root structure in his statement, that is, in the work referenced by his statement. If not, the reference might be very much in error.
The study linked here said 80% of the carbon is in the root structure, and if that is so, my calculations could be showing more land required than would actually be needed.
Sorry that I am not keeping up with this very well, but I will try to find that link, regarding work in Tibet if I recall correctly.
“In my opinion there is a very real possibility that the models are seriously flawed simply because the climate processes are so complex and many of them are not yet well understood.”
This seems to be his main point. But unless he comes up with something specific which might be a flaw, there really isn’t any way to deal with this complaint. He seems to be arguing that since it is all so complicated, one should be wary about relying on models which you can’t be completely sure about. And it seems to me that this is really what he wants to discuss.
Okay, let me give it a try. Climate is very complex and we don’t understand it as well as we would like. So we should not be doing large scale experiments on the atmosphere, the results of which we can’t predict with any certainty. But adding to the atmosphere greenhouse gases in the amounts we have been doing, and seem determined to continue to do, is just such an experiment. Anyone who is conservative about fiddling with our world in dramatic ways should be worried about that.
I try to hit the main points of the water and tree project in the following:
Standing Forests Solve Global Warming At No Cost
The game winning answer to global warming is to create standing forests, where every ton of newly existing forest mass, on a sustaining basis, compensates by CO2 capture for the burning of a ton of coal, approximately. Key to this solution is distribution of water in North America on a continental basis.
I have been dismayed by the promotion of electric vehicles, with its implicit increased use of electricity and the associated increase in CO2. Viable, large-scale solutions to this problem have been absent. But I have been shocked by the planning put forward by the US EPA ** regarding ‘carbon’ capture and sequestration (CCS), where the capture cost burden per ton of coal used would be up to $180-$320. This would be for capture of CO2 only, with the additional costs of transporting it and pumping it into caverns not addressed, but acknowledged as additional expense.
Thus motivated, I looked for a better solution, and found that China seems to have taken the lead over our environmentalists in this very practical matter. A year ago, in a speech about how China was planning to react to the global warming problem, President Hu spoke of “forest carbon”. ***
It is not a big step to think that this kind of solution would be possible in North America, Brazil perhaps, and other places yet to be identified. It is a big step to think big about water distribution that would be needed to accomplish CCS on the needed scale, but in North America this is within reach, with the action of wise government assumed. Of course there would be a need for due diligence in protecting Northern ecosystems, as well as due deference to rights of others. The goal of CO2 mitigation is not just our concern, so there would seem to be motivation for Canada to lend their essential support to such a project.
Every ton of forest mass, that exists on a sustaining basis, sequesters CO2 sufficiently to compensate for the burning of a ton of coal, approximately. As it grows, it captures that CO2 from the atmosphere. Mature forests must be maintained and harvested wisely, and new forests must continue to grow.
Using minimally productive land in selected regions, a fifty year project should be possible, where fifty years of coal fired power plant operation would be supported. In this time we would need to solve the problems of nuclear waste, so that there could be an eventual transition to that form of energy. During this fifty years, we would also need to work toward minimizing the amount of energy needed for our vehicles.
This forest project, along with ancillary agricultural development, would be quickly self-supporting. We know about the agricultural results from the California Aqueduct project, implemented in 1963 through the California Central Valley. The forest part would be something new.
The immediate benefit of such a project would be high quantity job creation, but up front investment in the permanent forest infrastructure would be repaid over the long term of highly productive operation. A large cadre of trained workers for forest management, a large expansion of agricultural operations, and a long term flow of export products would lift us from our current employment debacle.
We see this as a public project that should appeal to all political strains, since it would create a backbone infrastructure that would set the stage for use of energy to continue functioning of our developed world without damage to the global environment.
Implementing such a concept would require much detail in its actual design, but feasibility in general is not in question. This would be a massive federal project that must be handled by government, both in regard to international water negotiations and financial arrangements.
Is there a political force that can handle such a project?
** The announced plan by the EPA is to require ‘best available technology’, and their recent report (Sept 2010) said ‘carbon’ capture would cost up to $95 per ton of CO2. Working this out in terms of the burden on the use of a ton of coal shows that the burden for a ton of Powder River Basin coal (about half elemental carbon by weight) will be about $180 per ton of that coal; higher-carbon coal would incur a proportionately higher burden, up to around $320 per ton.
[Response: OK, you’ve now repeated this mantra several times. As I and many others have mentioned, there are enormous potential ecological, hydrological, economic and social issues involved in such grandiose geo-engineering schemes, which you gloss over or ignore, most of which I highly doubt you are even aware of. The history of land and ecosystem “management” is littered with the residue of the unforseen consequences of similarly ill-conceived schemes which appeared, at the time, to be simple, “can’t fail” ideas. Ecosystems are exceedingly complex things that defy easy analysis or prediction, and make no mistake, you are talking about massive rearrangements to the earth system with almost entirely unknown consequences, logistical issues, feasibility questions etc. No more pie in the sky proposals sans detailed discussions of biophysical limits, system effects, costs, etc., supported by defensible reference to the scientific literature.–Jim]
Absorption of solar radiation within the atmosphere would cool the surface, but if it occurs below the tropopause, it still brings heat below the tropopause – it would tend to reduce convection without much effect on surface temperature (on average). Absorption above the tropopause has a cooling effect, but the effect is reduced by the increased downward LW flux from the upper atmosphere that occurs in response to warming that is due to the solar heating there. Absorption of radiation that would otherwise have been reflected to space reduces albedo and thus results in greater solar heating.
The surface and troposphere are generally coupled by convection that tends to maintain some relative temperature distribution, so radiatively forced warming or cooling at any vertical level below the tropopause tends to have an effect that is vertically spread out from the surface to the tropopause. Thus, surface and tropospheric temperatures tend to respond to changes in the net radiative flux (LW+SW) at the tropopause level (changes in the incoming and outgoing energy). The stratosphere responds to radiant heating or cooling of the stratosphere, which is the difference between the fluxes at the bottom and top of the stratosphere.
The LW effect of CO2 dominates over the SW effect of CO2 by a relatively large margin – sorry I don’t know the numbers for CO2 offhand; there is a paper I know of which could help and I’ll try to find it.
Though CCS technologies exist, “scaling up” these existing processes and integrating them with coal-based power generation poses technical, economic, and regulatory challenges. In the electricity sector, estimates of the incremental costs of new coal-fired plants with CCS relative to new conventional coal-fired plants typically range from $60 to $95 per tonne of CO2 avoided (DOE, 2010a). Approximately 70–90 percent of that cost is associated with capture and compression. Some of this cost could be offset by the use of CO2 for EOR for which there is an existing market, but EOR options may not be available for many projects.
The curious failure of this study is that the cost is given in terms of tons of CO2 rather than tons of coal, so that on a quick reading it might seem tolerable. (See the footnote of my 4:15 comment.)
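The conversion described in that footnote is easy to script (a sketch; the carbon fractions are round illustrative numbers from the comments, roughly 0.5 for Powder River Basin coal and 0.9 for high-carbon coal):

```python
# Convert a CCS capture cost quoted per ton of CO2 into a burden per ton of coal.
# Carbon fractions are illustrative round numbers from the comments above,
# not measured values.

CO2_PER_CARBON = 44.0 / 12.0   # mass ratio of CO2 to the carbon it contains

def burden_per_ton_coal(cost_per_ton_co2, carbon_fraction):
    """Dollars per ton of coal burned, given a capture cost per ton of CO2."""
    return cost_per_ton_co2 * carbon_fraction * CO2_PER_CARBON

print(round(burden_per_ton_coal(95.0, 0.5)))   # PRB coal: ~174, i.e. "about $180"
print(round(burden_per_ton_coal(95.0, 0.9)))   # high-carbon coal: ~314, "up to ~$320"
```

Each ton of carbon burned yields 44/12 tons of CO2, which is what makes the per-coal burden roughly twice the quoted per-CO2 cost for half-carbon coal.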
In other settings these are called hypotheses, often with parameters to be adjusted for best fit to the data. That is, the data are treated as signal to be explained by the hypothesis plus random noise. Fairly obviously, a model with more parameters ought to give a better fit, in trade for a more complex hypothesis, there being more parameters.
To resolve this tension between fit and the complexity of the hypothesis, various information criteria are used to determine a “best” hypothesis. For example, the Akaike information criterion: http://en.wikipedia.org/wiki/Akaike_information_criterion
which penalizes parameters less than other popular information criteria.
For the analysis of some time series, just this suffices to select the “least wrong” (usually called “most informative”) model.
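As an illustration of the fit-versus-complexity trade-off (a sketch; for least-squares fits with Gaussian noise, AIC reduces to n ln(RSS/n) + 2k up to an additive constant, and the specific RSS values below are made up for the example):

```python
import math

def aic_least_squares(n, rss, k):
    """AIC for a least-squares fit: n*ln(RSS/n) + 2k (constant terms dropped)."""
    return n * math.log(rss / n) + 2 * k

# Two candidate hypotheses fit to the same n = 100 points: the second has one
# more adjustable parameter and a slightly smaller residual sum of squares.
simple = aic_least_squares(100, 12.0, 3)
complex_ = aic_least_squares(100, 11.9, 4)

# The extra parameter only "pays for itself" if it lowers the AIC; here the
# small improvement in fit does not justify the added complexity.
print(simple < complex_)  # True: the simpler hypothesis is preferred
```

A large enough improvement in fit does overcome the 2k penalty, which is how AIC selects a genuinely better hypothesis rather than simply the smallest one.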
Comment by David B. Benson — 23 Oct 2010 @ 6:33 PM
156 Maya said “Even a mathematical equation describing the behaviour of a block of wood sliding down an incline is a model”
Of course it is, as is measuring the height of a column of mercury in a glass tube and calling the height “temperature” and then claiming that the length of the mercury column is related to what we perceive as “warmth” and also that that same thing, a length called “temperature”, is related to the distribution of speeds of “molecules” that make up what we call a “gas”. Concepts such as “molecules”, “temperature”, etc are all models and they work like gangbusters. Models are what we use to make sense of the data stream that we identify as “reality”. I suppose that most posters here think this is pedantic nonsense. In point of fact there is no way to interpret any experimental observations except within the context of already existing models. To complain that climate science generically relies on models is unreasonable. Questioning whether specific features of the models that we rely upon are adequate for specific tasks is what scientists do and is entirely reasonable.
Comment by John E. Pearson — 23 Oct 2010 @ 6:34 PM
an ideal place for my dumb question that vexes me, if anybody can offer words of wisdom thanks very much.
Some 60% of produced CO2 is sequestered by natural sinks.
The other 40% goes into the atmosphere, where it is subject to a complex series of half-lives averaging some 100 years.
This ratio has been unchanged over recent decades.
How does any particular packet (or mole) of CO2 know whether it’s destined for the A team or the B team?
The only explanation I can dream up is that the natural sink and the atmosphere are in equilibrium,
in which case either the natural sink is infinitely large or the ratio must change.
[Response: I agree with much of what you’re saying, but be very careful with conclusions on the topic of global effects of forests on the climate system. Hopefully we can do a post on the topic because people are clearly interested. In the mean time, everyone interested should definitely read this paper by Gordon Bonan, which has already been cited 168 times in ~ 2+ years.–Jim]
Actually, I do apologize. I had in my head the thought, then forgot it as I typed, of mostly regrowing former old growth forests and replacing single trees in parks and such with food forests, which sequester a lot more biomass. I do recall, for example, that evidence points to a positive feedback, possibly due to reflectivity, from additional tree growth at high latitudes.
Still, in the short term, people planting trees and food forests can only help. It won’t grow into a global mass planting until things are far more dire and governments fund it.
James (169) – Your question isn’t dumb. It’s perceptive. The 100 year average you quote is probably an underestimate (see #56), but it makes the point of a long interval needed for elevated CO2 concentrations to subside. The basic answer to your question is that the natural sink is not infinite, but it is very large (particularly the ocean sink), and so even a large input of CO2 into the atmosphere will change the ratio only slightly. Ultimately, however, the capacity of the ocean to absorb more CO2 diminishes, and so the airborne fraction will increase – a disturbing thought considering the consequences of rising CO2 based on the current fraction.
Finally, over many thousands of years, elevated atmospheric CO2 returns to baseline (restated more fully in my next comment).
Comment by David B. Benson — 23 Oct 2010 @ 7:11 PM
James (169) – Your question is perceptive, not dumb. Natural sinks are not infinite, but the ocean sink is very large compared with the atmosphere, and so substantial elevations of atmospheric CO2 change the ratio only minimally in the short run. Over longer intervals, the airborne fraction will increase as sink capacity diminishes, exacerbating the greenhouse warming due to anthropogenic emissions. Ultimately, over hundreds of thousands of years (see #56), elevated CO2 levels, if unperturbed, return to baseline, accompanied by a return to pre-baseline rates at which oceanic CO2 (in the form of carbonates) exits the climate system into the Earth’s interior via subduction zones. This relieves the oceanic sink of the disequilibrium between uptake and removal of CO2, but it takes a very long time.
Thank you all for your responses to my question regarding the implications of a significant model flaw. Some of the answers were very enlightening and I appreciate your time in considering this as you have.
120 Henry P. Part of your initial comment concerned the radiation path from the moon to the earth. So just in case you think extra CO2 has a significant shielding effect on lunar infrared, an estimate of the magnitude is useful. In visible light the full moon is roughly half a million times dimmer than the sun (as seen from earth). IIRC the astronomical magnitudes are -26.7 and -12.5, which would yield a flux ratio of 10^(0.4*14.2). Since the lunar albedo is under ten percent, we can bump up the lunar IR luminosity by roughly 10 times, but that still puts it at roughly fifty thousand times weaker than the solar flux. So lunar radiation to the earth is a tiny overall effect.
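The magnitude arithmetic above can be sketched in a few lines (the apparent magnitudes are the comment’s recalled values; modern figures differ slightly):

```python
# Astronomical magnitudes: a difference of dm corresponds to a flux
# ratio of 10**(0.4 * dm).
m_sun, m_moon = -26.7, -12.5        # apparent magnitudes recalled in the comment
dm = m_moon - m_sun                 # 14.2 magnitudes
visible_ratio = 10 ** (0.4 * dm)    # ~4.8e5: the moon is ~half a million times dimmer

# Lunar albedo is under 0.1, so the moon's thermal (IR) emission is roughly
# 10x its reflected light; its IR flux at Earth is still tiny next to sunlight.
ir_ratio = visible_ratio / 10       # ~5e4: tens of thousands of times weaker
print(f"{visible_ratio:.2e} {ir_ratio:.2e}")
```

So even after crediting the moon’s full thermal emission, lunar radiation reaching Earth is four to five orders of magnitude below the solar flux.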
137 Last I had heard, the faint young sun paradox is still not easy to solve. If we assume a doubling of CO2 is roughly similar to a 1 percent increase in solar luminosity (maybe it’s more like 2%, but that doesn’t affect the argument much), then we would need approximately 30 doublings of CO2, which is far beyond anything reasonable. Now greenhouse gases work better in combination: add species with non-overlapping absorption bands and it is a lot easier to get a given degree of warming. So we probably had a lot of methane, and perhaps SO2 as well. But I don’t think we have a good handle on the composition of the early atmosphere, and it takes a lot of greenhouse effect to overcome the faint young sun.
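A rough cross-check of the scale of the problem, using the common simplified fit ΔF ≈ 5.35 ln(C/C0) W/m2 for CO2 forcing (the ~25% faint-sun deficit and 0.3 albedo are standard round numbers, not from the comment; this gives fewer doublings than the 1%-per-doubling assumption above, but the required CO2 increase is still enormous):

```python
import math

S0 = 1361.0                        # present solar constant, W/m^2
ALBEDO = 0.3                       # planetary albedo (round number)
absorbed = S0 / 4 * (1 - ALBEDO)   # global-mean absorbed solar flux, ~238 W/m^2

deficit = 0.25 * absorbed          # early sun ~25% fainter: ~60 W/m^2 shortfall
per_doubling = 5.35 * math.log(2)  # ~3.7 W/m^2 of forcing per CO2 doubling

doublings = deficit / per_doubling
print(round(doublings))            # ~16 doublings, i.e. a CO2 increase of ~2**16
```

Even this more generous per-doubling forcing implies a CO2 increase of tens of thousands of times, which is why combinations of greenhouse gases are usually invoked.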
169: You were questioning the stability of the fraction of anthropogenic CO2 that is retained in the atmosphere versus absorbed into non-atmospheric reservoirs. Do the following thought experiment:
Atmosphere, plus a linear but relatively small reservoir which equilibrates quickly. The total CO2 should increase proportionately in both reservoirs (that’s what I meant by a linear reservoir: total sequestered CO2 is proportional to atmospheric concentration). So small reservoirs with short cycle times would tend toward the observed fact.
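That thought experiment can be sketched numerically (a toy sketch, not a carbon-cycle model; the ratio beta = 1.5 is chosen only so the split matches the observed ~40/60 partition):

```python
def airborne_fraction(cumulative_emissions, beta=1.5):
    """Airborne fraction for a linear, fast-equilibrating reservoir.

    The reservoir holds S = beta * A, where A is the atmospheric excess, so
    A + beta * A = cumulative emissions, and the airborne fraction is
    1 / (1 + beta) no matter how much has been emitted.
    """
    atmospheric_excess = cumulative_emissions / (1.0 + beta)
    return atmospheric_excess / cumulative_emissions

# The partition is the same for small and large cumulative emissions, which is
# why no individual CO2 molecule has to "know" which team it joins.
for emitted in (10.0, 100.0, 1000.0):       # arbitrary units
    print(emitted, airborne_fraction(emitted))  # fraction is always 0.4
```

The real ocean sink is not linear, of course, so as its capacity saturates the airborne fraction is expected to rise, as Fred’s reply above describes.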
SW and LW forcings are calculated by LBL codes (and compared to values calculated in AOGCMs),
but not for global average conditions!
They use a climatological midlatitude summer (MLS) atmospheric profile (temperature, ozone, water vapor; except when a water vapor change is the ‘forcing’ being analyzed).
For SW fluxes, the surface is a Lambertian (diffuse) reflector with wavelength-independent albedo 0.1. At TOM, the incident solar radiation is 1360 W/m2 (this doesn’t include the 6.33 W/m2 of solar flux found at wavelengths longer than 5 microns) at a zenith angle of 53 degrees, which works out to about 818 W/m2, roughly 2.4 times the global average whole-spectrum incident solar flux.
For LW fluxes, the surface is approximated as having an emissivity of 1 and a temperature of 294 K.
“The spectral interval of 4 to 5 μm is contained in both shortwave and longwave calculations, but it is not counted twice since the shortwave codes omit thermal emission and the longwave codes omit solar flux.” (paragraph 15)
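As a quick check of the zenith-angle numbers in that protocol (a sketch; the 1360 W/m2 and 53 degrees are taken from the quoted setup):

```python
import math

s0 = 1360.0                                    # incident solar flux at TOM, W/m^2
incident = s0 * math.cos(math.radians(53.0))   # flux on a horizontal surface
global_mean = s0 / 4.0                         # spherical average of the same flux

print(round(incident, 1))                # ~818.5 W/m^2
print(round(incident / global_mean, 2))  # ~2.41 times the global average
```

Dividing by 4 converts the beam flux to a global mean because a sphere’s surface area is four times its cross-section.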
from Tables 8 and 9, mean values of the LBL calculations:
instantaneous forcings (decrease in net upward flux per unit area before any climate response, W/m2):
column 1: surface
column 2: 200 mb (near tropopause level**; mb = hPa)
column 3: TOM (top of model, which I assume is also effectively TOA, top of atmosphere)
values derived from first three columns:
column 4: TOM-200 hPa (forcing on the stratosphere, if tropopause is at 200 mb),
column 5: adjusted 200 hPa forcing (assuming 5/11*** of the stratospheric forcing is transferred to the 200 hPa flux)
**”The quantities requested from each participating group include (1) net shortwave and longwave clear-sky flux at the top of the model, (2) net shortwave and longwave clear-sky flux at 200 hPa, (3) net shortwave and longwave clear-sky flux at the surface, and (4) (optionally) net shortwave and longwave clear-sky fluxes at each layer interface in the profile. The reason for performing calculations at 200 hPa rather than the tropopause is to insure consistency with the radiative quantities requested as part of the climate change simulations. Since not all modeling groups are prepared to compute fluxes at a time-evolving tropopause, the Working Group on Coupled Modeling (WGCM) of the World Climate Research Programme (WCRP) and the IPCC have requested fluxes at a surrogate for the tropopause at the 200 hPa pressure surface. The precise choice of tropopause can affect forcings by up to 10% [Myhre and Stordal, 1997].” (paragraph 13)
*** – that’s an approximation that I’m using with some justification but absent detailed knowledge (educated approximate guess)(to the extent that most of the emission leaving the stratosphere is at wavelengths where the stratosphere is optically thin, a first approximation can be made that 1/2 of the change in stratospheric emission is in upward TOM flux and the other half is in downward tropopause-level flux; both greenhouse gas-induced stratospheric cooling and SW heating changes should tend to be larger in the upper stratosphere than in the lower stratosphere, while the same temperature change has a greater effect on radiant fluxes when the initial temperature is warmer; thus, for Earth’s stratosphere, the general tendency should be for the tropopause level flux to change less than the TOA flux for stratospheric adjustment. The total emission from the stratosphere leaving the stratosphere now is divided up with about 5/11 going downward and about 6/11 up to space (Hartmann, “Global Physical Climatology”, 1994) – I don’t expect the same proportion to apply to stratospheric adjustment, but I start there in the absence (on my part) of more work/info).
(Referring to the effect of stratospheric adjustment on tropopause-level forcing; I haven’t checked the source, but I’d assume this is for global average forcings: “The effects of adjustment on forcing are approximately -2% for CH4, -4% for N2O, +5% for CFC-11, +8% for CFC-12, and -13% for CO2 [IPCC, 1995; Hansen et al., 1997].” (paragraph 11))
Much of the discussion on climate blogs is about things like adequacy of models, outcomes of certain chemical balances in the atmosphere, characteristics of CO2 in the atmosphere and oceans, etc.
I have tried to get an answer to a basic question: Is anybody thinking or planning for a strategy – individual or national – if every effort at reducing green house gas fails (as seems likely I am sorry to say) and we go into that uncharted place where temperature rises to impact man and nations in a big and serious way.
Such as a significant and perhaps sudden rise in sea levels due to large ice masses sliding off Greenland and/or Antarctica, or multiple and larger episodes worldwide of what we saw in Russia this summer: extreme heat, fires, and the loss of 25% of the wheat crop.
Is contingency planning in order for those probable events? Some suggest large impacts of global warming are far off. My gut feeling, based on what have proved to be overly conservative projections from sources like the IPCC, is that big impacts of warming will come sooner, perhaps much sooner, than we think. To my knowledge we are completely unprepared if large impacts set in suddenly. A significant onset of such events as mentioned above could bring on another related impact: as people and nations feel the effects, industrial activity such as generation of electricity could slow significantly if plants shut down. We had a shutdown of air traffic after 9/11, and that small change (fewer planes, hence fewer contrails, in our skies) was reported to produce an immediate increase of about 1 degree C in the diurnal temperature range over the US.
Does anyone know of significant planning going on for events such as above?
re: effect of absorption of incoming near-infrared by CO2,
and further to Patrick 027’s conceptually helpful discussion at #167,
Browsing a bit, I find references saying near-infrared absorption of CO2 is a significant heat source in the mesosphere. If I understand correctly, the dominant effect on the mesosphere of increasing CO2 is to cool it through increased emission at 15 µm. This mesospheric cooling will be counteracted to some degree by increasing absorption in the near-infrared, but according to Fomichev et al. 2004, not significantly so for a doubling of CO2 at the surface (Fomichev et al. 2004, doi:10.1029/2004GL020324).
As a layman, I can’t assess this finding. And despite Patrick’s helpful discussion above, I’m confused as to what, if anything, this might mean for energy flux and temperature on the surface. But if increased NIR absorption by increased CO2 does not significantly offset increased greenhouse cooling in the mesosphere, where CO2 NIR absorption is an important term, is there any reason to expect it to significantly offset greenhouse warming in the troposphere?
HenryP 153: I am looking for answers to two simple questions; you advise me to buy two books. If you have the answers in the right dimensions, why don’t you give them? If you admit that you cannot provide me with the answers, then you have to feel very bad (if you are an expert on CO2).
BPL: I didn’t “give you” the answers because the answers are not simple. If you want to understand this stuff you’ll have to do some work. Sorry about that.
“I dont want to dis the models too badly. As a tool they’re great. Its just that they’ve now BECOME the science and that is at best too risky to building climate science upon and at worst potentially just plain wrong.”
Even as a non-scientist I know that to be wrong.
Some people put an emphasis on the models because they are a tool that tries to predict based on the ‘science’ programmed into them. There is a fascination for prediction.
There are just too many other areas of science (and ongoing observations) that agree with the trajectory the models predict. Even if you removed the models, people would be analysing observations and probably making even wilder predictions!
Which is a thought. Models might actually be a moderating influence on predictions.
BTW I like the word trajectory, I think it is a useful analogy when describing a general direction that is predictable within a margin of error.
Believe it or not, folks here are usually happy to try and help answer a polite question (particularly on an open thread–BTW Thanks, Gavin, Jim et al.).
I think you may be suffering from an illusion that is common among nonscientists and even among some scientists, namely that the purpose of a model is to yield accurate answers. This isn’t really the case. If we use a complicated enough model, we can always match a data set, and that’s how you get the Ptolemaic universe, astrology and homeopathy.
Rather, the purpose of a model is to yield insight into the phenomenon under study. In general, we know a model is successful if it reproduces the behaviour of the system–that’s the Goodness of Fit. However, even a model that fails to yield accurate agreement may be successful at helping us understand the system.
George Box said, “All models are wrong. Some models are useful.”
It is also important to understand the difference between statistical models (essentially data fitting) and dynamical or physical models. The former are prone to overfitting, so we have to have tools to prevent this (e.g. various information criteria). The latter are very unlikely to yield accurate results if we have incorrect physics. However, dynamical models are only possible when you understand the system being modeled very well. The successes of GCMs demonstrate that we understand climate sufficiently well to reproduce the global, long-term behavior of the system. We still have a way to go before we understand the system on short timescales or regionally. Hope this helps.
Organizations with planning authority largely see global contingencies as someone else’s problem. There is no global authority and national and regional authorities are stuck in a nationalist morass. This is not a problem specific to climate issues.
States such as the Netherlands which are particularly vulnerable are more prone to responsible planning. I recommend you consult the Dutch general planning report aimed at the public.
Organizations in charge of “national security” have also been more proactive than others, but their focus is of course not on constructive solutions.
Of particular concern are the thorny political problems associated with mass migration, which is pretty much the only way that climate catastrophes could be adapted to without loss of life on a truly massive scale (assuming current technology and resources). It would take a long time to establish an acceptable framework which would allow migration on an unprecedented scale. Fortunately there is no indication that we are facing imminent climate catastrophes. But planning needs to start yesterday.
Comment by Anonymous Coward — 24 Oct 2010 @ 9:23 AM
Aside for Jim — when you get a persistent commenter like TTTM, Google the name +climate.
You’ll find he’s, how to put it politely, often popping up in ways that get used by others to point out how the climatologists don’t treat him respectfully. Kind of an ambulatory strawman.
Of course anyone can pretend to be anyone else, “TTTM” could just be a handy label used to set up this kind of, er, intervention.
It’s hard to talk about the tactics used to set up bloggers for trouble. MT did a good thoughtful commentary a few years back about how people will set up situations with considerable effort just to blow them up intentionally.
Just sayin’ it’s always worth the effort, when someone starts to get under your skin, to suspect intent and involvement in a group effort.
Only those who can see IP addresses know for sure if “TTTM” or anyone else is even coming from the same location each time the pseudonym is used in various places.
Further aside — on the same JC thread, one of the authors comments:
“…. Last year, April, Daniel Murdiyarso (a climate scientist) and myself (a forest biologist) published an overview of the basic ideas for a less-technical audience “How forests attract rain: an examination of a new hypothesis” in Bioscience. That got some media coverage: you may have seen some of it: e.g. Mongabay, New Scientist and Scientific American (please google). I’m happy to share the PDF if anyone wants it. The point is that many of the wider implications (including monsoons) are considered in a reasonably non-technical manner for those of you who might be interested in that….”
I’d recommend focusing on that comment and those by the other published scientists — and not wasting more effort on the TTTM distraction, which was a real waste of time both in this thread and in the one over at JC’s blog and utterly derailed three different blogs where there was some beginning of good discussion of the water vapor/humidity issues.
How Forests Attract Rain: An Examination of a New Hypothesis
Douglas Sheil1,2, Daniel Murdiyarso2
1 Institute of Tropical Forest Conservation, Mbarara University of Science and Technology, in Kabale, Uganda. E-mail: DouglasSheil@itfc.org or D.Sheil@cgiar.org
2 Center for International Forestry Research in Jakarta, Indonesia.
“A new hypothesis suggests that forest cover plays a much greater role in determining rainfall than previously recognized. It explains how forested regions generate large-scale flows in atmospheric water vapor. Under this hypothesis, high rainfall occurs in continental interiors such as the Amazon and Congo river basins only because of near-continuous forest cover from interior to coast. The underlying mechanism emphasizes the role of evaporation and condensation in generating atmospheric pressure differences, and accounts for several phenomena neglected by existing models. It suggests that even localized forest loss can sometimes flip a wet continent to arid conditions. If it survives scrutiny, this hypothesis will transform how we view forest loss, climate change, hydrology, and environmental services. It offers new lines of investigation in macroecology and landscape ecology, hydrology, forest restoration, and paleoclimates. It also provides a compelling new motivation for forest conservation.”
This reminds me of the way redwoods in N. Ca. are known to harvest water by condensing it from the air.
Consistent with ‘forests attract rain’ — this: http://news.mongabay.com/2010/1013-hance_fragmented_dry.html
“A new study in Biological Conservation has shown that edge forests and forest patches are more vulnerable to burning because they are drier than intact forests. Using eight years of satellite imagery over East Amazonia ….”
Tim the Tool Man,
I think the thing you need to realize is that in science you have to ask very precise questions if you want to get reasonable answers. “Something wrong” is not precise. You can do one of two things with the criticism you have gotten here (which you must admit was mostly constructive and not personal): You can run off and whine about being misunderstood on Climate Fraudit or WTF, or you can try to reformulate your question so that you and we can learn something of value from it. Remember that much of the physics in the models is very tightly constrained. That is why most of us are having trouble visualizing exactly what you are getting at, what sort of “something” you mean. Presumably, you have the same trouble, as you have been unable to refine your comment.
Yes, climate is complicated. Yes, the models are simplifications. However, remember that the goal of the models is to elucidate the factors that are important and so maximize predictive power. They do that quite well.
Comment by John E. Pearson — 24 Oct 2010 @ 11:13 AM
I repeat my question:
I read “atmospheric CO2: principal knob governing earth’s temperature”
In your “experiment” you are zeroing all the non-condensable gases and all the aerosols.
The aerosols have a GH effect, but they also have an effect on the clouds.
So what is the influence of aerosols on the clouds in your model?
[Response: In the simulation described in the paper, only GHGs were zeroed (not aerosols). This configuration didn’t have an a priori calculation of the aerosol indirect effects so nothing would have happened anyway. We could re-run it with the configuration including that, but you would have to think carefully about what you were testing – no aerosol emissions anywhere? no land emissions? what about sea salt? dust? etc. – gavin]
166, Jim in comment: The history of land and ecosystem “management” is littered with the residue of the unforseen consequences of similarly ill-conceived schemes which appeared, at the time, to be simple, “can’t fail” ideas. Ecosystems are exceedingly complex things that defy easy analysis or prediction, and make no mistake, you are talking about massive rearrangements to the earth system with almost entirely unknown consequences, logistical issues, feasibility questions etc.
True enough, but it might be worth some tries, as part of a coordinated effort to prevent the end of civilization as we know it, e.g. to prevent the occurrence of simultaneous crop failures in all agricultural regions that Barton Paul Levenson warns us of. “Massive rearrangements” are the solutions to global warming,
[Response: They are?–Jim]
and the complete consequences are not known to any of them.
Reforestation and afforestation are among the originally listed “stabilization wedges” surveyed in the journal Science and reviewed from time to time. Only the occasional claim that they might be a panacea is objectionable.
[Response: And the idea that you can essentially turn all the world’s land into forest if you just move enough water around…Jim]
Comment by Septic Matthew — 24 Oct 2010 @ 11:53 AM
Regarding TTTM, models, risk to science…
This is still an ongoing argument only because “Tim” has a flawed definition of models, science, and the process of constructive debate… therefore it is simply wasting time as a distraction. As an easy hypocrite, I’ll waste a couple more lines…
Models are a product of science, not the other way around. Flaws found with fundamentals of model cannot “risk” science, only belief in output of model for particular run. Science is never “at risk” from ANYTHING because it’s the nature of science to be defined by sum of knowledge, therefore new knowledge STRENGTHENS science and becomes science, doesn’t cause it to collapse, only redefines parts of it and encompasses the whole…like above references to difference between card house and jigsaw puzzle.
Debate involves examining details of argument, defining key terms, considering validity of opposing viewpoints, eliminating logical fallacies, and UPDATING argument when such improvements have been made…rather than constantly returning to original raw argument even after it has been repeatedly shown to be based on false premises.
@151, I think you’ve identified a catastrophic failure in TTT’s model of climate science.
His ‘house of cards’ model predicts flaky, massively-collapsing knowledge regularly overturned by disturbance. Maybe that’s the impression that certain media sources (including the MSM’s “X was wrong!” headlines) give.
The ‘jigsaw’ model is more accurate, and more robust: it better predicts how whole revolutions in one area (“oh, this yellow is flowers, not a car”) can only improve the picture and don’t necessarily overturn the existing corners, pond and ducks. Even if you literally turn some pieces upside down (“hey, this isn’t a second duck, it’s a reflection of the duck”) the rest isn’t necessarily overturned, just informed (“I guess that means there’s got to be more water tiles…”)
TTT, I humbly suggest that you update your model of how science works.
Septic Matthew:”If someone has a comprehensive and well written exposition of how increased temps are supposed to lead to increased/unchanged/decreased cyclone intensities, I would like to have the link. Nothing I have found on my own to date includes all the factors.”
Try Kerry Emanuel’s page. Depending on your level of understanding, pick the appropriate article. Short story: the Power Dissipation Index for Atlantic hurricanes correlates strongly with Atlantic sea surface temperature. This result has stood up for the last 5 years, and hence is decreasingly likely to be incorrect as time moves on.
I said the albedo change was a one time event. I think you misread the actual written words.
Yes, the effect of the albedo change is to create a higher rate of reflection that will continue to exist and have an ongoing effect. But the rate of reflection will not change over time. In other words, the reflection coefficient will be fixed according to the new albedo.
As carbon dioxide is removed on an on-going basis, the impedance to upward radiation will be continuously decreased, and the rate of IR band energy transmission will continuously increase. In absence of other sources of CO2, the total atmospheric CO2 will continuously reduce as forest growth continues. This is what I meant by the term ‘integrative’ effect.
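The step-versus-integrative distinction being drawn here can be sketched in a few lines of toy code; all numbers are invented for illustration, not taken from any measurement:

```python
# Toy contrast (illustrative numbers only): a one-time albedo change vs.
# the 'integrative' effect of ongoing CO2 uptake by a growing forest.

def reflection_coefficient(year, new_albedo=0.25):
    """After the change, the reflection coefficient is simply fixed:
    a step, not a trend."""
    return new_albedo

def excess_co2(year, initial_ppm=100.0, uptake_frac=0.02):
    """Forest growth removes a fraction of the excess CO2 each year, so
    the remaining excess keeps falling -- a cumulative, ongoing effect."""
    return initial_ppm * (1.0 - uptake_frac) ** year

# The albedo effect is the same in year 1 and year 50 ...
assert reflection_coefficient(1) == reflection_coefficient(50)
# ... while the CO2 drawdown keeps compounding.
assert excess_co2(50) < excess_co2(1) < excess_co2(0)
```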
Re 168 Jim Bullis, Miastrada Co – if a particular method of CCS or otherwise reducing net CO2eq emissions efficiency is too expensive, another (clean energy, infrastructure, efficiency, agricultural practices, diet and lifestyle, peridotite, biochar) will be favored, if in an environment that taxes emissions and pays for sequestration at the same rate. Have you looked into the peridotite solution?
Since this is a free-wheeling blog post, I thought I’d introduce another issue here that deserves attention – tropospheric amplification (or the lack of it). Basic principles, including Clausius-Clapeyron, observed increases in tropospheric humidity, and increases in troposphere height all imply that tropospheric warming in the tropics should exceed the rate at the surface. Earlier observations conflicted severely with this expectation. As biases in the instrumental record have been corrected, observations and theory have grown closer, but it’s still not clear that the conflict has been adequately resolved. Most observers believe the ultimate resolution depends on correcting remaining putative cool biases in the troposphere (many authors) and/or warm biases at the surface (Pielke), but is there some additional phenomenon that also contributes to the disparity?
The alleged absence of a “tropospheric hotspot” (it’s actually a very cold spot with an elevated warming rate, but “hotspot” is the common term used) is a standard argument of contrarians. It involves a nonsensical claim (a la Monckton) and a more legitimate one. The nonsensical version holds that tropospheric amplification is a “fingerprint” of CO2-mediated warming, and its absence proves the absence of AGW. Of course, the amplification is independent of the cause of warming. The more legitimate claim is that current climate sensitivity estimates require a strong water vapor feedback, and the failure of the troposphere to warm in accordance with the expected influence of tropospheric water is evidence that the feedback (and hence climate sensitivity) have been overestimated. (I would note parenthetically that amplification is more a function of the negative lapse-rate feedback due to vapor condensation than to the positive greenhouse gas feedback from increased water vapor, but of course, the two are linked by theory).
My questions here are not intended to challenge either AGW or climate sensitivity values, which are well documented from multiple sources, but rather to inquire about current attempts to resolve the “hotspot” question. First, are there new data bearing on the issue?
Second, although I don’t have the data in front of me, it is generally concluded that short term (e.g., one year) datasets show the expected amplification, but multidecadal analyses do not. This raises an interesting mathematical conundrum. If one serially adds short term amplifications to form a long term sum, they should continue to show amplification. If this has not happened, it suggests discontinuities in the record – e.g., either missing observations or downward jumps at various breakpoints, perhaps implicating changes in instrumentation. What is the evidence bearing on this? Comments would be welcome.
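One way to see the conundrum numerically is a toy spliced series (synthetic numbers, not radiosonde data): every short segment trends upward at the same rate, yet hypothesized downward jumps at instrument changes leave the long-term trend near zero.

```python
# Toy illustration: short-term amplification in every segment, almost no
# long-term trend, because of downward jumps at (hypothetical) breakpoints.

def slope(y):
    """Ordinary least-squares slope of y against its index."""
    n = len(y)
    xm = (n - 1) / 2
    num = sum((i - xm) * v for i, v in enumerate(y))
    den = sum((i - xm) ** 2 for i in range(n))
    return num / den

series, level = [], 0.0
for seg in range(5):                 # five short segments
    series.extend(level + 0.02 * i for i in range(10))
    level += 0.02 * 10 - 0.2         # a jump cancels the segment's warming

seg_slopes = [slope(series[k * 10:(k + 1) * 10]) for k in range(5)]
overall = slope(series)

# Each segment shows the short-term trend ...
assert all(abs(s - 0.02) < 1e-12 for s in seg_slopes)
# ... but the spliced record shows almost none.
assert abs(overall) < 0.005
```

This is only a consistency sketch of the commenter's conjecture, not evidence that such breakpoints exist in the actual record.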
[Response: There is a big confusion here. First off (as you rightly say), the hotspot is expected to occur due to any surface heating simply because there is a small modification to the moist adiabat as a function of temperature. This is well understood physics combining only a statically unstable column with the Clausius-Clapeyron equation. This has little or nothing to do with water vapour feedback however. You would still get the same pattern if water vapour had no impact on the greenhouse properties of the atmosphere. It does however have to do with the ‘lapse rate feedback’ (which is negative – i.e. it damps the increase of surface temperature driven by a specific forcing).
To see this think about a system where the temperature rises are uniform with height, and the whole column adjusts to the forcing at the tropopause. Now think of a system where the temperature changes get larger as you go up. The net IR emission out of the system is a weighted sum of the contributions from different levels, and goes up at each level as a function of temperature anomaly. Compare the two situations where the surface temperature anomaly is the same. Which system has the higher IR emission? It should be clear that it is the one with the ‘hotspot’. Thus the lapse-rate change allows the surface temperature change for a given forcing to be smaller!
Now I don’t think there is a ‘missing hotspot’ (for reasons we have gone over in previous posts), but if it isn’t there, the implication is that it would imply a greater climate sensitivity than the current models show, not less. – gavin]
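Gavin's weighted-sum argument can be made concrete with a toy linearized column; the weights and anomaly profiles below are invented for illustration and are not model output:

```python
# Toy column: outgoing IR anomaly as a weighted sum of the temperature
# anomalies at different levels (weights chosen arbitrarily).
weights = [0.3, 0.25, 0.2, 0.15, 0.1]   # surface ... upper troposphere

uniform = [1.0, 1.0, 1.0, 1.0, 1.0]     # same warming at every level
hotspot = [1.0, 1.2, 1.5, 1.8, 2.0]     # amplified warming aloft,
                                        # same surface anomaly

def olr_anomaly(dT):
    return sum(w * t for w, t in zip(weights, dT))

assert hotspot[0] == uniform[0]                    # identical at the surface
assert olr_anomaly(hotspot) > olr_anomaly(uniform)
# The 'hotspot' column sheds more IR for the same surface anomaly, so a
# smaller surface warming restores balance: a negative lapse-rate feedback.
```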
I question the relevance of the 600 trees per acre figure put forward as a reference. Basically, a 600-tree-per-acre situation would be a scrub second-growth forest. Old-growth forests probably are already around 50 to 100 trees per acre.
A well managed forest project could start with 100 or more trees per acre, and as growth continued, some could be harvested for wood (for permanent wood products). Full growth could well involve 50 very large trees per acre.
Restoring a forest would go slowly if one depended on nature to thin the scrub tree woods to allow for larger trees to grow.
I never know how to deal with people with “big” ideas. Big, impractical, impossible ideas, with holes you can sail an armada through.
Do you smile, take their money and run? Do you point out the gaping holes? Do you waste time trying to bring the “vision” down to earth, and make it workable?
In the past, I have mixed all these approaches, but impossible visions usually remain impossible. It usually depends on the visionary. Can they see the holes in their plan?
Take Jim Bullis’s ideas. If I had an idea for an impossibly efficient car (and who hasn’t?) then it would remain strictly fantasy. I know what is needed to turn such an idea into reality (a lot of money, a lot of time, and a lot of people). I don’t think I would ever have the hubris to say “10 times more miles on a gallon” based on nothing but “aerodynamics”. Before I said something like that, I would want to go back to university, study engineering for a few years, then run some models. Even then, I would want expert advice, too.
How do you deflate these ideas without killing that invaluable inventive mind?
Jim, sometimes big dreams can work. But they have to be practical. SpaceShipTwo will in the end cost in excess of $300 million. Solar Impulse has cost an estimated $88 million. It’s a long, difficult road, and nobody is going to buy or license something which is only an idea.
Special Theme: Climate Change and Ecological Restoration
Rethinking Conservation Practice in Light of Climate Change
“Predicted changes in climate present unusual challenges to conservation planners, land managers, and restoration efforts directed toward preserving biodiversity. Successful organisms will respond to these changes by persisting in suitable microsites, adapting to novel conditions, or dispersing to new sites. We describe three general categories of strategies for restoring and managing natural systems in light of likely changes in future climate ….”
I read the article about JC in Scientific American. She began her dialog with the skeptics because their blogs had scathing attacks on her paper attributing an increase in powerful tropical cyclones to global warming. The article does not say whether she modified her conclusions or whether she convinced the skeptics that her conclusions were correct. If neither of these happened, she not only wasted her time, but also became a shill.
Curry also asserts that “scientists don’t even know the amount of warming a doubling of CO2 alone would cause.” Say what? Does she really believe that? I sure hate to think we will hear that comment echoed for the next decade.
Such scoffing! As if you had read the material at the website linked to my name. It does not appear that you are inclined to read complete sets of words.
True, it is not well organized, but if you take the time to leaf through the various pages, and actually read some of the stuff, maybe you will see the reason for the headline claims.
Making a vision workable does certainly take time.
And you say everyone has had an idea for an impossibly efficient car? Most who learned the laws of thermodynamics have been saved from this exercise. And these folks would not assert that they, or most people, had such an idea. But do tell us more about these ideas.
As to the forestation idea, the hostility to such an elementary concept is quite amazing. I was hoping someone would have more to say about what China is actually doing in this regard, especially in light of the actual stated plan to do something like this.
The factor of ten in efficiency for the car I am planning comes from the product of 2 and 5. The 2 comes from making the car half as wide using tandem seating, and the 5 comes from adapting the well-known airship form that has a drag coefficient that is about a fifth that of most cars.
The main impediment to making a car like this is the resistance of the market, represented by people who would rather live in global warmth than to change the way they ride in cars. These folks have a propensity to construct arguments about why global warming is a false threat. These folks are similar to those who construct arguments about why various solutions are impossible.
Perhaps I should spell out the relationship of market resistance to an idea to availability of money to implement that idea.
I think you are right about how less expensive things have to emerge. Thus, I was led to look at new forests as such a ‘less expensive thing’.
Efficiency clearly has potential, but not nearly enough at affordable costs, hence it does not get into the ‘favored’ category on a large enough scale. Biochar has real potential as an ancillary activity related to the forestation project.
I will look again at peridotite, but of the chemical possibilities, the EPA study at:
did not put that into a lead position as a CCS answer, even though there is some mention of similar things.
Apparently you do not accept my point that an environment, where CCS is paid for by any mechanism at all, is an environment where our industrial economy will fail. That is exactly what I am trying to avoid.
As to efficiency, there are some real limits to what can be done affordably. I have frequently mentioned household shifting from electricity to natural gas for heat using appliances. These include clothes dryers, cooking appliances, and even refrigerators and air conditioners (vapor diffusion). This has gained little traction, partly because there is no profit for the utility in this, and partly because our regulators do not understand the magnitude of heat loss in central power plants. Hence vapor diffusion cooling devices were excluded as inefficient devices, based on their local coefficient of performance only.
But that is another idea; not really all that big.
Cogeneration is a big efficiency answer though, maybe of silver bullet rank, where heat engines in motor vehicles are adapted to deliver discharged heat for household use on a distributed basis.
My efficiency silver bullets are cars, trucks, hybrid wheels for trucks, and distributed cogeneration as mentioned. Add to this a massive new forestation project, and that might be enough silver bullets to get the job done.
Northern California redwoods take on a lot of moisture through their needles, as you say. However, the redwoods of Sequoia National Park seem to get along without doing that.
Thanks for all the forestation references, which are loaded with background information. However, I am not talking about reforestation, though that is only because it is insufficient.
The amount of world forestation that has been eliminated is enormous, and I would be delighted to have the land to work with, but the art of the possible tells me to look for affordable land.
I honor the idea that the natural state of things is highly desirable, but also note realistically that we lost that condition long ago. And the possibility of going back to the natural state has also been lost.
I got tripped up on peridotite. You must be referring to the deep rocks that some said could be used to react with CO2.
The EPA cost analysis only covered what it would take to capture CO2 from the stack. So whether the CO2 simply stayed put in huge underground caverns or reacted with natural rock, it would seem to not matter.
Does LaTeX just work now? Is there an internet flavor being used so that one doesn’t have to wrap mathematical formulae with $ or other standard TeX equation-offsetting commands? Anyway, it is awesome that TeX works on RC now!
Comment by John E. Pearson — 24 Oct 2010 @ 6:56 PM
To Gavin re my previous comment (208). I appreciate your response but it left my two main questions unanswered. Like you, I correlated tropospheric amplification with the lapse rate feedback, but I also mentioned that water vapor feedback and lapse rate feedback are linked via the increase in absolute humidity with preservation of relative humidity or close to it, so that the two phenomena are not independent. Is there a mechanism that would delink them and allow the water vapor effect to operate without a change in lapse rates toward the moist adiabat?
[Response: I think so. The magnitude of the LR feedback depends on the fact that the rate of change of sat. vapour pressure with temperature is itself a function of temperature. If CC was linear instead of exponential, you’d still have a WV feedback but no LR feedback. ]
My two earlier questions, in essence, were as follows. First, are there new data either reconciling observations more conclusively with theory, or alternatively identifying additional tropospheric mechanisms that would reduce the amplification? Second, and more specifically, how can short term amplification that has been reported be reconciled mathematically with a long term failure to see the same amplification. In particular, are there discontinuities or jumps in the record that could explain this?
The main reason I bring this up is that I believe the way we answer some of these questions should ideally satisfy open-minded readers who are told that the lack of the “hotspot” invalidates current climate sensitivity estimates based on water vapor feedback. I don’t yet believe that these readers will be satisfied with what has been said so far, even if the evidence for substantial climate sensitivity is otherwise compelling on the basis of multiple lines of evidence. I can guess that the issues will ultimately be resolved through further corrections to instrumental biases, but that remains a guess, and I’m not sure we can confidently conclude it’s more than that.
[Response: I can’t tell you how this will be resolved over the next few years. There were clear inhomogeneities in the satellite records in the early 1990s which doesn’t help, and radiosonde coverage remains poor in the tropics. I find it very difficult to think of any reason why the moist adiabat works fine for every timescale except decadal. AR5 might show some interesting sensitivities to ozone changes which are being widely incorporated into this generation of models, but I think that clarity will just come with longer records. – gavin]
This is particularly relevant to a thread in which both the AMO and modeling have been themes. This simple, zero-dimensional, zero-reservoir (or box) model manages to explain the 13 decades of GISTEMP quite well. Some have called this model 8th-grade or freshman. I wish it were so. Anyone introducing aspects of climatology to students in middle school or later is welcome to make use of this study.
The revisions include fixing some minor typos, adding a reference (Tol & de Vos, 1998) suggested by James Annan and a mention of a slight variation which assumes a pre-existing linear trend, as in the cited paper. In addition, there are updated and additional links to other simple models, aspects of the AMO, etc.
Comment by David B. Benson — 24 Oct 2010 @ 7:08 PM
where the prior decade’s average lnCO2 provides the forcing, with the assumption that the first decade, the 1880s, is unforced by CO2. The final term is an adjustment for the way GISTEMP anomalies are reported, and there is a constant F that gives the temperature change due to the forcing by lnCO2. This constant is traditionally reported for a doubling of CO2 concentration, 2xCO2:
OGTR for 2xCO2 = F*ln(2).
OGTR stands for Observed GISTEMP Response and is estimated below to be 2.280 K for 2xCO2. Using just this, we have the following table, in which the residuals are the differences between GTA and the AE estimate. The standard deviation of the residuals is sd = 0.052 K, and the coefficient of determination, R^2 = 0.953, shows that almost all of the variance in the 13 decades of GISTEMP’s GTA is explained.
The diffs are the differences lnCO2(d)-lnCO2(d-1). If these were all the same the additional forcing would be the same for every decade. This is approximately so before the 1940s. Notice the dramatic changes from the 1940s on.
All in all, quite decent for a simplified model based on the physics of the atmosphere plus the shallow ocean, but there is certainly some other effect(s) causing the temperatures to swing (wobble) more widely than can be explained by lnCO2 alone; there is the deep ocean. On the centennial scale of the instrumental record so far, the deep ocean is approximately just a heat sink, but one with a rate which varies on a multidecadal scale. Fortunately, there is a proxy for this internal variability, the Atlantic Multidecadal Oscillation (AMO): http://www.aoml.noaa.gov/phod/faq/amo_fig.php http://www.aoml.noaa.gov/phod/amo_faq.php http://www.aoml.noaa.gov/phod/faq_fig2.php
Although linearly detrended over 150 years, the AMO for the 13 decades of interest has an average of -0.014, which is removed for this decadal study. After that operation, as well as decadal averaging, we have AMO(d) for each decade d. Our formula to account for internal variability, in addition to lnCO2, is
AEP(d) = AE(d) + A*AMO(d)
where A and F are jointly estimated by minimizing the standard deviation of the residuals. Incidentally, the AMO will also include some effects of lnCO2 (as this forcing is not linear in time) and also nonlinear portions of other forcings which affect the North Atlantic. Still, this works well to refine the estimate and better approximate GTA.
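As a sketch of the fitting procedure described above, the toy code below jointly estimates F and A by minimizing the residual standard deviation. It uses synthetic stand-ins for the lnCO2 forcing and the AMO index, not the real GISTEMP or AMO series, so the recovered numbers only demonstrate the method:

```python
import math
from itertools import product

# Synthetic decadal data standing in for lnCO2 forcing and the AMO index.
decades = range(13)
ln_co2 = [math.log(290 + 6 * d) - math.log(290) for d in decades]  # vs 1880s
amo    = [0.2 * math.sin(0.9 * d) for d in decades]                # stand-in

F_true, A_true = 3.3, 0.5   # F*ln(2) ~ 2.29 K per doubling, as in the post
gta = [F_true * x + A_true * a for x, a in zip(ln_co2, amo)]        # "GTA"

def resid_sd(F, A):
    """Standard deviation of the residuals GTA - (F*lnCO2 + A*AMO)."""
    r = [g - (F * x + A * a) for g, x, a in zip(gta, ln_co2, amo)]
    m = sum(r) / len(r)
    return (sum((v - m) ** 2 for v in r) / len(r)) ** 0.5

# Joint estimation by brute-force grid search over (F, A).
F_hat, A_hat = min(product([i / 100 for i in range(250, 400)],
                           [i / 100 for i in range(0, 100)]),
                   key=lambda p: resid_sd(*p))
```

With noise-free synthetic data the search recovers F and A exactly, and the implied OGTR for 2xCO2 is F_hat*ln(2).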
 Tol, R.S.J. and A.F. de Vos (1998), ‘A Bayesian Statistical Analysis of the Enhanced Greenhouse Effect’, Climatic Change, 38, 87-112.
Comment by David B. Benson — 24 Oct 2010 @ 7:16 PM
“The factor of ten in efficiency for the car I am planning comes from the product of 2 and 5. The 2 comes from making the car half as wide using tandem seating, and the 5 comes from adapting the well-known airship form that has a drag coefficient that is about a fifth that of most cars.”
See, Jim: this is why you don’t get taken seriously. I don’t want to derail RC onto a completely unrelated subject, but this is the same problem you have with trees. You simply don’t know how much you don’t know.
I don’t know much about aerodynamics, but at least I know that I don’t know much about it. On the other hand, I know that you haven’t considered the very unstreamlined wheels of your car. I know that you haven’t considered that drag is only one factor that affects fuel efficiency. I know that you haven’t considered other more convenient aerodynamic shapes. I know that you haven’t considered that drag is proportional to speed squared, so is far less significant at lower (urban) speeds.
I also know that your response to constructive criticism is to lash out and deny that there is a problem, so I imagine you will never get any further with your ideas.
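The speed dependence mentioned above can be made concrete with rough, purely illustrative numbers (none of them describe any particular vehicle): drag force scales with v squared, so drag power scales with v cubed, while rolling-resistance power grows only linearly with speed.

```python
# Rough, purely illustrative parameters (not a design claim):
RHO = 1.2                 # air density, kg/m^3
CD_A = 0.7                # drag coefficient * frontal area, m^2
CRR, MASS, G = 0.01, 1500.0, 9.81   # rolling coeff., mass (kg), gravity

def drag_power(v):        # aerodynamic drag: force ~ v^2, power ~ v^3
    return 0.5 * RHO * CD_A * v ** 3

def rolling_power(v):     # rolling resistance power grows only linearly
    return CRR * MASS * G * v

def aero_fraction(v):
    return drag_power(v) / (drag_power(v) + rolling_power(v))

urban, highway = 13.9, 30.0   # m/s: roughly 50 km/h and 108 km/h
assert aero_fraction(urban) < aero_fraction(highway)
# With these numbers drag is only ~1/3 of the resistive load in town but
# ~70% at highway speed -- which is the point about where a low drag
# coefficient actually pays off.
```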
The supposed missing hotspot would affect the water-vapour feedback to the extent that the upper atmosphere would have a lower saturation vapour pressure and thus could ‘hold’ less water than if the changes were moist-adiabatic. Do you know if there is a simple calculation (or if anyone has done this with a radiation model) for which effect is larger?
The calculation I would envision would be to take a Radiative-convective equilibrium profile and then increase the CO2 and change the temperature profile until the TOA fluxes are balanced again. The temperature change would be constrained by fixed relative humidity, and in one case to be moist adiabatic, and in another case to have the delta T be the same throughout the profile (with some allowances for what happens at the tropopause).
I had at one point convinced myself, using simple emission level arguments, that the water-vapour feedback wins out, but I can’t for the life of me remember what I did.
Of course the whole thing is a bit artificial, as there is no reason to suspect the upper-tropospheric relative humidity must remain constant, especially if the troposphere is not staying on a moist adiabat.
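The saturation-vapour-pressure point above can at least be checked on the back of an envelope with the Clausius-Clapeyron relation; the constants are standard approximations and the temperatures are chosen only for illustration:

```python
import math

# Clausius-Clapeyron sketch: e_s(T) = e0 * exp(-(L/Rv) * (1/T - 1/T0)).
L_OVER_RV = 5423.0        # latent heat / vapour gas constant, in K
E0, T0 = 611.0, 273.15    # ~611 Pa at 0 C

def e_sat(T):
    """Saturation vapour pressure (Pa) at temperature T (K)."""
    return E0 * math.exp(-L_OVER_RV * (1.0 / T - 1.0 / T0))

T_upper = 230.0                    # an upper-tropospheric temperature
dT_moist, dT_uniform = 2.0, 1.0    # amplified vs. surface-only warming

# At fixed relative humidity, the moist-adiabatic (hotspot) case holds
# more vapour aloft than the no-hotspot case:
assert e_sat(T_upper + dT_moist) > e_sat(T_upper + dT_uniform) > e_sat(T_upper)

# And the *fractional* increase per kelvin is larger where it is colder,
# which is why Clausius-Clapeyron being exponential (not linear) matters:
frac_cold = e_sat(231.0) / e_sat(230.0)   # roughly 10% per K
frac_warm = e_sat(289.0) / e_sat(288.0)   # roughly 7% per K
assert frac_cold > frac_warm
```

This only bounds the humidity side of the question; deciding whether the water-vapour or lapse-rate effect wins still needs the radiative calculation described above.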
201, comment by Jim: [Response: And the idea that you can essentially turn all the world’s land into forest if you just move enough water around…Jim ]
“essentially all the world’s land” is a straw man. No one has even claimed that “all” of Eritrea could be afforested with salt-tolerant mangroves, only enough to make an important improvement.
Do you know of any solutions to global warming that do not entail “massive rearrangements”? The proposal to eliminate that 45% of US electricity that is generated from coal, and replace it with some other electricity, is a “massive rearrangement”, wouldn’t you agree?
adding to Didactylos’s 227, I had thought that rolling resistance was half of the total resistance at 55 mph; this can be reduced by reducing weight. Electric cars could have a motor for each wheel, which could dispense with some of the resistance in the transmission, though I’m not sure how much of a difference that would actually make.
Of course, EVs, HEVs, and PHEVs don’t need to run the engine while idling, which is really a whole other aspect…
If you want to submit your car to the market, fine with me. It would require a lifestyle change for people, though, just as some other ideas would. I like the option of riding in a car 2 by 2. I’d be willing to sacrifice that to save the planet, of course, but there are other things I could do instead, which I might prefer when it comes to my own lifestyle (every little bit helps, and it’s good to have options, and we may need to use many, but you don’t need ‘every’ little bit, just enough bits). I’m not hostile to your idea, but while your car may gain some market share, I wouldn’t expect it to dominate anytime soon. I think you could change the shape of the car and still get some savings in aerodynamic drag without having to limit the width to fit one seat per row.
How much of the heat generated by your car’s engine can be utilized?
If your car’s efficiency can be improved so much, the same improvements would apply to a PHEV or PEV as well.
Of course, improvements to any energy conversion in use are to be welcomed.
Wind, hydroelectric – not much of a waste heat issue with centralized or distant power. CSP, CPV, PV power plants – some heat might be wasted but you have to consider the trade-offs of different locations. CSP could be used for heat for some industrial processes and also with community heat systems. How efficient are community heating systems? There are also ways to reduce heating needs. Household furnaces might be made into cogeneration plants using TPV; there is also direct solar heating (passive solar – just need to make sure the windows and skylights are well-insulated) A lot of energy can come through a window. Rooftop PV can be made into cogeneration by heating (or preheating) water. Solar water heating is a great idea. Note use of cogeneration plants – in cars or otherwise – may have reduced usefulness in warmer climates, depending on the economics and volume of solar water heating and its possible pairing with PV.
Some efficiency improvements really don’t add much net cost. Maybe see work by Stern.
We don’t need to dot all the i’s and cross all the t’s – if we have good policy, the market will do a lot of the sorting out (I think it would be funny if not so sad that so much of the Laissez-faire crowd is so dead set against the use of the market to solve a problem!)
The most efficient use of a mass of peridotite for CO2 sequestration may involve concentrated CO2, but I have wondered whether crushing the rock and dispersing it could economically work to take CO2 from the air (thus divorcing it from CO2-producing activities). This could also somewhat double as an albedo contribution, if the mineral dust is allowed to go windborne over the oceans, and then gets into the oceans, buffering the pH while taking up CO2. Some time ago another commenter at RC posted some energy-use info about rock-crushing which looked promising. Presumably this could also be done, with some effect, with more common rocks, so long as there are Ca silicates involved – or even some other cations (?). Even in carbonates, that wouldn’t allow carbonate rocks to form, but it would help buffer the pH and help the oceans take up more CO2 without needing to dissolve the corals and seafloor sediment, etc.
Now going off on a much less realistic solution: candy-wrappers and potato chip bags (commonly a high-albedo inside surface, at least in the visible portion of the spectrum) – how much area would they cover (and importantly, what is their LW emissivity, because a low LW emissivity could be counterproductive).
Also in your comment, you clearly think urban speeds are a big deal.
They are not. Most miles are not urban miles. If we are trying to reduce CO2, urban miles are not all that significant as a part of the picture.
I am happy to let someone else fix the urban efficiency problem. That will be a small thing which will make people feel good, but will accomplish little.
Also, the picture is never all that simple, but if one wants to say anything concisely, one has to ignore a lot of detail. The question is, is the big issue addressed.
There’s a bit of a difference with re-afforestation, new ‘forests’, using water and transporting water. My own initial gradualist approach is to the suburbs of major cities. Most of these areas are covering the best agricultural land – the reason the place was originally established as a good place to live. Therefore a prime area for tree planting as well as growing food.
These cities all have “stormwater” systems. Why water that falls on roofs and other domestic hard surfaces should be treated as a waste disposal problem rather than a best use problem is an eternal mystery to me. (I admit I live in the driest state in the driest inhabited continent, but hey ho.) Redirecting water towards the growing area of one or several trees reduces the “waste management” aspect of rainfall and grows a carbon sequestration unit which might also produce food.
I don’t know how many suburban blocks there are in USA, but putting 1 or 2 or 3 additional trees within the blocks or on the street verges would use “waste” water and achieve a result. Someone could do some numbers to work out the optimum number if 30 or 40 or 100 million trees is the target. But this approach is nowhere near what’s required even though it would be helpful.
Reducing the use of geological carbon resources is the best way. And USA has more scope for making a big impact by efficiency improvements alone than practically any other country.
– I think we can observe H2O vapor mixing ratios (a measure of specific humidity) independently of observing the ‘hot spot’. Including measurements from satellites.
How might H2O vapor concentration increase without a hot spot? (PS I don’t really have a conclusion to answer your question, just a bunch of interesting information to consider:)
1. First, consider an atmosphere with no precipitation from clouds; the air moves around, occasionally coming near a wet part of the surface and equilibrating its water vapor pressure with something proportional to the equilibrium vapor pressure at that temperature, with some adjustment required by molecular diffusion and eddy mixing and the temperature gradient (that’s actually something I’d have to review again before I could get any more detailed). The total water content of the air would then tend to change with the equilibrium vapor pressure at the surface, with air aloft (above the cloud base) having some portion of that water in condensed form. In this alternate reality, the moist adiabatic lapse rate could be ‘the’ adiabatic lapse rate for the majority of the atmosphere (pseudoisentropic surfaces would become truly isentropic surfaces).
2. Where does dry air come from? Consider a ‘conveyor belt’ model of overturning where air rises with H2O condensing and some precipitating out. It then reaches some height, moves laterally, and sinks (PS some more complex motions might be folded into such a depiction – for example, upper level air could gain cirrus clouds from deep convection, then move around, sink a little and clear up, then rise a little with some condensation/deposition again, sink, as it moves through Rossby and gravity waves, or have clouds appear and disappear moving across lines of latitude with radiative heating and cooling). The remaining cloud water evaporates with adiabatic warming; as some was removed by precipitation, the cloud base in the sinking region is higher and the H2O vapor concentrations below that are smaller. If the efficiency of precipitation were increased or reduced (PS this is an aspect of a (so far as I know) speculative negative feedback proposed by Lindzen – of course you know what tends to happen with that category of phenomena, but it’s interesting to consider the physics involved anyway.), then there would be less or more water vapor in the regions of sinking air. What if the cloud base in the ascending region were raised or lowered? Of course, that would (other things being equal) require a decrease or increase in relative humidity in the surface air underneath. But if the height of the cloud base were raised, that would reduce the ‘hot spot’ by having a greater thickness of air tend toward a dry rather than moist adiabat. Would the cloud base change if relative humidity underneath were fixed while the surface temperature was increased?
(PS precipitation would actually slightly (and I mean slightly – you can ignore the liquid water heat capacity without much error) decrease the moist adiabatic lapse rate below the freezing level by removing some mass of liquid water, thus reducing the specific heat of the air parcel. However, then there is less latent heat available from freezing of that water (directly or via deposition of vapor onto ice crystals) – though that is roughly an order of magnitude smaller than the latent heat of condensation, per unit mass of water.)
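For concreteness, the pseudoadiabatic lapse rate discussed above (condensate heat capacity neglected, as noted) can be sketched numerically. The saturation vapor pressure fit and the round-number constants below are illustrative textbook values, not taken from any particular source in this thread:

```python
import math

def sat_vapor_pressure(T):
    """Saturation vapor pressure over water (Pa), Tetens-style fit.
    T in kelvins."""
    Tc = T - 273.15
    return 610.94 * math.exp(17.625 * Tc / (Tc + 243.04))

def moist_adiabatic_lapse_rate(T, p):
    """Approximate pseudoadiabatic lapse rate (K/m) at temperature T (K)
    and pressure p (Pa), ignoring the heat capacity of condensate."""
    g = 9.81      # m/s^2
    cp = 1004.0   # J/(kg K), dry air
    Rd = 287.0    # J/(kg K)
    Lv = 2.5e6    # J/kg, latent heat of condensation
    eps = 0.622
    e_s = sat_vapor_pressure(T)
    r_s = eps * e_s / (p - e_s)  # saturation mixing ratio (kg/kg)
    num = 1.0 + Lv * r_s / (Rd * T)
    den = 1.0 + eps * Lv**2 * r_s / (cp * Rd * T**2)
    return (g / cp) * num / den

# Warm, moist surface air: far below the ~9.8 K/km dry adiabat
print(moist_adiabatic_lapse_rate(300.0, 100000.0) * 1000)  # K/km
# Cold air holds little vapor, so the moist adiabat approaches the dry one
print(moist_adiabatic_lapse_rate(230.0, 30000.0) * 1000)   # K/km
```

This also shows why the freezing-level distinction above matters: where r_s is tiny, the latent-heat terms barely alter g/cp.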
But the way water vapor is determined is more complex than that because there is mixing (cumulus clouds commonly entrain some drier air, reducing the total water per unit air in the cloud), … (clouds can precipitate into layers of air beneath with subsequent evaporation).
After evaporation is completed, dry sinking would follow a dry adiabat, except for radiational adjustment of temperature. Sinking is often much slower and over a much larger area, with radiative cooling being able to significantly affect temperature and even balance adiabatic warming (as in a steady-state Hadley cell) while moist updrafts are faster and concentrated and with radiation having essentially no role in changing the temperature within a cumulus cloud. But lapse rates larger than the moist adiabatic lapse rate can develop and remain until a trigger gets the moist convection going (PS although in the tropics I’ve gotten the impression that gravity waves tend to communicate the effects of moist convection so that larger regions adhere to a moist adiabat at the same vertical levels – which would make sense given a weak coriolis effect that would allow horizontal temperature variations to persist in the face of the tendency for thermally-direct overturning). This introduces some wiggle room.
Meanwhile, an increase in the greenhouse effect (note, regional drying or reduced cloud cover, especially low cloud cover, would have the opposite effect) would tend to decrease the diurnal temperature range over land; since convective heating of the bulk of the troposphere tends to come somewhat preferentially from the warmer (and more humid, of course) times and places, reducing horizontal and temporal surface temperature variations might be expected to increase the average lapse rate near the surface (because the average lapse rate includes contributions from times and places with stable air, and such an evening-out of temperature could tend to bring the air in general closer to having neutral stability). In particular, radiative forcing or feedback, either LW or SW, at the surface, can warm the surface without an immediate convective adjustment through the bulk of the troposphere at places and times with stable air masses at the surface – such as at higher latitudes, particularly in winter (now remember the sea ice albedo feedback).
Horizontally-large scale overturning of the troposphere is driven, and in part, organized, by differential heating both horizontally and vertically (it would become more sluggish without the heating below and cooling above, but can continue with horizontally differential heating alone, and can sustain a lapse rate that is stable to localized convection of all types). Reducing the horizontal differential heating would tend to slow this type of circulation, which would tend to allow a larger lapse rate to develop and favor more localized overturning.
Meanwhile, if the sinking dry air spends a lot of time over dry land and a little time over a wet surface before rising to various heights with moist convection with precipitation, then the average humidity would be less, relative to the case where the sinking air gets humidified soon and waits a longer time before rising with moist convection and precipitation.
Wind helps speed evaporation (eddy diffusion, or in the case of horizontal variations, dry air advection). On the other hand, still air near a wet surface could over time get closer to 100 % RH (but in a thinner layer/smaller region! However, if the supply of dry air to the boundary layer were slowed…). Meanwhile, molecular diffusion should occur faster at higher temperatures (right?), so RH should tend to increase with increasing temperatures, other things being equal, … but I’m guessing that would be a very small effect.
BPL: thank you for the very clear demonstration that you can get a fairly good determination of the influence of CO2 on temperature with a simple spreadsheet and basic statistical tests. But I have a question: if the sensitivity is already fairly well determined by your simple statistical tests, at a ridiculously small cost in computer and human time, what do you expect from the much more detailed (and enormously more expensive) heavy computer simulations? What uncertainty could be much further reduced with them, since there is already hardly any, FAPP?
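The spreadsheet exercise BPL is credited with can be reproduced in miniature. The sketch below is purely illustrative – a synthetic CO2 series and a synthetic temperature series built with an assumed 3 K/doubling response plus noise, not real data – but it shows the method: regress temperature on ln(CO2) and convert the slope to a per-doubling sensitivity:

```python
import math
import random

def ols_slope(x, y):
    """Ordinary least squares fit y = a + b*x; returns (a, b)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

random.seed(0)
co2 = [280 * 1.002 ** t for t in range(150)]  # hypothetical rising series
true_sens = 3.0                               # assumed K per doubling
temps = [true_sens * math.log(c / 280) / math.log(2) + random.gauss(0, 0.1)
         for c in co2]

a, b = ols_slope([math.log(c) for c in co2], temps)
sensitivity = b * math.log(2)  # slope per ln-unit -> K per doubling
print(round(sensitivity, 2))   # recovers roughly the assumed 3 K/doubling
```

The point of the exercise is that a logarithmic regressor plus basic least squares already pins down the slope fairly well when the signal dominates the noise.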
“in experiments where atmospheric constituents (including water vapor, clouds, CO2, O3, N2O, CH4, CFCs, and aerosols) were added to or subtracted from an equilibrium atmosphere”
I am not a good reader…
[Response: This refers to the earlier set of experiments (also discussed in Schmidt et al, 2010) where each constituent was zeroed out in turn while everything else was held constant. Different kind of thing entirely. – gavin]
But what is your opinion concerning the sharp rise of cloud cover at the beginning of the experiment in fig 2?
[Response: Big expansion of low cloud cover. – gavin]
It’ll be interesting to see if they’re able to pull off the aquaculture, hydroponics and algae production as planned. The only inputs are said to be a) sunlight, b) landfill gas (mostly methane), and c) some fish food.
Jim Bullis: if you want to contribute something these days, it usually has to be in the detail. Yes, there are completely novel ideas out there. But lower drag coefficients, tandem cars and afforestation are not among them.
This doesn’t mean that there isn’t a huge amount you can do (in the detail, and in the practicalities of turning vision into reality).
But detail means you can’t be a “big picture” guy. You can’t just walk in and make sweeping statements that are almost right.
For example, you say: “Most miles are not urban miles.” That’s not true. In the US, 60% of driven miles are urban vs. rural [Federal Highway Administration].
Please note that the people commenting on your approach are not deniers: we are all in favour of major action to combat global warming. Also, you’re not wrong – afforestation is generally a good thing, and aerodynamics really is an area which can have a big impact on fuel efficiency. Have a look at some solar vehicles – they use a superstreamlined form, and the wheels are barely protruding to maximise the advantage. They are long and thin. They genuinely have a tiny drag coefficient, and it helps a lot.
What you get from more precise models is, of course, more precision. In particular, you narrow the confidence interval on CO2 sensitivity. Models have been instrumental in narrowing the 90% CL for sensitivity to 2-4.5 Kelvins/doubling. This is extremely important on the high side, as the adverse consequences of warming increase dramatically (exponentially, perhaps?) with sensitivity.
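Given the logarithmic forcing behind that per-doubling range, the sensitivity numbers translate directly into equilibrium warming at a given concentration. A minimal sketch (the 280 ppm pre-industrial baseline is the usual assumption):

```python
import math

def equilibrium_warming(sensitivity, c_ppm, c0_ppm=280.0):
    """Equilibrium warming (K) for CO2 at c_ppm, given a climate
    sensitivity in K per doubling and logarithmic forcing."""
    return sensitivity * math.log(c_ppm / c0_ppm, 2)

# The quoted 90% range, 2-4.5 K/doubling, evaluated at a doubling (560 ppm):
for s in (2.0, 3.0, 4.5):
    print(s, equilibrium_warming(s, 560.0))
```

At exactly one doubling the warming equals the sensitivity by construction; the spread between the low and high ends of the range is what tighter models would narrow.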
I was fortunate to attend Sam Harris’ talk yesterday at Caltech. The subject was his new book, “The Moral Landscape: How Science Can Determine Values.” It’s sad when I see scientists talking about how bad things can get re global warming: they get ridiculed and threatened. I think if we REALLY know how bad things can/perhaps will get, we should be talking about it to the public. And many are, thank goodness. But then they get referred to as crazy “alarmists”.
I guess my point is I want to thank all of these scientists for attempting to get the information out there. I am a layman, and while I am more educated than most laymen on this subject, I am far from where many of you guys (and gals) are.
Anyway, thank you all for having the passion that you do!
There’s a bit of a difference with re-afforestation, new ‘forests’, using water and transporting water. My own initial gradualist approach is to the suburbs of major cities. et seq.
I think that the solution to global warming is some of everything that is useful, gradually implemented everywhere, over decades. Most of what is useful to solving global warming is valuable in other respects, and should be pursued even if global warming isn’t occurring. Harvesting more wastewater is one of the practices that should be adopted more widely, and used to irrigate varieties of species. This is increasingly common throughout California, to pick one place, and waste water is used to irrigate golf courses, palm groves, and avocado groves.
Comment by Septic Matthew — 25 Oct 2010 @ 10:58 AM
235, Hank Roberts: SM, forget global warming, focus on ocean pH change — it’s faster and simpler.
I disagree for two reasons: (1) global warming has been prominently discussed, and many methods for reducing CO2 have already been promoted and started to reduce it; (2) pH reduction from 8.1 to 7.7 or 7.3 isn’t (pace Jane Lubchenko) going to dissolve chalk, much less animal skeletons.
Rhetorically, the continual changing of the alarm from “cooling” to “overpopulation” to “resource constraints” to “unsustainability” to “warming” to “change” to “disruption” to “biodiversity” to “ocean pH” contributes to the perception that scientists are in a continual state of alarm about everything, or else simply misanthropic. Everything known to reduce anthropogenic CO2 is good for other reasons (as far as I am aware), and can be pursued to achieve multiple goals.
Comment by Septic Matthew — 25 Oct 2010 @ 11:17 AM
PS for Mike G — not arguing against edge effects at all; those have many strong effects. Just noting there’s some possibility that the authors have a reasonable hypothesis to add.
Here’s an earlier paper along the same line. It’s not a strong effect, but may emerge from statistics: http://www.esajournals.org/doi/abs/10.1890/04-1675?journalCode=ecap
Webb, Thomas J., F. Ian Woodward, Lee Hannah, and Kevin J. Gaston. 2005. Forest cover–rainfall relationships in a biodiversity hotspot: the Atlantic Forest of Brazil. Ecological Applications 15:1968–1983. [doi:10.1890/04-1675]
“… significant positive relationships between tree cover and the number of rain days consistently emerge. The degree of forest fragmentation seems to influence this relationship, with patchier forests associated with fewer rain days; and tree cover also predicts interannual variability in rainfall.”
Septic Matthew wrote: “I think that the solution to global warming is some of everything that is useful, gradually implemented everywhere, over decades.”
That’s the “Too Little, Too Late” strategy.
We don’t have “decades” to try implementing “some of everything” “gradually”.
The anthropogenic excess of GHGs is already self-evidently at dangerous levels, based on its observed effects.
We have perhaps ten years in which to essentially eliminate all anthropogenic CO2 emissions, and begin to draw down the already dangerous atmospheric excess, if we are to have any hope of averting the worst consequences of AGW.
If it is not in fact already too late.
Comment by SecularAnimist — 25 Oct 2010 @ 12:04 PM
Septic Matthew said: “Rhetorically, the continual changing of the alarm from “cooling” to “overpopulation” to “resource constraints” to “unsustainability” to “warming” to “change” to “disruption” to “biodiversity” to “ocean pH””
A nice straw-man. However, these problems are not changing names for a single problem, they are nearly all genuine, challenging problems that face us today. And the “cooling” bit is just typical denier nonsense that SM slipped in to see if we would notice. Good grief.
But if deniers can deny something as obvious as global warming, little things like biodiversity and sustainability are even more easily set aside.
Septic Matthew: your self-interest is not something that I admire. Quite the contrary.
The effects of atmospheric CO2 are reversible. If the burning of all carbon-based fuels were to cease immediately, the atmospheric CO2 level would degrade to pre-industrial levels. How long this would take is disputed.
At today’s higher atmospheric levels, more CO2 is removed naturally than was occurring at pre-industrial levels. This is the primary reason that CO2 levels have risen much slower than CO2 is being added. If we allow CO2 to rise unabated, the atmospheric level will rise until natural removal effects stabilize the increase. As we decrease the amount of CO2 pumped into the atmosphere, the natural equilibrium level will decrease. We do not need to eliminate CO2 altogether, just enough to balance the natural equilibrium processes.
Curiously, what is your ten-year time frame based upon?
Re my 234: (PS precipitation would actually slightly … decrease the moist adiabatic lapse rate below the freezing level … However, then there is less latent heat available from freezing of that water
On a related point, delaying the condensation would cause cooling near the cloud base with warming above; delaying freezing, such as could happen with insufficient ice nuclei, would have the same effect above the freezing level. Condensation usually is not delayed much (but > 100 % RH is generally required to transition from haze particles to cloud droplets – see Kohler curves) but it is common for water droplets to remain liquid well below freezing. (Unlike dry adiabatic processes, moist adiabatic processes can deviate from being perfectly adiabatic, isentropic, and reversible (they go hand-in-hand), even without precipitation and mixing, because of kinetic barriers to phase changes and the requirement of small-scale temperature and composition gradients necessary to drive heat and mass flows between coexisting phases – there is some thermodynamic disequilibrium and some production of entropy.) (Presumably faster updrafts would increase thermodynamic disequilibrium and result in an upward displacement of latent heating.)
Sinking is often much slower and over a much larger area, with radiative cooling being able to significantly affect temperature and even balance adiabatic warming (as in a steady-state Hadley cell)
Balance in the sense that the adiabatic warming and radiative cooling at a given location are balanced in steady state circulation; of course, following the air downwards, warming would still occur in that case.
while moist updrafts are faster and concentrated and with radiation having essentially no role in changing the temperature within a cumulus cloud.
Of course, moist ascent can also be gentle and over a larger area too (stratiform clouds associated with fronts, extratropical cyclones – though the upward motion is still generally faster than the corresponding dry descent in anticyclones).
(PS although in the tropics I’ve gotten the impression that gravity waves tend to communicate the effects of moist convection so that larger regions adhere to a moist adiabat at the same vertical levels – which would make sense given a weak coriolis effect that would allow horizontal temperature variations to persist in the face of the tendency for thermally-direct overturning).
Starting with geostrophic (or gradient-wind) balance, a diabatic thermal perturbation (or mechanical momentum perturbation) causes an imbalance between pressure and the coriolis (or that + centrifugal) force, which causes motion that generally causes the warmed* fluid to ascend and cooled* fluid to descend (* it’s the change that’s important here; the warmed fluid could still be colder than the cooled fluid), or something analogous with a momentum perturbation; this tends to restore balance; in the case of geostrophic balance, this is called geostrophic adjustment. Geostrophic adjustment involves a conversion between APE and KE (available potential energy and kinetic energy). In a closed cell, the motion could overshoot and then oscillate; this would be a standing (inertio-)gravity wave; out in the open, inertio-gravity waves are radiated, carrying some energy with them. Some APE (or ‘anti-APE’?, if the geostrophic adjustment was thermally indirect) can remain in place associated with a remnant of the thermal perturbation. How much remains depends on size. The geostrophic adjustment tends to spread out the thermal perturbation over a horizontal scale called the Rossby radius of deformation LR, which is inversely proportional to the coriolis effect. Thermal perturbations which are horizontally larger than LR will persist more, while much smaller perturbations will nearly vanish. LR does not limit how far the gravity waves can propagate; generally, though, viscosity and radiative relaxation of thermal perturbations will mechanically and thermally damp gravity waves as they propagate; as this occurs, the momentum associated with the gravity waves is ‘deposited’ in the fluid more ‘permanently’ (until something else happens!).
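The latitude dependence of LR described above is easy to quantify with the simplest form of the deformation radius, LR = c/f, where c is a gravity-wave speed and f the Coriolis parameter. The value of c used below is an assumed illustrative number for a first baroclinic mode, not a quoted one:

```python
import math

def rossby_radius(c, lat_deg):
    """Rossby radius of deformation L_R = c / f (meters) for a
    gravity-wave speed c (m/s) at latitude lat_deg."""
    omega = 7.292e-5  # Earth's rotation rate (rad/s)
    f = 2 * omega * math.sin(math.radians(lat_deg))
    return c / f

# Illustrative c = 50 m/s: L_R is a few hundred km in midlatitudes but
# several thousand km near the equator, where f is small -- consistent
# with small thermal perturbations persisting poorly in the tropics.
print(round(rossby_radius(50.0, 45.0) / 1000), "km at 45 deg")
print(round(rossby_radius(50.0, 5.0) / 1000), "km at 5 deg")
```

Since LR ∝ 1/f, the same perturbation that survives geostrophic adjustment in midlatitudes is spread out and largely erased by gravity-wave radiation in the deep tropics.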
…If atmospheric CO2 is stabilized at 450 ppm, only a very small fraction (~8%) of existing tropical and subtropical coral reefs will be surrounded by such water, and at 550 ppm, coral reefs may be dissolving globally. Cold water corals are also vulnerable and are likely to be affected before they have even been fully explored. By 2100, it has been estimated that 70% will be in waters unfavourable for growth. In the polar regions, model projections using current CO2 emission rates suggest that parts of the Southern Ocean will be undersaturated for aragonite by 2050. Aragonite undersaturation is projected for 10% of Arctic waters by around 2020, and by 2060, 80% of waters will be undersaturated for aragonite and calcite. This means the waters will be corrosive to Arctic calcifiers such as pteropods, and bivalves such as clams, which play a key role in Arctic food webs…
…Projections of the saturation levels of aragonite (a metastable form of calcium carbonate used by many marine organisms) indicate that calcification rates in warm-water corals may decrease by 30% over the next century (Gattuso et al., 1998; Langdon and Atkinson, 2005). By the middle of this century, calcification rates of many warm-water
corals are expected to drop to the point that they will be outpaced by erosion (Erez, 2008; Gattuso, 2008a; Manzello et al., 2008, Langdon et al., 2008), which would have serious impacts on coral reef ecosystems. Cold-water corals, which are found in deep waters, may also be in danger. These corals serve as habitat for many commercial fish stocks, and today virtually all of them are bathed in waters that are supersaturated with respect to aragonite. Yet by 2100, it is projected under the IS92a scenario that about 70% of these cold-water corals will be exposed to waters that are undersaturated with respect to aragonite, which would be chemically corrosive to their skeletal material (Guinotte et al., 2006)…
Dan H wrote: “Curiously, what is your ten-year time frame based upon?”
It’s based purely on hope — I might even say on wishful thinking — that it is not already too late to avoid anthropogenic global warming that will be catastrophic not only for human civilization but for the entire biosphere.
248, Hank Roberts: We know that won’t work. Why prefer it?
Because it might actually be implemented. The US is not going to adopt a policy via which coal consumption in the US is reduced to 0 in the next 20 years, and Chinese coal consumption will increase considerably in coming decades before beginning to decline in maybe 2040-2060.
As documented elsewhere, a considerable amount of the consumption of fossil fuel in the EU actually entailed the transfer of manufacture to China, with a net increase rather than decrease of global CO2 production. This was an unplanned consequence of attempting a rapid response instead of a decades-long response where only a decades-long response could possibly be carried out.
Incidentally, opponents of the “approved” Ivanpah Valley solar power project have taken the project to court:
If they are successful, the whole facility will not come on line in time to contribute to California’s renewable portfolio standard by the 2012 deadline written into AB32. Expect analogous suits against the other Mohave Desert projects.
255, Didactylos: A nice straw-man. However, these problems are not changing names for a single problem, they are nearly all genuine, challenging problems that face us today. And the “cooling” bit is just typical denier nonsense that SM slipped in to see if we would notice. Good grief.
Perhaps you are correct. However, the AGW proponents have taken a serious setback in the public policy debates over the last year, and you might want to consider a rhetorical strategy better than “Everyone who disagrees with us is greedy, shills for the fossil fuel companies, stupid, ignorant, or all of them combined.” The one consistent message from John Holdren over 4 decades has been the forecast of doom.
As to ocean acidification, note that I responded to a suggestion that ocean acidification was more urgent than global warming. But as long as the topic of ocean temperature has been re-introduced, recall that Caribbean Ocean corals suffered a big kill in last winter’s unusual cold.
Dan H. says, “The effects of atmospheric CO2 are reversible. If the burning of all carbon-based fuels were to cease immediately, the atmospheric CO2 level would degrade to pre-industrial levels. How long this would take is disputed.”
Whiskey. Tango. Foxtrot. Interrogative! Where in the hell did you get that? CO2 persists in the atmosphere for thousands of years.
258, CM: Projections of the saturation levels of aragonite (a metastable form of calcium carbonate used by many marine organisms) indicate that calcification rates in warm-water corals may decrease by 30% over the next century (Gattuso et al., 1998; Langdon and Atkinson, 2005). By the middle of this century, calcification rates of many warm-water
corals are expected to drop to the point that they will be outpaced by erosion (Erez, 2008; Gattuso, 2008a; Manzello et al., 2008, Langdon et al., 2008), which would have serious impacts on coral reef ecosystems. Cold-water corals, which are found in deep waters, may also be in danger. These corals serve as habitat for many commercial fish stocks, and today virtually all of them are bathed in waters that are supersaturated with respect to aragonite. Yet by 2100, it is projected under the IS92a scenario that about 70% of these cold-water corals will be exposed to waters that are undersaturated with respect to aragonite, which would be chemically corrosive to their skeletal material (Guinotte et al., 2006)…
From that it seems to me that a multi-decade strategy to reduce CO2 would be appropriate.
SM 261: The US is not going to adopt a policy via which coal consumption in the US is reduced to 0 in the next 20 years, and Chinese coal consumption will increase considerably in coming decades before beginning to decline in maybe 2040-2060.
I get that. And it is scientifically true. But I call it a contriving manipulation of ordinary language that most people understand.
After you get done explaining that ‘acidification’ does not mean that something becomes ‘acidic’, like the very vinegar used to dissolve chalk, many, if not most, people go away mumbling that they are dealing with nitwit environmentalists. And then they get on the campaign to cut off the funding for any reasonable action.
Take a look at the projections for the next election.
“CO2 concentrations would start to fall immediately [IF EMISSIONS WERE HALTED TODAY] since the ocean and terrestrial biosphere would continue to absorb more carbon than they release as long as the CO2 level in the atmosphere is higher than pre-industrial levels (approximately). And subsequent temperatures (depending slightly on the model you are using) would either be flat or slightly decreasing. With this definition then, there is no climate change commitment because of climate inertia. Instead, the reason for the likely continuation of the warming is that we can’t get to zero emissions any time soon because of societal, economic or technological inertia.”
Comment by John E. Pearson — 25 Oct 2010 @ 4:25 PM
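The distinction in the quote Pearson posts – concentrations falling once emissions stop, because sinks keep absorbing while CO2 exceeds pre-industrial levels – can be caricatured with a one-reservoir decay model. The single 100-year e-folding time below is purely illustrative; the real carbon cycle has several overlapping timescales out to many millennia, which is why “thousands of years” is also a fair description of the long tail:

```python
import math

def co2_after_cutoff(excess_ppm, years, tau=100.0):
    """Atmospheric CO2 (ppm) after emissions halt, in a toy one-reservoir
    model: the excess above a ~280 ppm pre-industrial baseline decays
    exponentially as the ocean and biosphere keep absorbing it.
    tau is an illustrative e-folding time, not a real carbon-cycle fit."""
    return 280.0 + excess_ppm * math.exp(-years / tau)

# Starting from ~390 ppm (a 110 ppm excess), concentrations fall
# immediately in this toy model, as the quoted definition says:
print(co2_after_cutoff(110.0, 0))    # 390.0 ppm at cutoff
print(round(co2_after_cutoff(110.0, 50)))  # lower a half-century on
```

The toy model reconciles the two comments: the level starts falling at once, yet a sizeable excess lingers for a very long time.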
“As documented elsewhere, a considerable amount of the consumption of fossil fuel in the EU actually entailed the transfer of manufacture to China, with a net increase rather than decrease of global CO2 production. This was an unplanned consequence of attempting a rapid response instead of a decades-long response where only a decades-long response could possibly be carried out.” – Septic Matthew
Complete tosh of course. The transfer of manufacturing from the EU to China and elsewhere had nothing whatever to do with attempts to reduce CO2 emissions (by far the biggest contributor to which has been a shift from coal-fired to gas-fired power stations), and everything to do with lower labour costs.
SM, I wonder where this is coming from: “As documented elsewhere, a considerable amount of the consumption of fossil fuel in the EU actually entailed the transfer of manufacture to China … This was an unplanned consequence of attempting a rapid response instead of a decades-long response”
Where is this “elsewhere”? I would like to know who is actively disseminating these notions you are uncritically propagating.
Comment by Anonymous Coward — 25 Oct 2010 @ 4:47 PM
You’re right: the ocean’s not a jar of vinegar, and calcifying marine organisms are not a piece of chalk. That’s why it matters and why we shouldn’t play word games.
But do you still insist that “pH reduction from 8.1 to 7.7 or 7.3 isn’t (…) going to dissolve (…) animal skeletons”?
#261: I’d expect an eighth-grader to understand that the illustration of a principle need not entail the literal recreation of natural conditions. As for “acidification,” surely that’s the appropriate term for reducing pH by adding carbonic acid (and surely not too strong for a process that will ultimately corrode and dissolve the shells of living organisms). CaCO3 dissolves in water that is undersaturated with carbonate; carbonate saturation drops with added CO2, and parts of the world’s oceans are undersaturated at a pH well above 7.
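The undersaturation point can be made quantitative with the saturation state Ω = [Ca2+][CO3 2-]/K'sp: dissolution is favored whenever Ω < 1, regardless of whether the pH has dropped below 7. The concentrations and K'sp below are illustrative round numbers for surface seawater, not measured values:

```python
def aragonite_saturation(ca_molkg, co3_molkg, ksp=6.5e-7):
    """Saturation state Omega for aragonite.  ksp is an illustrative
    stoichiometric solubility product (mol^2/kg^2); Omega < 1 means the
    water is corrosive to aragonite shells and skeletons."""
    return ca_molkg * co3_molkg / ksp

# Roughly present-day surface values: supersaturated (Omega > 1)...
print(aragonite_saturation(0.0103, 2.0e-4))
# ...but cut the carbonate-ion concentration enough (as added CO2 does)
# and Omega drops below 1 while the pH is still well above neutral:
print(aragonite_saturation(0.0103, 0.5e-4))
```

Since [Ca2+] is nearly constant in seawater, Ω is controlled almost entirely by the carbonate-ion concentration, which is exactly what added CO2 depletes.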
#267: “As I recall, the UN IPCC4 said …”
Coral reefs will also be affected by rising atmospheric CO2 concentrations (…) resulting in declining calcification. Experiments at expected aragonite concentrations demonstrated a reduction in coral calcification (…), coral skeleton weakening (…) and strong temperature dependence (…). Oceanic pH projections decrease at a greater rate and to a lower level than experienced over the past 20 million years (…). Doubling CO2 will reduce calcification in aragonitic corals by 20%-60% (…). By 2070 many reefs could reach critical aragonite saturation states (…), resulting in reduced coral cover and greater erosion of reef frameworks (…). (AR4 WG2 Box 4.4)
#271: Ah, so you do get it, but will still continue to quibble over terminology you know is correct? “Contriving manipulation of language” indeed.
Shift of manufacturing to China is driven by total production costs which include both labor costs and energy costs. There is also a factor of regulatory unfriendliness that puts a chill on expansion plans in the manufacturing sector. These added factors are particularly relevant in the heavy manufacturing sector, and no big deal at all for the software world.
So it is hard to make concise and completely accurate statements.
Some of us think that our loss of manufacturing jobs has and will continue to doom hopes of economic recovery. That opinion is supported almost daily with the unemployment reports, including all those not even counted.
Re 271 Jim Bullis – I get that communicating is important, but should we really alter correct terminology to make it sound less alarmist? And how many people who don’t understand what acidification is referring to have even ever heard of acidification? Given that a lot of people don’t know the implications of reducing the pH just to being closer to neutral, it might be considered whitewashing to use language that is less alarmist than the technical words now used. I know some people hear ‘acid’ and picture liquids burning through metals, but that shouldn’t mean that a person correctly identifying orange juice as acidic should be sued by Florida. When people think global warming is caused by the ozone hole, that wind turbines use more energy than they produce, and that solar power plant albedo is a serious issue to global climate, I think there’s bigger fish to fry. And the people who talk about nitwit environmentalists – I like the idea of educating and I hope it works to some great extent, but ultimately we’re just going to have to steamroll over some of those people.
European issues are somewhat different from those in the USA. There the natural gas supply is substantially under control of Russia, and the cost is easily manipulated. Shifting from coal to natural gas is an action that comes with peril, since the planned costs may not be reliable numbers. Further, the coal supplies in Europe are not as abundant as they are in the USA. I have suggested elsewhere that the real reason European governments have gone along with the renewable energy schemes is that they do not want to be overly dependent on Russian whims.
I would not like to leave this with the misunderstanding that natural gas supplies in the USA are immune from manipulation effects. And the so-called abundance of natural gas in the USA is highly dependent on the price point on which reserves are calculated. We should anticipate that our reserves will shrink dramatically for each year that the price runs around $4 per MMBTU; and then the price will go up significantly. These kinds of things send chills down the spines of power producers. (Check out Calpine history, and PG&E as well.)
Gardeners and farmers talk about “acidifying” soils. Sometimes it’s a problem, sometimes it’s done deliberately to improve the health of certain plants. But “acidify” is the term used -regardless- of the current pH of the soil. It’s all about direction.
I am trying to make the point about overstating things, which is how I see it when Al Gore said that the oceans were becoming ‘acidic’ in Congressional testimony. And similarly, when Ms. Lubchenko demonstrated vinegar dissolving chalk, and I suspect she extracted the acetic acid from the vinegar so the demonstration would go quickly.
Forgive the absolutely true anecdote, but I was in third grade when the Sunday school brought in a woman to demonstrate the dangers of nicotine. She had a sparrow in a cage and an apparatus to smoke cigarettes to capture nicotine and tar in a small reservoir, which she transferred to a hypodermic needle. The needle was longer than the sparrow, and the tar was sufficient to displace the entire bird’s blood supply. Sure enough, on injection the bird died. Most of the class immediately swore off smoking forever. You guessed it; not me. I went out and tried to find my first cigarette to smoke. I suspect most of the class did the same after they had a moment to get over the shock.
This is how I see the Lubchenko acid demonstration.
“But all calcite shelled creatures capture CO2 and eventually that becomes sand. This mechanism does not seem to get adequate recognition.”
That mechanism doesn’t get recognition because you have the reaction reversed. Calcification is a source of CO2. The equation is Ca^2+ + 2HCO3^- → CaCO3 + CO2 + H2O
It’s the weathering of carbonates that consumes CO2, but that doesn’t do the animals who build their skeletons out of carbonates any favors.
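A quick stoichiometric check of the calcification reaction above; a sketch with standard molar masses (illustrative, not from the comment):

```python
# Mass balance for calcification: Ca^2+ + 2HCO3^- -> CaCO3 + CO2 + H2O
# Molar masses in g/mol (standard approximate values).
M = {"Ca": 40.08, "H": 1.008, "C": 12.011, "O": 15.999}

m_HCO3 = M["H"] + M["C"] + 3 * M["O"]     # bicarbonate ion
m_CaCO3 = M["Ca"] + M["C"] + 3 * M["O"]   # calcium carbonate
m_CO2 = M["C"] + 2 * M["O"]
m_H2O = 2 * M["H"] + M["O"]

lhs = M["Ca"] + 2 * m_HCO3
rhs = m_CaCO3 + m_CO2 + m_H2O
assert abs(lhs - rhs) < 1e-6              # the reaction balances

# Per 100 g of CaCO3 skeleton built, roughly 44 g of CO2 are released:
print(100 / m_CaCO3 * m_CO2)
```

So calcification really is a CO2 source on these timescales, whatever happens to the carbonate geologically later.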
“But as long as the topic of ocean temperature has been re-introduced, recall that Caribbean Ocean corals suffered a big kill in last winter’s unusual cold.”
No. Parts of Florida and the Bahamas had coral die-offs in areas that have been historically inimical to coral growth. There are periodic kills in those areas due to temperature and salinity variations on the shallow banks adjacent to the reefs. The mortality during that event was also mostly limited to gorgonians rather than reef-building corals.
On the other hand, we’re currently experiencing one of, if not the worst high-temperature bleaching event in the Caribbean.
Comment by David B. Benson — 25 Oct 2010 @ 9:48 PM
274, Anonymous Coward: SM, I wonder where this is coming from:
Since ratifying the Kyoto Treaty, EU nations have granted green credits to EU companies that finance industrial development in China (it’s a “developing” nation, exempt under KT from CO2 controls.) I’ll try to find a good survey soon.
Comment by Septic Matthew — 25 Oct 2010 @ 10:18 PM
you left something out SM: EU nations have granted green credits to EU companies that finance CLEAN industrial development in China
Jim Bullis, Miastrada Co. — 25 October 2010 @ 6:12 PM
“Some of us think that our loss of manufacturing jobs has and will continue to doom hopes of economic recovery. That opinion is supported almost daily with the unemployment reports, including all those not even counted.”
“In its October industry survey, the National Association of Business Economics said Monday that employment conditions improved in the third quarter to the highest level since the start of the 2008-2009 recession.”
Would those negative reports on the job market be coming from Faux News?
More variation: Tropospheric ozone and the land-carbon sink
Certain ozone rants in the comments here lately sent me browsing a bit, and I came across this in Nature: Sitch et al., 2007, “Indirect radiative forcing of climate change through ozone effects on the land-carbon sink”, doi:10.1038/nature06059:
…Here we estimate the impact of projected changes in ozone levels on the land-carbon sink, using a global land carbon cycle model modified to include the effect of ozone deposition on photosynthesis and to account for interactions between ozone and carbon dioxide through stomatal closure. (…) We suggest that the resulting indirect radiative forcing by ozone effects on plants could contribute more to global warming than the direct radiative forcing due to tropospheric ozone increases.
People’s standards of evidence really take a hike as soon as anything resembling economic issues is brought up, don’t they?
Take the logic of Jim Bullis’ 277 for instance:
-hypothesis: A will cause B
-observation: there’s some A here and some B there
-conclusion: A will cause B
And now SM’s 288:
-hypothesis: C caused D which caused A
-observation: oh, there’s some E!
So do you have a source for your original assertion, SM? I’d still like to see it.
Comment by Anonymous Coward — 26 Oct 2010 @ 5:31 AM
289, flxible: you left something out SM: EU nations have granted green credits to EU companies that finance CLEAN industrial development in China
about 25% of the money went to CLEAN industrial development, in the most recent years.
Comment by Septic Matthew — 26 Oct 2010 @ 12:43 PM
gavin, what’s your take on Judith Curry’s current acclaim, given her recent trashing by many of the RealClimate regulars?
[Response: Mostly very silly, and a classic example of how media frames (‘man bites dog’/’heresy/dogma’, ‘conflict’) distort discussion about substance. – gavin]
I hope people are still checking this page, because I have a few general questions about GCMs. I’m hoping that someone can either provide answers or directions to where my questions were already answered.
(1) First of all, Jason Luttrell claimed, “Anything quantifiable must be based on a model.” I replied that he had it exactly backwards, and I suggested that counting spoons in a kitchen drawer didn’t require a model, but models require quantifiable inputs.
Luttrell responded, “Wrong again, your spoons are ‘modeled’ by the numerical system you use to count with. Counting theory is based on the idea that a particular instance of a ‘type’ of phenomonon [sic] can be equated to 1, or some other number of units.”
So I inquired, “If I want to count the number of angels dancing on the head of a pin, do I need a model?”
Has this line of attack against AGW been previously deconstructed and debunked? I’m not a mathematician, and I have no patience for this kind of nitpicking.
[Response: It is nit-picking, and is presumably riffing off a statement in which ‘model’ was used loosely, but, he is in fact right. No piece of data is worth anything without some kind of model for what it represents. What use is a voltage reading? or the height of a column of mercury? This is of course, nothing to do with ‘climate models’ (which I assume was your jumping off point). – gavin]
(2) Luttrell claimed, “Most of the evidence for global warming only confirms the predictions of the models, rather than the models themselves.”
Is there any difference?
[Response: This is a difference without a distinction. In fact, successful predictions validate the assumptions from which a model was built. – gavin]
And FWIW, Luttrell also opined, “The assumptions required to create a model that is simple enough to be useful, even with today’s computers, are too great to provide a statistical level of confidence that makes any medium to long term predictions plausible.”
[Response: This is nonsense. Climate models today are useful, have proved skillful and do have plausible long term predictions. – gavin]
The comments will be open on the NPR site for a few more days. I’ll also keep checking back here. Thanks.
Don’t know where else to put this, but hope the usual kind people will take a moment to ID the sad problems with some recent material on DotEarth promoting the Breakthrough Institute and the American Enterprise Institute with what appears to be some successful greenwashing. There seems to be a trend towards credentialing some dubious stuff. I lack the expertise to weigh in properly, and am also a bit tied up for even my amateur efforts to keep science clean and clear against the propaganda avalanche.
I’m all for solutions, but these guys have a more than dubious history.
Andy appears to have done some quite good stuff surrounding this, but I do wish I never had to see promotion of the doubtful “middle” ever again.
I had assumed the economic conditions in this country were more a matter of common knowledge, and in particular, consistent with the way I see things. That is of course, not scientific.
I have no idea what Faux news says.
But the article you linked to says, “Meanwhile, a majority of respondents believe current regulatory policies and federal taxes will be a drag on business next year.” It also says a lot of other stuff which we all can interpret as we choose.
I just cut a rambling explanation of economics as I see it, because I see not much hope of explaining my view of such matters in a comment box.
But concisely, it was told to me elsewhere that (long ago) House Speaker Jim Wright told Ronald Reagan, “You can’t have an economy washing each other’s cars and delivering pizza. That is Voodoo Economics with Depression Sauce.” If you see things like Jim Wright and I do, the rest of it is not so hard to understand. If not, forget what I said, but don’t be too surprised about business resistance to proposals that add significantly to cost of operations.
My 277 fits not at all into your simple logical format.
More careful reading is needed to see that no attempt was being made to establish a science of economics.
About the best I could hope for is to declare as an axiom that cost of manufacturing includes cost of energy, and that cost of manufacturing determines competitiveness, and competitiveness enables increased sales, and increased sales enables expansion of employment. If we can’t go with that, there is no point in having a discussion.
Dear Jim Bullis,
And the macroeconomic evidence for “cost of manufacturing determines competitiveness” in the real world is what exactly? For “increased sales enables expansion of employment” you can at least find macro evidence for a correlation (if not for the causation you posit). Your original conclusion does not follow from your axioms by the way.
There’s no point in having a discussion alright. I noticed already. But there would be a point in substantiating these gratuitous assertions. If you’re unwilling, someone has to point out that you’re making it up as you go along. Denying the physics of the climate or the economics of mitigation has about the same effect. But one is easier to pull off on RC than the other…
Comment by Anonymous Coward — 26 Oct 2010 @ 5:40 PM
Comment by David B. Benson — 26 Oct 2010 @ 6:48 PM
If we’d done something like this 40 years ago in response to the Oil crisis, it might have worked. Even if we had done it 20 years ago when the science of CO2 and the atmosphere became clear, it would have stood a chance.
Now after 30 years of delay and complacency, we are playing catchup, and even with the most remarkable innovation, it will take significant cuts and severe conservation to buy enough time to deploy any new technologies. The initiative seems to be the equivalent of a politician saying, “We’ve appointed a committee to study that!” It has more to do with camouflaging inaction than taking action.
What are the implications for global average forcings?
We could start by dividing the SW forcings by 2.4 – the angle of the sun also matters (how much is absorbed in the stratosphere vs troposphere vs surface, how much is backscattered by air), but an intermediate angle is used in that paper, so it might be close to the global average effect.
Then there is the issue of using MLS conditions, which would affect the LW forcings.
After that, there is the issue of clouds, which affects both SW and LW forcings.
Clouds, especially low level clouds, will tend to reduce the surface LW forcing changes. Clouds, especially high level clouds, will tend to reduce the upper level LW forcing changes. This is because clouds block some of the radiation that would otherwise be available for gases to block, and replace it with some radiation that would otherwise have come from gases. Clouds would reduce the stratospheric cooling that could occur by reducing the upward LW flux at the tropopause; thus, clouds should make the TOM and 200 mb forcings more similar.
Clouds reduce solar heating that would otherwise occur beneath them, but do themselves absorb some solar radiation. They also reflect some radiation back upward, which can increase solar heating above them somewhat (if the relevant wavelengths were not already completely absorbed in the first pass.) Offhand I don’t know the details of how cloud albedo is distributed over wavelength, but based on liquid water absorption coefficients, I’d expect the albedo to drop off going away from blue towards either UV or red and solar-IR. Thus, the reflection by clouds may have less ability to enhance upper-level solar heating of ozone (UV) or CO2 (near-IR) than otherwise. But generally, clouds will reduce the SW forcings at the surface, but perhaps increase the forcings at TOM and 200 mb by giving gases more room to reduce the albedo of the planet.
Clouds would reduce the stratospheric cooling that could occur by reducing the upward LW flux at the tropopause; thus, clouds should make the TOM and 200 mb forcings more similar.
correction – they reduce the upward LW flux at the tropopause, and reduce the room for other agents to do the same. But there are other issues – for example, given an approximate description of the CO2 absorption band, I think clouds shouldn’t change the instantaneous LW forcing on the stratosphere forcing from a doubling of CO2.
I wonder if part of the answer to the faint young sun paradox is a lack of oxygen in the atmosphere led to increased levels of CH3 radicals from the reaction CH4 + OH -> CH3 + H2O, and slowed the removal of Abiogenic CH4 from the early atmosphere.
Another part of the answer might also be that the climate sensitivity is higher than 3 degrees per doubling of CO2, and/or departs from a logarithmic response at higher concentrations.
“something on the order of 1000 ppm” – or 100 ppm, but much higher than now, anyway. Remember though that when there is so much CH4, the forcing per additional mole of CH4 is reduced a lot; actually, CO2 is more powerful than CH4 when in equal amounts (consequences for possibility of initiating a snowball by release of oxygen – would occur more readily if there is enough but not too much CH4).
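The logarithmic response mentioned above is commonly illustrated with the simplified Myhre et al. (1998) fit; a minimal sketch, where the 0.8 K per W/m² sensitivity parameter is an assumed round value, not a measured one:

```python
import math

# Standard simplified CO2 forcing fit (Myhre et al. 1998), used here
# only to illustrate the logarithmic response discussed above.
def co2_forcing(c_ppm, c0_ppm=280.0):
    return 5.35 * math.log(c_ppm / c0_ppm)   # W/m^2

dF_doubling = co2_forcing(560.0)             # forcing from doubling CO2
sensitivity = 0.8                            # K per (W/m^2), assumed
print(dF_doubling)                           # ~3.7 W/m^2
print(sensitivity * dF_doubling)             # ~3 K per doubling
```

Note the fit itself is only valid over a limited concentration range, which is exactly why a departure from logarithmic behaviour at very high CH4/CO2 levels matters for the faint-young-sun question.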
Here is a simple question for your panel. I measured the evaporation rate of my 50 m2 swimming pool and I was amazed to find that 2500 litres evaporated in one week (no discharge, clear blue skies all week, 31-34 C max. in the day, temp. of pool 25-26 C).
Compare this to the tank of petrol (gas) I use in one month (40 litres).
What worries me is that to survive, humans are creating many shallow dams and waters that are easily prone to such high evaporation (higher temp => higher vapor pressure), e.g. for irrigation, for consumption, for hydro power (electricity – China!), for recreation and for land creation.
Add to this all the water vapor from burning fuels (including jet & rocket fuel), the countless water cooling plants in every factory (including those for nuclear energy), the boiling, cooking, bathing etc. etc.
Now if I look at the fact that H2O accounts for most of the greenhouse effect, why would you think that the odd 100 ppm of CO2 that was added to the atmosphere since 1960 is much more relevant than all that extra water vapor being added to the atmosphere due to human activities?
Even if it (the water vapor) ultimately does condense, the heat (40.7 kJ per mole = 18 g) has to go somewhere; my guess is ca. 50% to earth and 50% to space.
Would that “Land Hurricane” that just went by the US midwest qualify as a result of GW? Hurricanes are supposed to be ocean beasts.
Comment by Edward Greisch — 27 Oct 2010 @ 10:48 AM
HenryP, let me answer your question @ #310 with another: given that more than 70% of the planet’s surface is ocean, and that terrestrial plants supply vast amounts of water vapor to the atmosphere via transpiration, why would you assume that the human water vapor contributions you mention are in fact large enough to make any difference whatsoever?
Notice that the only quantitative analysis you have done was relative to your swimming pool. . .
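To put rough numbers behind that point: even a generous estimate of human water additions is tiny next to natural evaporation. The figures below are round assumed values, not measurements from this thread:

```python
# Rough scale comparison: suppose every drop of global freshwater
# withdrawal evaporated. How does it compare to natural evaporation?
human_withdrawals_km3 = 4_000    # global annual withdrawals (round value)
natural_evap_km3 = 505_000       # global annual evaporation (round value)

share_pct = 100 * human_withdrawals_km3 / natural_evap_km3
print(round(share_pct, 2))       # well under 1%
```

A swimming pool or reservoir here and there simply doesn’t register against the 70% of the planet that is already open ocean.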
HenryP 310: why would you think that the odd 100 ppm’s of CO2 that were added to the atmosphere since 1960 are much more relevant than all that extra water vapor being added to the atmosphere due to human activities?
BPL: Because a pulse of water vapor stays in the atmosphere nine days, but a pulse of CO2 stays in the atmosphere 200 years.
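BPL’s nine-day figure can be sanity-checked as reservoir divided by throughput, using round literature-style numbers (assumed here for illustration):

```python
# Residence time ~= reservoir size / flux through the reservoir.
atmos_water_km3 = 12_900       # water held in the atmosphere (round value)
evap_km3_per_yr = 505_000      # global evaporation ~= precipitation

residence_days = atmos_water_km3 / evap_km3_per_yr * 365.25
print(round(residence_days, 1))   # roughly nine days
```

Any water vapour we add is cycled out on that timescale, whereas an added CO2 increment perturbs the atmosphere for centuries.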
Models can be enormously sophisticated, but they are like clanging brass if they have not correctly taken into account the natural processes. Thus, our BPL has come to think that there is no way for CO2 to be taken out of the atmosphere. Forests work; clams, coral, and barnacles work. And oceans take up heat in a way that is only vaguely understood.
There is of course, a flow in and a flow out. And of course the models take this into account. And it appears there has been a significant deficit running for some time. That has to be fixed. Let’s get to it.
I am a little puzzled by what happened in the 1800s as the heavily forested North America was converted into farmland. It would seem that should have dumped a vast load of CO2 into the atmosphere. Is that explained in the record?
Henry@KevinMcKinney (abt 310)
Don’t change the question. The question was: given that most of the greenhouse effect is caused by water vapor & clouds, why would you think that the odd 100 ppm’s of CO2 that were added to the atmosphere since 1960 are much more relevant than all that extra water vapor being added to the atmosphere due to human activities?
If you consider that 70% of earth is water, then you are not far from the truth: more warming (of water) brings more vapor, which brings more clouds, which bring more cooling (due to deflection of sunlight). So…?
[Response: You may not be aware but this is one of the most often rebunked confusions. Try reading this older post on “Water vapour: feedback or forcing?“. The bottom line is that our emissions of water vapour (via irrigation, combustion, river management) are sometimes noticeable in local climate, but don’t affect global climate much at all (because the amount of water vapour adjusts very quickly because of the large amount of condensation and evaporation). CO2 increases on the other hand persist much longer and provide persistent forcing globally. If you are getting these ideas from a particular source, you might want to reassess its credibility. – gavin]
Thanks, but the truth is: there really is no one that can tell me exactly, from actual tests + measurements, what the net effect is of the warming and cooling caused by CO2. If you want those answers then you end up with incomprehensible mathematics, and then I ask: but what about the overlaps in the 14-15 um band of O2 and water vapor? Although the absorption of O2 in the 14-15 um band is fairly weak, O2 is 21% whereas CO2 is only 0.04% of the air. Water also absorbs at 14-15 um. So how is this all separated? Well, in that case, there is just so much more incomprehensible mathematics. And of course, nobody has made any errors. But the truth is: there are no measurements, not one actual physical test. I think Newton would turn in his grave if he heard all those wild theories of warming caused by CO2.
I say more carbon dioxide is good. I base this on my simple observation that without CO2 there would not be any food.
[Response: Brilliant logic. Let’s apply it to some other circumstances: Since water is necessary for life, more water is good and so no-one can drown. Hmmm… what about vitamins are necessary for life, and so more are good, and no-one ever has a problem from vitamin overdoses…. Oh. Must be something wrong there, I wonder what? Clue: ‘incomprehensible mathematics’ might only be incomprehensible to you. (PS. comments that just parrot long debunked inanities are not interesting. Please try another tack). – gavin]
HenryP: to put it another way, we simply cannot add water vapour to the atmosphere. If we try, it just rains out. Even in the short term, we cannot increase the amount of water in the atmosphere.
The actual amount of water vapour is governed by temperature.
Land use changes may, it is true, change rates of evaporation and precipitation at a regional scale. But evaporation just isn’t something that worries people – compared to the vast extents of the oceans, a swimming pool or lake here and there is completely negligible. But that extra 100 ppmv of CO2 is in every cubic metre of the atmosphere. From ground level, to the top of the atmosphere, many kilometres above us.
“the heat (40.7 kJ per mole = 18 g) has to go somewhere”
Energy has to come from somewhere. And it is true, processes releasing water vapour into the atmosphere do often involve waste heat. But by any measure, the waste heat is negligible compared to the main energy input into the earth system – the sun.
Now, back to the subject of humidity. Relative humidity may not be changing, but absolute humidity is changing. It is increasing. This is because humidity is governed by temperature, and the fact that absolute humidity is increasing is independent proof that the global temperature is increasing. And yes, that extra water vapour does alter the greenhouse effect. This is (I think) the most important feedback process that amplifies the effect of the increased CO2.
I’m sorry I missed the comment lock on Jim’s fine post about bark beetles, and I want to thank him for it. It truly is a sad tale, but one that must be told. I look forward to the next installment. Thanks, Jim.
[de-duplicating] Let’s look at Jim Bullis’ latest quantitatively. I’m not any kind of specialist so my numbers and conversions might be way off, I might be making a serious mistake and so on… so please correct me!
I estimate about +13 ppm for the 19th century (eyeballing Law Dome). Assuming a 50% airborne fraction, this gets us about 55GtC of emissions needed to explain the change. But some of the 19th century increase in atmospheric CO2 is probably a temperature feedback. So let’s assume 45 GtC of emissions. This number is probably way off but hopefully it’s in the ballpark.
Fossil fuel burning is estimated at 12 GtC for the 19th century (CDIAC referencing economic historians). So that leaves 33 GtC of emissions to be explained.
I don’t know where to find actual data about this but I’ll put my high estimate of the area deforested in the United States and Canada during the 19th century at 2.5 million square kilometers. Assuming an average of 140 tC emitted per hectare, that would imply 35 GtC of emissions.
So I’m not seeing any discrepancy so far. No doubt that was dumb luck and that proper estimates would be much further apart than my 33 and 35 GtC, if only because my deforestation estimate is implausibly high. There was also lots of deforestation outside North America. And I’m probably missing something big. But, at first approximation, nothing extraordinary seems to be begging for an explanation.
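For anyone wanting to re-run the arithmetic above, here it is as a small script. The 2.13 GtC-per-ppm conversion and the 10 GtC temperature-feedback allowance are assumed round values, as in the comment:

```python
# Re-running the back-of-envelope 19th-century carbon budget.
PPM_TO_GTC = 2.13              # ~2.13 GtC per ppm of atmospheric CO2

d_ppm = 13                     # eyeballed 19th-century rise (Law Dome)
airborne_fraction = 0.5
emissions_needed = d_ppm * PPM_TO_GTC / airborne_fraction
print(round(emissions_needed))    # ~55 GtC, as stated

fossil = 12                    # GtC, 19th-century fossil burning (CDIAC)
feedback_allowance = 10        # GtC attributed to T feedback (assumed)
land_use_gap = emissions_needed - feedback_allowance - fossil
print(round(land_use_gap))        # ~33 GtC left to explain via land use

area_km2 = 2.5e6               # high-end N. American deforestation guess
tC_per_ha = 140
deforestation = area_km2 * 100 * tC_per_ha / 1e9   # km^2 -> ha, tC -> GtC
print(deforestation)              # 35.0 GtC
```

So the 33 vs 35 GtC agreement above is easy to reproduce, even if every input is only ballpark.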
Could you please provide quantitative ballpark estimates yourself next time, Jim Bullis? This way people will at least be able to point out where you have erred in overestimating the importance of forests. I’m sure you’d rather not bother considering how much you post but RC is supposed to be a halfway serious venue.
Comment by Anonymous Coward — 27 Oct 2010 @ 3:28 PM
JB 314: our BPL has come to think that there is no way for CO2 to be taken out of the atmosphere.
BPL: How does that follow from anything I said? I’m familiar with the short-term carbon cycle, thank you very much.
gavin wrote: “… comments that just parrot long debunked inanities are not interesting”
Not to mention depressing.
I mean, global warming itself is mighty depressing.
But the proclivity of so many people to seize upon and regurgitate “long debunked” idiotic drivel, often with haughty, sneering arrogance and condescension that seems to exceed even their ignorance and gullibility, is also pretty depressing.
Henry P., given that your name links to a megachurch (my wife calls them Christmart), are you even real? Good Lord, man, you are utterly ignorant of how the climate works, cannot construct a logical argument and seem to be unable to learn. Pray, why the hell should we waste our time with you.
Fact: The greenhouse effect accounts for about 33 Kelvins of Earth’s temperature.
Fact: CO2 is responsible for about 25% of that amount.
Fact: CO2 is increasing.
Fact: It is warming.
Fact: CO2 sensitivity is more than 2 degrees per doubling with 90% confidence
So, while YOU may want to bet the future of humanity on a 20:1 longshot, the actual scientists among us have a problem with that.
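The 33 K greenhouse figure in the list above follows from a standard zero-dimensional energy balance; a quick check with conventional round values for the solar constant and albedo:

```python
# Effective radiating temperature vs observed surface temperature.
SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0          # solar constant, W/m^2
albedo = 0.30        # planetary albedo (conventional round value)

T_eff = ((S0 * (1 - albedo)) / (4 * SIGMA)) ** 0.25
T_surface = 288.0    # observed global mean surface temperature, K
print(round(T_eff))              # ~255 K
print(round(T_surface - T_eff))  # ~33 K of greenhouse warming
```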
However, despite all these differences, the models agree that the substantial fraction of projected CO2 emissions will stay in the atmosphere for millennia, and a part of fossil fuel CO2 will remain in atmosphere for many thousands of years.
Archer, David, and Victor Brovkin. 2008. The millennial atmospheric lifetime of anthropogenic CO2. Climatic Change 90, no. 3 (6): 283-297. doi:10.1007/s10584-008-9413-1.
Solomon, S., G. K Plattner, R. Knutti, and P. Friedlingstein. 2009. Irreversible climate change due to carbon dioxide emissions. Proceedings of the National Academy of Sciences 106, no. 6: 1704.
This paper shows that the climate change that takes place due to increases in carbon dioxide concentration is largely irreversible for 1,000 years after emissions stop.
Nowhere in these model results or in the published literature is there any reason to conclude that the effects of CO2 release will be substantially confined to just a few centuries. In contrast, generally accepted modern understanding of the global carbon cycle indicates that climate effects of CO2 releases to the atmosphere will persist for tens, if not hundreds, of thousands of years into the future.
Archer, D., M. Eby, V. Brovkin, A. Ridgwell, L. Cao, U. Mikolajewicz, K. Caldeira, et al. 2009. Atmospheric lifetime of fossil fuel carbon dioxide. Annual Review of Earth and Planetary Sciences 37: 117–134.
We can neither confirm nor deny that the land hurricane is the result of GW. Although uncommon, these storms can occur this time of year, and in Michigan are commonly referred to as the “Gales of November,” for the havoc they can wreak on shipping and boating.
Re ‘Land Hurricane’ – first, it deserves mentioning that the pressure drop equivalent to something you might expect from a cat 5 hurricane (right?) doesn’t produce the same wind speeds in an extratropical cyclone – even over water – for two reasons:
1. stronger coriolis effect – you need less wind speed to have the coriolis force balance the pressure gradient force (although there’s also the centrifugal force to consider)
2. the pressure anomaly is spread over a larger area; reduced pressure gradient
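Point 1 (and, by varying the gradient, point 2) can be made quantitative with geostrophic balance; a sketch with an illustrative pressure gradient, all values assumed:

```python
import math

# Geostrophic balance: same pressure gradient, different latitude.
# v = (1 / (rho * f)) * dp/dx,  with  f = 2 * Omega * sin(lat)
OMEGA = 7.292e-5     # Earth's rotation rate, s^-1
RHO = 1.2            # near-surface air density, kg/m^3

def geostrophic_wind(dp_dx, lat_deg):
    f = 2 * OMEGA * math.sin(math.radians(lat_deg))
    return dp_dx / (RHO * f)

dp_dx = 3e-3         # Pa/m (~3 hPa per 100 km), illustrative value
print(geostrophic_wind(dp_dx, 20.0))   # low latitude: stronger wind
print(geostrophic_wind(dp_dx, 45.0))   # mid latitude: weaker wind
```

The same pressure gradient supports roughly half the wind speed at 45° that it would at 20°, which is one reason an extratropical low with a hurricane-like central pressure doesn’t produce hurricane-like winds.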
Anyway, I’m wondering how the latent heating compares to other such storms – that is something that you’d expect to increase in global warming, at least in general.
Also, was this a case of upstream development – drawing energy off the earlier storm that hit the Pacific Northwest?
“Thanks, but the truth is: there really is no one that can tell me exactly, from actual tests + measurements, what the net effect is of the warming and cooling caused by CO2.”
Exactly? No, of course not. Approximately? Yes!
“If you want those answers then you end up with incomprehensible mathematics”
Don’t confuse number crunching with complicated logic. The math is quite simple. It’s just that there is a tremendous amount of data that the formulas must be applied to, with numerical integration, to actually get an accurate answer. But qualitatively, it’s mostly quite intuitive. Imagine walking through a fog – what can you see? Now make the fog thicker. Now imagine that you’re seeing at wavelengths where the fog is not scattering so much as it is emitting incandescently and absorbing radiation, and there’s a temperature gradient so that it emits more or less from different places; make the fog thinner or thicker (make it thicker and three things happen: greater emission in a given volume, greater absorption in a given volume of whatever radiation is available, and the distances that photons travel between emission and absorption are shortened) – what do you see? What you see is what you get (aside from the distinction between flux/area and intensity).
Look up these two terms: Schwarzschild’s equation (PS – the one about radiation; I’m not sure offhand, but some results about black holes may go by the same name) and ‘emission weighting function’.
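A minimal numerical sketch of Schwarzschild’s equation, in the spirit of the fog analogy above. This uses a grey-gas stand-in for the source function and an assumed absorption coefficient and lapse rate; it is an illustration, not a real radiative transfer code:

```python
# Integrate dI/ds = k * (B(T) - I) upward through a model atmosphere.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def planck_flux(T):
    # Grey-body stand-in for the Planck source function (flux units).
    return SIGMA * T ** 4

def upward_flux(k, n_layers=1000, z_top=15e3):
    """March upward through an atmosphere with a fixed lapse rate."""
    dz = z_top / n_layers
    I = planck_flux(288.0)            # start from surface emission
    for i in range(n_layers):
        z = (i + 0.5) * dz
        T = 288.0 - 6.5e-3 * z        # ~6.5 K/km lapse rate
        I += k * (planck_flux(T) - I) * dz
    return I

# Thicker "fog" (larger k): outgoing flux originates higher and colder.
print(upward_flux(k=1e-4))   # optically thin: flux nearer surface value
print(upward_flux(k=1e-3))   # optically thick: smaller outgoing flux
```

The thicker the absorber, the more the emerging flux reflects the cold upper layers rather than the warm surface, which is the fog picture in equation form.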
“and then I ask: but what about the overlaps in the 14-15 um band of O2 and water vapor? Although the absorption of O2 in the 14-15 um band is fairly weak, O2 is 21% whereas CO2 is only 0.04% of the air. Water also absorbs at 14-15 um. So how is this all separated? Well, in that case, there is just so much more incomprehensible mathematics.”
The total amount of O2 in the atmosphere is nearly transparent to LW radiation; this takes into account the 21 % figure.
Water vapor is concentrated in the lower atmosphere; much CO2 effectively sits on top of water vapor and has room to affect the LW fluxes at the tropopause and at TOA even at many wavelengths where water vapor blocks almost all radiation from the surface; water vapor is essentially transparent in the stratosphere over a range of wavelengths containing the CO2 band.
You don’t need to separate it out unless that’s what you want to know. To separate, just do the calculations with various things added or removed; the overlaps between different greenhouse agents mean that the last greenhouse agent added generally could make a bigger difference on its own if it were the first agent added, etc. This doesn’t really concern the effects of doubling CO2, in that the calculation of the forcing of doubling CO2 is done with everything else present if the purpose is to find what it is for the Earth as it is (you can approximate this by using some representative atmospheric column, but in principle the accurate answer for global average forcing is found by calculating the forcing for all places and times in the climatic state and then finding the global average).
But the truth is: there are no measurements, not one actual physical test.
False – optical properties (of at least gases) can be measured in the laboratory, and the calculated effects on radiation through or from the atmosphere can be confirmed by measurements of radiant fluxes. It is even possible to see the reduction in OLR caused by CO2 increases using a satellite.
Thanks for the references, but somehow there is something that seems wrong here.
Simplifying: if all CO2 emissions stopped today and trees continued to grow, the organic compounds that make up wood would be produced by taking up the present-day supply of CO2, at the rate at which trees add mass. Surely, if forests were to cover the earth to the extent that their mass matched the mass of fossil fuels burned over the last 200 years, they would have to soak up the CO2 that came from those fuels.
That would take a long time, but way short of forever. Do the models assume that no permanent standing forests would be possible?
Or is this all dependent on something else happening, whereby more sources of CO2 develop, such as with permafrost or such, and the additional CO2 mixes with and reduces the rate that the original CO2 gets taken in by trees? If this is what we are talking about, the whole thing depends on timing of solutions, but the statement about how long CO2 lasts is a conditional statement, not an absolute one.
Edward@311 and Dan H.@329, Please! The so-called land hurricane is weather. Period. It is an extreme event and so by definition, we do not know what the probability distribution for such events looks like from past data. Look for TRENDS over TIME. There is plenty of evidence that the climate is changing. Getting 3 hundred year floods in a decade provides more evidence than a single extreme event.
JB: But you said, ” – – a pulse of CO2 stays in the atmosphere 200 years.”
Are you meaning that the marginal impact will carry forward?
BPL: Read my lips. A single molecule of CO2 may only stay in the air 5-10 years, but there are molecules coming in as well as going out. Raise the CO2 level and it will still be significantly higher 200 years later. In fact, for about a sixth of it there’s a long tail out to 100,000 years.
The water cycle lasts about 9 days.
CO2 matters, H2O does not, except as an amplifier.
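The multi-century tail BPL describes is usually summarized as a sum of exponentials fit to carbon-cycle models. The sketch below uses coefficients resembling one published fit (the Bern carbon-cycle approximation used for IPCC AR4 global warming potentials); treat the exact numbers as illustrative, not authoritative.

```python
import math

# Fraction of a CO2 pulse remaining airborne after t years, approximated
# as a constant plus three decaying exponentials. Coefficients resemble
# the AR4 Bern-model fit (illustrative values).
A0 = 0.217
TERMS = [(0.259, 172.9), (0.338, 18.51), (0.186, 1.186)]  # (amplitude, e-folding years)

def airborne_fraction(t_years):
    return A0 + sum(a * math.exp(-t_years / tau) for a, tau in TERMS)

for t in (0, 10, 100, 200, 1000):
    print(t, round(airborne_fraction(t), 3))
# A substantial fraction is still airborne after 200 years, and the
# constant term A0 stands in for the part removed only by very slow
# processes (ocean sediments, weathering) over many millennia.
```

The "5–10 year" figure is the residence time of an individual molecule; this curve is the decay of the perturbation, which is the number that matters for climate.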
Forests cannot realistically soak up all the carbon from the fossil fuels that have been burned, especially considering that more keeps getting burned. Take a look at the numbers for yourself.
But forests don’t need to soak it all up to bring atmospheric CO2 levels way down because a substantial fraction of the carbon emitted by fossil fuels and industrial deforestation has effectively been sequestrated in the oceans and will stay there for a long time. Because there are several flows and reservoirs, you really have to look at this quantitatively to figure out what the net effect of a scenario would be. Guesstimates are only trivial when a flow totally dwarfs the others such as current emissions from fossil fuels.
I suspect that, assuming no large carbon releases from warming, a drop in fossil emissions to zero followed by massive reforestation would draw down most of the excess of atmospheric CO2 over the pre-industrial baseline in a matter of decades, as CO2 continues to flow into the oceans as well as the forests. But CO2 levels would then remain elevated for millennia in spite of continued growth of the forests, because the net flow between the oceans and the atmosphere would reverse. It of course depends on how much CO2 the new forests would draw down (and how quickly). Do you have any numbers you care to propose to illustrate your scenario?
Of course the statements about the airborne fraction of CO2 emissions over time are conditional but please stop pretending that reforestation on the scale needed to offset the burning of fossil fuels is in any way realistic without hypothetical developments in genetic engineering, geoengineering or some other magitech. If you think your “solutions” hold water, post your numbers!
Comment by Anonymous Coward — 28 Oct 2010 @ 9:24 AM
Relying on my memory is a dubious proposition but that’s all I have time for right now. Estimates of the sink potential of earth’s forests are comparable (same order of magnitude at least) to what would need to be removed from the atmosphere to go from current levels to 350 ppm. I don’t think such estimates account for degradation of forest sink capacity due to continuing climate change (an example of which Jim Bouldin described in his Seeing Red post), so only an unrealistically optimistic scenario would provide for forests even drawing down the current excess, leaving no room for forests to “offset” continuing fossil fuel emissions.
And this, from the Archer et al. piece, explains why so many people are so confused about how long CO2 lasts — the policymaker summary screwed up the explanation of this in the 1990 IPCC Report.
—- excerpt follows—-
“The 1990 Intergovernmental Panel on Climate Change (IPCC) report … carefully explained in the technical Chapter 1 … that, on human timescales, CO2 really has no sinks, it just equilibrates….
… the equilibration timescale is no indication of how long the climate impacts of CO2 release will last. But this distinction was confused in the “Policymaker Summary,” in which the timescale is referred to as an atmospheric lifetime, which is incorrect, and it is used as a rationale to limit the time horizon for calculating global warming potentials of greenhouse gases to 100 years, which is not logical either….
… “the door to misinterpretation had been left open. Others have and continue to walk through it.”
Subsequent IPCC reports in 1995 and 2001 compounded the mistake …. The result has been an erroneous conclusion, throughout much of the popular treatment of the issue of climate change, that global warming will be a century-timescale phenomenon. Simple thermodynamics of CO2 dissolved in seawater plus paleo-evidence, in particular the PETM event 55 Mya tell us otherwise.
—— end excerpt —-
HenryP said: “the truth is: there really is no one that can tell me exactly, from actual tests + measurements, what the net effect is of the warming and cooling caused by CO2.”
Before July 16 1945 there really wasn’t a soul on the planet who could tell you exactly from actual tests + measurements what would happen if one rapidly compressed a sphere of plutonium. All there was was a bunch of mathematics that you would have found incomprehensible. Only after July 16 1945 at 5:30 AM (MWT) was it known. At 5:29AM MWT no one had the foggiest idea what would happen. The US govt. spent a billion 1940’s dollars or thereabouts to perform the experiment. Wow. Don’t you think they were lucky?
Comment by John E. Pearson — 28 Oct 2010 @ 2:29 PM
Rick Brown, Hank Roberts, Barton Paul Levenson, Anon. Coward, and others
I can’t take time to respond appropriately, but I have read and appreciate your responses. Thanks.
In general, I don’t see much that addresses the big ‘what if?’ that I started with that was in reaction to the EPA discussions of ‘CCS’ for new coal fired power plants. Trying to keep the focus on ‘new’ and ‘standing’ forests rather than getting into the past, present, and future of existing forests, I had hoped to greatly simplify the discussion. However, I realize that there is much to be learned from all the work by others. But still, there seems to be no insurmountable obstacle to creating new forests to solve an expanding problem. It might be useful for trimming back CO2 from our existing power generation practices.
If it comes to nothing more, I at least, can see better than ever, the need for more attention to keeping our existing forests in good health.
But I still see potential in new forests, yes, with somewhat rearranged water distribution.
Please will someone correct me if I am wrong… but my present understanding is that forests (rainforest, at least) are surprisingly carbon-neutral. The carbon sources and sinks within the forest neatly balance on a small scale, and the idea of rainforests as the “lungs” of the planet has been largely discredited.
I probably read about this here, so if someone could remind me where and what I got wrong, that would be great.
Negative feedback in the cold: ice retreat produces new carbon sinks in Antarctica 15 SEP 2009
“we show that the loss of ice shelves and retreat of coastal glaciers around the Antarctic Peninsula in the last 50 years …. New annual productivity, as opposed to standing stock, amounts to 3.5 × 10^6 tonnes yr^-1 of carbon, of which 6.9 × 10^5 tonnes yr^-1 deposits to the seabed. By comparison the total aboveground biomass of lowland American tropical rainforest is 160–435 tonnes ha^-1. Around 50% of this is carbon. On this basis the carbon held in new biomass described here is roughly equivalent to 6000–17000 ha of tropical rainforest. As ice loss increases in polar regions this feedback will become stronger …”
So, rainforest is probably oxygen-neutral, but there are some serious question marks over the carbon budget.
My gut feeling is that new forests will be a carbon sink, but long-established forests ought to be relatively neutral, simply because there is nowhere for the carbon to hide. The biomass is in equilibrium. The soil layer is very thin. Hank, the study you linked mentions a “large biomass increment” accounting for the carbon sink. That doesn’t sound like a forest in balance, or at least implies the observation period was too short to properly account for tree death.
Lindroth et al (1998) studying a boreal forest, found it was a carbon source, and were very surprised by this. Tan et al (2010) can’t get their methods to converge. Other studies also find a wide range of values.
I don’t think the book has been fully written on this yet.
I agree that the “lungs of the planet” is a misleading metaphor.
I fear that what follows is too long and at the same time too short to do the subject justice. Many people have told me that they found my paper on climate and forests to be helpful (www.defenders.org/climatechange/forests).
It may help to think of forests as leaky buckets (credit for this metaphor to Mark Harmon and Olga Krankina at Oregon State University), with photosynthesis providing carbon inputs, and losses through respiration, decomposition and fire represented by holes in the buckets. The relative balance of input and leaks will change over time and space.
In the absence of disturbance, a given forest stand can achieve the sort of balance you describe, where the photosynthetic inputs are balanced by losses to decay. Here in the Pacific Northwest this stage can take several hundred years to develop, with 500 metric tons or more of biomass (living and dead) per hectare. Forests elsewhere typically reach this stage in less time and with less biomass. Depending on factors such as precipitation and temperature, whether a stand’s net is positive, negative or neutral may vary from one year to the next.
At the landscape (or larger) scale it gets more complicated, but the idea of balance can still apply. Some stands will be young, perhaps regenerating after a fire, and losing more carbon to decay than the young trees are taking up via photosynthesis. Others will be in a maturing stage (perhaps 50 to 250 years here in the PNW) where gains significantly exceed losses. With a given disturbance regime (average frequency and severity of fire, for instance) a dynamic landscape will have characteristic amounts of stands of different ages, and characteristic average (not constant) carbon stores.
Worldwide, humans have reduced carbon stores both by eliminating many forests and causing others to store less carbon as a result of activities such as industrial timber management that increase both the frequency and severity of disturbance (logging, in this case), greatly reducing average carbon stores across forested landscapes. There may still be a dynamic balance, but the average carbon stores are less than they were and less than they could be. (Stores in wood products fall well short of making up the difference.)
In human-disturbed landscapes, reforestation and reductions in the frequency and intensity of disturbance can move things toward higher average carbon stores. While that transition is underway, “balance” would be lost — replaced by a trend of increasing stores — for a while, until a new balance is reached at higher average storage levels. This is what people mean when they say that while the potential for forests to draw down atmospheric CO2 is considerable, it’s limited, or that the forest sink will “saturate.”
Disruption of forest dynamics as a result of climate change will be a substantial source of uncertainty.
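Harmon and Krankina’s leaky-bucket metaphor can be written as a one-line stock-flow model: the store grows until the leaks, which are proportional to the store, balance the photosynthetic input – which is exactly the “saturation” described above. The uptake and loss numbers below are illustrative, loosely chosen to match the Pacific Northwest figures mentioned.

```python
def simulate_stand(uptake, loss_rate, years, carbon=0.0):
    """dC/dt = uptake - loss_rate * C, stepped annually (tC/ha)."""
    history = []
    for _ in range(years):
        carbon += uptake - loss_rate * carbon
        history.append(carbon)
    return history

# Illustrative PNW-like stand: 5 tC/ha/yr gross uptake, 1% annual loss
# to respiration, decomposition and fire (the holes in the bucket).
h = simulate_stand(uptake=5.0, loss_rate=0.01, years=600)
print(round(h[49]), round(h[249]), round(h[599]))
# Stores climb quickly at first, then level off near uptake/loss_rate
# = 500 tC/ha over several hundred years -- the sink saturates as
# inputs and leaks come into balance.
```

Shortening the disturbance interval (a larger effective loss rate) lowers the equilibrium store, which is the quantitative version of the point about industrial timber management.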
Tropical rainforest is very different than temperate forest; I recall original tropical rainforest has thin soil, while temperate forest can keep accumulating deeper soil indefinitely — a shaded canopy limits underbrush growth and fires.
> what level of management would be required to tilt “long-established
> forests” into functioning carbon sinks?
A much lower final level of ‘management’ than currently being done, but a great deal of ‘management’ in removing brush and smaller trees, moving toward restoring the mature forest pattern of widely separated large trees and continuous shade that controls underbrush:
“One approach to restoring forests and reducing fire intensity is to use thinning and gap-creation to increase the proportion of large fire-resistant trees and encourage more shade-intolerant regeneration.”
Hank @ 354
I’m all for thinning and prescribed burning overly dense, dry, fire-prone forests to restore habitat, reduce drought stress and reduce fire risk. However, these activities also reduce forest carbon and the most reasonable general conclusion to draw from the research is that – in dry, fire-prone forests — they should be considered carbon-neutral. See for instance:
Mitchell, S.R., M.E. Harmon, and K.E.B. O’Connell, Forest fuel reduction alters fire severity and long-term carbon storage in three Pacific Northwest ecosystems. Ecological Applications, 2009. 19(3): p. 643-655.
You said, “Calcification is a source of CO2. The equation is 2HCO3 + Ca–> CaCO3 + CO2 + H2O”
To me it looks like 2HCO3 first exists due to combination of water and CO2 and oxygen. 2CO2 + H2O + O making your 2HCO3 would seem like a believable starting point. If this becomes 2HCO3 after combining with Ca, as you say, then it looks like we captured some CO2 and lost some CO2. In effect, this constitutes capture of CO2. And the longevity of limestone seems to suggest a good job of sequestering it gets done.
What would be the process for weathering of carbonates?
Rick, I agree — I’d assume the study area is a logging operation — they have the loggers leave either 50 or 30 percent canopy and then let their biologists compare. It’s not a study on how to maximize recovery of forest, it’s figuring out how many trees they can take out and what the longer term costs are. At least that’s something, to finally be looking at those costs.
Calcite is intricately tied to carbon dioxide in another way. Since many sea organisms such as corals, algae and diatoms make their shells out of calcite, they pull carbon dioxide from the sea water to accomplish this in a near reverse of the reaction above. This is fortuitous for us, as carbon dioxide has been found to be a greenhouse gas and contributes to the so-called “greenhouse effect”. Environmentally then, calcite is very important and may have been quite important to the successful development of our planet in the past. By pulling carbon dioxide out of the sea water, this biological activity allows more of the carbon dioxide in the air to dissolve in the sea water and thus acts as a carbon dioxide filter for the planet. Environmentalists are now actively engaged in determining if this activity can be increased by human intervention to the point of warding off the “greenhouse effect”. A significant amount of calcite precipitation in sea water is undoubtedly inorganic, but the exact amount that this contributes is not well known. Calcite and other carbonate minerals are very important minerals in the ocean ecosystems of the world.
Is this wrong?
[Response: Completely: CO2(aq) + CaCO3 + H2O <--> 2HCO3^- + Ca^2+. Look up ‘carbonate pump’ on wikipedia. -gavin]
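As a sanity check, the equilibrium in gavin’s response balances in both atoms and charge; a few lines of bookkeeping (with just these species hard-coded) confirm it. Read right to left, it is the calcification reaction in the earlier comment: precipitating CaCO3 consumes bicarbonate and releases CO2.

```python
# CO2(aq) + CaCO3 + H2O  <-->  2 HCO3^- + Ca^2+
# Each species: (atom counts, charge)
SPECIES = {
    "CO2":   ({"C": 1, "O": 2}, 0),
    "CaCO3": ({"Ca": 1, "C": 1, "O": 3}, 0),
    "H2O":   ({"H": 2, "O": 1}, 0),
    "HCO3-": ({"H": 1, "C": 1, "O": 3}, -1),
    "Ca2+":  ({"Ca": 1}, +2),
}

def totals(side):
    """Sum atoms and charge over (species, coefficient) pairs."""
    atoms, charge = {}, 0
    for name, coeff in side:
        counts, q = SPECIES[name]
        charge += q * coeff
        for el, n in counts.items():
            atoms[el] = atoms.get(el, 0) + n * coeff
    return atoms, charge

left  = totals([("CO2", 1), ("CaCO3", 1), ("H2O", 1)])
right = totals([("HCO3-", 2), ("Ca2+", 1)])
print(left == right)  # True: atoms and charge balance both ways
```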
You wish me to go off and talk about forests in the Sahara, because they are talking about ‘it’ there? The ‘idea’ so to speak has nothing to do with foresting the Sahara.
It is an old trick to try to reframe that which you don’t like in order to defeat it.
The key to the ‘idea’ is that we have a possibility of rearranging water distribution in North America. China does also have that option. I can not imagine anything in the area of the Sahara having any such possibility.
I realize it is awkward to think about some difficult environmental issues with regard to water distribution. But for those who see that global warming carries with it worse environmental problems, I suggest, maybe it is time for some new thinking.
You seem ok to be talking about management of existing forests. And you offer thoughts about how to do this.
Though I do not see much likelihood that existing forests can be made into a greater sink than they are, it certainly would be an interesting part of a forest project to establish a cadre of trained people who could maintain the existing forests, and learn about forest management in the process.
This skill set would then be applicable to the new forests that we might establish, and with skilled management, this could lead to permanent holding of ‘carbon’, as organic compounds.
from 1993, does mention SW forcing by CO2 – apparently the model average SW forcing is a 6 % difference to forcing at the tropopause level; I only skimmed this so I don’t know if that’s before or after stratospheric adjustment.
Hank’s quote from Peck et al refers to 0.0035 GtC of annual productivity of which about 20% might be sequestrated. That temporary drawdown is a drop in the bucket. Forests could easily beat that. Peck et al speculates that 0.01 GtC/yr might be sequestrated “eventually” after ice retreats further. The effect might be significant over the “thousands to hundreds of thousands of years” that the abstract considers… but not over the timescales we’re concerned about.
Jim Bullis, the insurmountable obstacle to forests as a “solution” (as you say) is that they don’t fix nearly enough carbon. The consecreated word, I belive, is “wedge”.
If you’re going to do science-fiction (or rather engineering-fiction), consider ways in which biomass could be buried that would sequestrate the carbon for a long time. Then consider how much work would be required to bury adequate amounts of biomass in that way (and how much CO2 would be emitted in the process). Work the numbers and I don’t think you’ll find that burning coal for electricity is worth it.
Comment by Anonymous Coward — 29 Oct 2010 @ 5:11 AM
Wisconsin logs that sank in the 1800s are perfectly good today, so burying wood may not be the only option:
I had thought about sinking logs in the oceans but not in the Great Lakes. Perhaps some other lakes would be suitable as well, such as Baikal. There’s enough volume in the Great Lakes to sequestrate more carbon than we need, but it couldn’t happen very fast and it would still require a huge amount of work.
Any forestry experts who could guess how fast the soils of the region might deplete if one were to systematically harvest lumber for burial/sinking? And how much of the region’s wood output would potentially be suitable for efficient transportation and sinking to begin with?
Comment by Anonymous Coward — 29 Oct 2010 @ 9:46 AM
The NAWAPA idea goes back to the 1950s, done in detail, it’s not new.
If you could point to a success in creating new forests on land that hasn’t supported them, you’d be adding something novel to the NAWAPA idea — but you’d have to justify taking the water, which they have other plans for.
It looks like there used to be such an article since I found a link to it inside Wikipedia, but it’s no longer there.
[Response: Hmm… well it should be. Try this textbook instead. – gavin]
Comment by NoPreview NoName — 29 Oct 2010 @ 12:10 PM
263, Anonymous Coward: The consecreated word, I belive, is “wedge”.
You misspelled “consecrated”, but more importantly: “wedge” is a “proposed” word (based on the visual appearance of a graph), and has not been “consecrated” by anyone. The most important concept is a plurality of solutions, each inadequate by itself, undertaken concurrently.
Meanwhile, I remember that I have an obligation to provide references for a few of my assertions. I am working on retrieving them.
Comment by Septic Matthew — 29 Oct 2010 @ 12:34 PM
365 AC, why not use some of the logs for log homes? They have excellent thermal mass though they’re hard to insulate. I’m sure we have plenty of long-lived uses for wood.
Dutch NGO Natuur en Milieu (Nature and Environment) opens CO2 market for everyone: buy emission rights and have them taken out of the market, effectively reducing the European emission cap: http://bit.ly/CO2mkt (Dutch)
“… While it may have been cold in northern Europe and in parts of the United States last winter, on a global scale, the winter was actually the warmest on record…. those cold temperatures and record snowfalls were not all that unusual and are actually consistent with there being a global warming trend.
“That new insight comes courtesy of two recent reports: the Arctic Report Card … and a paper published last week in the journal Geophysical Research Letters by Julien Cattiaux of the French Laboratoire des Sciences du Climat et de l’Environnement and colleagues that focused on the anomalously cold temperatures in northern Europe during the winter of 2010.”
Off topic: The Economist has just come out with a review of Pielke Jr.’s new book on the climate-change issue. There is the (as usual) depressing comment list below the review. Maybe people should drop by …
My favorite denialist came up with a mechanism (or maybe this mechanism circulates in the denial-o-sphere, dunno) that could potentially “disprove gw theory”. I thought it was kind of cool and did a little back-of-the-envelope calculation to see if I could convince myself all was well and that RC should close up shop. The mechanism is a negative feedback between CO2 and water vapor, which I believe is real: adding CO2 to the oceans should reduce the vapor pressure of the water and hence the flux of water vapor into the atmosphere. The idea is that you shoot a bolus of CO2 into the atmosphere and immediately put half of it into the upper ocean, reducing the ocean’s vapor pressure, and that the resulting decrease in radiative forcing beats the increase due to the equal-sized bolus left in the atmosphere, so that CO2 gives cooling rather than warming. I thought this not a stupid idea.
The bad news is that my back-of-the-envelope calculation says that it is pretty unlikely to help us. The fractional decrease in radiative forcing, dRF_vp, due to a bolus of CO2, dC, added to a top layer of ocean containing W (the amount of water in the layer) should be proportional to dC/W, so that dRF_vp = A_vp dC/W. The fractional increase in radiative forcing due to the addition of a CO2 pulse of the same size (since the oceans are supposed to take up roughly half the CO2 each year) should be dRF_C = A_C dC/C, where C is the amount of CO2 in the atmosphere. If my denialist’s mechanism is to succeed in disproving warming, it must be the case that dRF_vp/dRF_C ~ 1, which implies that (C/W) × (A_vp/A_C) ~ 1. I’m not sure what values to take for A_vp and A_C, but I’d guess the ratio shouldn’t be bigger than 10. What about C/W? C is the amount of carbon in the atmosphere; W is the amount of water in the layer in which we’ve assumed the CO2 to be uniformly mixed. I’m thinking somewhere between 1 and 10 m might be reasonable for the first year. Take a meter as a lower bound: then W = 55,000 mols/m^2 (mols per square meter). According to Wikipedia there are 3 × 10^18 grams of CO2 in the atmosphere, which works out to 7 × 10^16 mols, or C = 140 mols/m^2. So C/W is about 0.0025, which from my perspective pretty much ends that mechanism. I guess this is sort of amusing, in that we are frequently told that CO2 is a “trace gas” and that it couldn’t possibly have an effect. Of course, it is precisely because CO2 is a trace gas that we can affect its concentration in the atmosphere. Similarly, it is because it is a trace gas that the warming caused by a bolus of CO2 in the atmosphere beats the cooling caused by an equal-sized bolus dissolved in the top meter of the ocean. Do you folks buy this back-of-the-envelope analysis? I can back it up a bit more than I have here.
Comment by John E. Pearson — 30 Oct 2010 @ 9:34 AM
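John’s envelope can be redone in a few lines, using the same rough inputs he quotes (3 × 10^18 g of atmospheric CO2, Earth’s surface area, a 1 m mixed layer):

```python
CO2_MASS_G  = 3.0e18     # grams of CO2 in the atmosphere (rough figure)
EARTH_AREA  = 5.1e14     # Earth's surface area, m^2
CO2_MOLAR   = 44.0       # g/mol for CO2
WATER_PER_M = 55_000.0   # mols of H2O per m^2 in a 1 m ocean layer

C = CO2_MASS_G / CO2_MOLAR / EARTH_AREA  # mols of CO2 per m^2 of atmosphere
ratio = C / WATER_PER_M                  # the C/W figure in the comment

print(round(C), round(ratio, 4))
# C comes out near 134 mols/m^2 (the comment's ~140), and C/W ~ 0.0024,
# the same ballpark as the ~0.0025 quoted: far too small for the
# vapor-pressure effect to rival the CO2 forcing.
```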
Hat tip to this small-family-forestry group http://www.nnrg.org/
for some references that may be useful to others:
“Mark Harmon … is a professor and chair of the forest science department at Oregon State University. His research has been referenced to both credit and discredit the idea of achieving carbon neutrality by using waste wood of forest management for energy….”
“Summer ice volume may be more sensitive to warming while summer ice extent more sensitive to climate variability. The rate of annual mean ice volume decrease relaxes approaching 2050. This is because, while increasing SAT increases summer ice melt, a thinner ice cover increases winter ice growth. A thinner ice cover also results in a reduced ice export, which helps to further slow ice volume loss. Because of enhanced winter ice growth, arctic winter ice extent remains nearly stable and therefore appears to be a less sensitive climate indicator.”
Comment by David B. Benson — 30 Oct 2010 @ 5:23 PM
383 Dappled Water babbled: “You don’t appear to understand how serious an issue that is.”
You appear to prefer not to have any idea about answers to questions such as the one I raised.
Comment by John E. Pearson — 30 Oct 2010 @ 5:53 PM
For John Pearson — that sounds vaguely like Kasting and Ackerman’s paper (1986), http://www.geosc.psu.edu/~jfk4/PersonalPage/Pdf/Science_86.pdf
The tradeoff between surface pressure and vapor pressure as they described it came from arbitrary assumptions made in modeling, and goes away with other assumptions, as he says in that paper. Got anything more recent?
David at 384: I don’t know which is more straightforward, physical or statistical arguments. I prefer arguments that are rooted in physics. Statistical arguments are fine, but as you know it is imaginable that we’re in a statistically weird warming period and that CO2 has nothing to do with it. That (plus uncertainties associated with water vapor and clouds) is why the IPCC gives only a 90% chance that the current warming trend is anthropogenic rather than something closer to certainty. From a typical physicist’s perspective I would think that the IPCC kind of understated the odds. I do find the fact that there are a hell of a lot more water molecules in one meter of ocean than there are CO2 molecules in the entire atmosphere to be pretty straightforward. I certainly didn’t mean to imply that this accounts for the warming. It does, I believe, shoot down the proposed mechanism.
Comment by John E. Pearson — 30 Oct 2010 @ 7:02 PM
386 Hank asked if I had anything more recent. No. Less recent. It’s been known since the 19th century that adding solute to a solution decreases the vapor pressure. That is clearly a negative feedback of CO2 on radiative forcing. My point was that negative or not it is inconsequential. I claim that the key figure of merit is the ratio of the number of CO2 molecules in the atmosphere to the number of H2O molecules in the upper ocean layers and that this is too small for the proposed mechanism to matter.
[Response: Furthermore, the oceans don’t take up half the emitted CO2, rather only about 1/4 of it. The rest goes terrestrial.–Jim]
Comment by John E. Pearson — 30 Oct 2010 @ 7:40 PM
For John Pearson
Did you read the paper I linked? Do you see where they compare the tradeoff between the two consequences?
Theory from the 19th century — or the 20th before computer modeling–is not the most reliable source for details of radiation physics, as Weart points out; many details that made a difference weren’t understood until the 1960s, simply because nobody could do the math without machine help.
I’m just trying to figure out where your unnamed friend got this idea, and whether there’s anything published that talks about it.
I pointed to the one paper I could find mention of. It may or may not support your idea, I can’t tell.
That link goes to an old PDF (image file) so I can’t paste in the relevant text for you, but look around the paragraph on p1384 where it says “… little physical significance; a different choice of parameterization could either shift the increase to a different CO2 pressure or eliminate it entirely …. CO2 is not that efficient as a greenhouse gas; thus, the increase in surface pressure caused by the addition of CO2 outstrips the increase in H2O vapor pressure caused by higher [temperature]….”
If there’s something to your friend’s idea, probably someone has published about it.
I was just posting the HadCRU values. Does that require a peer-reviewed research article? Many scientists reference these measurements. If you are claiming that this past winter was the warmest on record, should it not appear in Phil Jones’ database? Where are you getting your measurements?
I agree with the pro-physics drift of your comment but am far less sure about its apparent attempt to absorb different types of uncertainty into one concept and one probability of 90%. The cloud difficulty may help to determine the range of climate sensitivity, but that does not mean that it contributes to the 10% probability that recent warming is down to natural variability. And why is water vapour included and aerosols excluded from your list?
Hi John, I’ve been giving some consideration to your favorite denialist’s mechanism. One fault I see is that the water content of the atmosphere is not driven by the equilibrium between the ocean surface and the atmosphere, but rather by the capacity of the atmosphere to hold the water – i.e., relative humidity. The boundary layer is not a closed system, and so not in equilibrium; it is disrupted continually by wind. So even if there were a significant effect on the equilibrium vapor pressure, it still wouldn’t change RH. And indeed, it appears that RH has not changed appreciably. So, while clever, it’s wrong.
described here: http://data.giss.nasa.gov/gistemp/2010summer/
“… it is not possible to say yet whether 2005 or 2010 will be the warmest calendar year in the GISS analysis. It is likely that the 2005 and 2010 calendar year means will turn out to be sufficiently close that it will be difficult to say which year was warmer, and results of our analysis may differ from those of other groups. What is clear, though, is that the warmest 12-month period in the GISS analysis was reached in mid- 2010, as shown in the Rev. Geophys. preprint….”
Hank asked at 389 if I read the paper. I scanned it a bit too quickly. Certainly I didn’t read it. And I missed the paragraph in question. I don’t know where he got that mechanism from. When he told me about it he presented it as if it were his own idea but he spends a lot of time posting on right wing blogs so it wouldn’t surprise me to learn that it is circulating in the denial-o-sphere. I’ll read the article a bit more carefully to see if their arguments jibe with mine.
I got a little bit of pleasure out of considering the areal mass density of atmospheric CO2, which is about 6 kg/m^2. Each year we’re adding almost 60 grams/m^2, of which half stays in the atmosphere. Convert the mass to weight and give it in pounds and ounces, and the area in square yards, ’cause Americans don’t know from metric, and maybe someone who otherwise wouldn’t believe it will appreciate that it can easily have a non-trivial effect. I guess that during the industrial era the CO2 partial pressure has gone from about 9 pounds per square yard to about 12.
Comment by John E. Pearson — 31 Oct 2010 @ 10:10 AM
Oh, for Dan H again: you asked
> this past winter … should it not appear in Phil Jones’ database?
The 90% confidence level is on page 11 of the IPCC summary for policy makers. (AR4). It says: Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations.12 This is an advance since the TAR’s conclusion that “most of the observed warming over the last 50 years is likely to have been due to the increase in greenhouse gas concentrations”.
Very likely means 90% in IPCC-speak. For me that means that it is statistically imaginable that CO2 has nothing to do with the current warming. From the physics perspective I don’t see a way around it, but when the experts tell me there is a 1 in 10 chance that the current warming isn’t CO2 driven and I’m presented with a mechanism such as the one my denialist proposed, I feel obligated to consider it.
[Response: You are over-stating the converse. The 10% uncertainty is mostly related to a) the possibility that the IPCC assessment of internal variability on multi-decadal timescales is 3 or 4 times larger than any model has been able to produce, or any analysis has concluded, b) the possibility that the specifications of natural forcings are completely wrong, or c) the temperature record is completely wrong. Of these, a) is probably the most likely relative to the others, but I would not say it was likely at all. The statement’s converse has nothing to do with the expected effect of CO2 all other things being equal, which had that been asked of the IPCC authors, would have almost certainly been described as an ‘unequivocal’ warming. – gavin]
Comment by John E. Pearson — 31 Oct 2010 @ 10:24 AM
Dan H., Hank Roberts,
The “Green Grok” blog Hank linked to referred to the past winter as the warmest on record, but linked that to another post pointing out that January-April this year were the warmest on record.
If we take “winter” to mean December-February, then according to GISTEMP at least, Dan H. is right — the winter of 2010 was probably not the warmest on record. 2007 was. The winter of 2010 was the second warmest.
(But the difference is only 0.05 ºC, so taking account of errors in comparing quarterly means, it might be fair to call it a tie.)
In any case, Green Grok’s point stands: globally speaking, the cold, snowy winter of 2010 was actually very warm.
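The “fair to call it a tie” point above can be sketched numerically. The anomaly values and the per-season standard error below are illustrative assumptions of mine, not GISTEMP figures; only the 0.05 ºC gap comes from the comment:

```python
import math

# Illustrative winter (DJF) anomalies in deg C, 0.05 C apart as stated
winter_2007, winter_2010 = 0.85, 0.80

# Assumed standard error on each quarterly mean (not a GISTEMP number)
se = 0.05

diff = winter_2007 - winter_2010
se_diff = math.sqrt(2) * se   # combined uncertainty of the difference
print(f"difference {diff:.2f} C vs uncertainty {se_diff:.2f} C")
# diff/se_diff is ~0.7 sigma: nowhere near a significant distinction,
# hence "a tie" is a fair description.
```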
Again, both statistical and physical arguments are important, but it is also important to understand that confidence is not probability. To see this, consider the usual problem of drawing marbles with replacement from a jar. We are told that the marbles are either white or black. We draw 22 marbles and they are all white. This establishes with 90% confidence that at least 90% of the marbles are white (binomial statistics). We are then given the opportunity to place a bet at 10:1 odds that the next draw will be a black marble. Do we take the bet?
The answer of course is no. We’ve established with 90% confidence that at least 90% of the marbles are white, but we have zero evidence that any marbles are black. 90% confidence does not necessarily mean that there is a 10% chance that the proposition is wrong.
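The marble argument is easy to verify. A minimal sketch, using only the numbers in the comment:

```python
# 22 white draws in a row, with replacement.
# If the true white fraction were exactly 0.9, the chance of that run
# is 0.9**22.
p_run = 0.9 ** 22
print(f"P(22 whites | 90% white) = {p_run:.3f}")  # ~0.098

# Since 0.098 < 0.10, any white fraction <= 0.9 is rejected at the 90%
# confidence level. But nothing here estimates the probability that the
# NEXT draw is black: with zero black marbles observed, it may well be 0,
# which is exactly the confidence-vs-probability distinction above.
```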
St. Louis became known as the Gateway to the West because, in the days of covered wagons drawn by horses, the Mississippi River froze over so that a wagon and 4 horses could drive across the river on the ice.
I need documentation of the above. I forgot where I read it. Can anybody help?
Now you can’t drive on the ice at Davenport, Iowa. There usually isn’t any ice here.
JEP 400: Very likely means 90% in IPCC-speak. For me that means that it is statistically imaginable that CO2 has nothing to do with the current warming. From the physics perspective I don’t see a way around it but when the experts tell me there is a 1 in 10 chance that the current warming isn’t CO2 driven and I’m presented with a mechanism such as the one my denialist proposed I feel obligated to consider it.
I can’t believe nobody has mentioned the footnote. First of all, it’s “at least 90%”, so the remaining unexplained factor is of indeterminate size. Then the footnote says “Consideration of remaining uncertainty is based on current methodologies.”
I expect AR5 will move it up to 95%, since they’re a conservative bunch, and assessing the magnitude of natural variation is a tricky business, as the tree ring folks will tell you. Perhaps AR6 will dare to go to 99%. We will really be feeling the heat by then.
Your link stated that winter 2010 was the warmest on record based on an editorial by Bill Chameides. However, his evidence was a link which stated that April 2010 was the warmest on record, NOT the winter. Typically winter refers to the three months from Dec. – Feb. NOAA reported the winter of 2009-2010 as the 5th warmest on record. They reported the oceans as being the 2nd warmest, but the land was only the 13th warmest. Much of the difference between the reporting agencies lies in their handling of non-measured global areas.
402, Ray Ladbury: Again, both statistical and physical arguments are important, but it is also important to understand that confidence is not probability.
Nevertheless, probability theory is used by Bayesians to model strength of belief. The belief and aleatory aspects of probability have always developed together: Ian Hacking, The Emergence of Probability.
“An ice dam formed just north of St Louis last winter. Is that proof that global warming isn’t occurring?” – JiminMpls
No. Assuming you’re serious (which would be hard to credit if I had not come across this sort of idiocy from denialists more times than I can count), have you really not grasped the difference between weather and climate?
Quite right Nick,
The ice buildup during one winter is proof of nothing. Had he presented long-term data on ice extent along the river or its southern expanse, then maybe a parallel could be drawn. That is, assuming there is no man-made intervention with the ice buildup.
It still amazes me how many people claim that global warming is(not) happening due to a heat wave or cold spell.
Nick Gotts: The Mississippi froze over at New Orleans in 1784, but it had help from a volcano. Source: Heidi Cullen’s book on future weather.
The point is an unscientific “proof” that GW has happened. If you set up a good google advanced search, you can find more and more ice as you go back in history. http://lds.org/gospellibrary/pioneer/02_Nauvoo.html shows a tiny painting of horse-drawn wagons crossing the river at Nauvoo, Illinois in 1838.
I am talking about 1838, not 2003. Recently, meaning since 1990, there has not been any safe-to-walk-on ice at Davenport, Iowa. There was good ice in late 1968 at Davenport.
Again, regular crossing with horses and wagons at St. Louis is something that could only happen from 1800 to about the civil war or something like that. I have not yet found a history that goes year-by-year with a history of ice extent and dates, starting in 1780. THAT is what I would like to find.
“On the 13th of January, 1862, the Seventh was ordered to embark on a transport for the south. It was marched to St. Louis, and embarked on the noble steamer “Continental.” The weather was intensely cold, which detained the boat till about 9 o’clock at night, when she got under way, the river being full of floating ice, and proceeded down the river about twenty miles, where she was frozen up in the middle of the stream. We remained on board two days, when the ice became solid, and the regiment with its baggage was removed to the shore, took railroad, and returned to St. Louis.” http://iagenweb.org/civilwar/regiment/infantry/07th/7th-inf-hist.htm
You can’t take isolated weather events and make sweeping generalizations about climate change – or the lack thereof. The Mississippi has been so dramatically re-engineered over the past 100 years that even if you had detailed year-by-year records of freezes, you couldn’t draw clear conclusions from that.
For example, the river *never* freezes anymore just north of Red Wing, MN, but it has nothing to do with a warming climate. It’s the Prairie Island nuclear power plant warming the water. Just south of Red Wing, however, Lake Pepin freezes over nearly every year just as it always has. Just south of Lake Pepin, the dam and locks at Wabasha create another break in the ice – and it is probably the premier spot in the world for viewing bald eagles in the winter. I’ve been there when it was -12F and there were at least 50 bald eagles soaring and diving for fish.
Melbourne had all-time record cold temps in July. Does that mean Australia has been getting cooler?
Dan, pointer please to the source you’re relying on for your statements.
I can guess. But you are capable of just copying the link off the navigation bar of your browser and pasting it in. Where are you getting your data?
What measure of significance is used for your statement about significance above? Just tell us where you’re getting your info, that’s what I’ve been asking.
407 Didactylos said: I can’t believe nobody has mentioned the footnote. First of all, it’s “at least 90%”, so the remaining unexplained factor is of indeterminate size. Then the footnote says “Consideration of remaining uncertainty is based on current methodologies.”
I don’t know what they might have used other than current methodologies. The only other choices I can think of are old/obsolete or nonexistent methodologies. I copied the line in which they said that the probability (that the current warming was anthropogenic in nature) had increased from likely in TAR to very likely in AR4WG1. I don’t see how any rational non-expert can argue certainty when the IPCC says 90%. As you pointed out, the IPCC has two higher categories that they could’ve used but didn’t: extremely likely (>95%) and virtually certain (>99%). Personally I found chapter 9 of AR4WG1 a little bizarre. They wrote: “Greenhouse gas forcing has very likely caused most of the observed global warming over the last 50 years. This conclusion takes into account observational and forcing uncertainty, and the possibility that the response to solar forcing could be underestimated by climate models. It is also robust to the use of different climate models, different methods for estimating the responses to external forcing and variations in the analysis …” But there is no discussion as to why they didn’t say “extremely likely”.
Comment by John E. Pearson — 2 Nov 2010 @ 11:09 AM
420 has the link to the raw HadCRU data, and 375 has the link to the graph of the same data. I have provided the links, what more do you want? I am a chemist, I know what significance means, and 95% confidence.
You say that there is no way that that 60,000 square miles of standing forests could hold enough ‘carbon’ to matter, but you can comprehend that sinking enough logs in oceans or lakes somewhere would be ‘enough’.
Actually, in my approach you could do anything at all that would preserve logs harvested as a part of managing mature forests, including making lumber for permanent construction, or burying the wood in oxygen-depleted situations where bacteria do not live. Being of an economic frame of mind, I see no real advantage in burying good wood.
Branches, chips, etc. could be processed into biochar, and that charcoal buried as stable carbon (yes, really carbon). (I can’t find the persuasive mention of biochar here in past comments, but it seemed reasonable.)
You keep asking about quantitative info. Is it the 60,000 square miles you missed in my first rough estimate of a sufficiently sized forest project? I believe I said a 3000 mile aqueduct, with 10 mile branches on each side, would be enough to match the ongoing CO2 release from coal fired power plants in the USA. That would be after some years of lag while a high rate of growth became established. I compare that lag to the lag for the EPA to force coal fired power plants to upgrade to the EPA’s idea of CCS.
(Repeating a discussion of EPA planning: the announced plan by the EPA is to require ‘best available technology’, and their recent report (Sept 2010) said ‘carbon’ capture would cost up to $95 per ton of CO2. Working this out in terms of the burden on the use of a ton of coal shows that the burden for a ton of Powder River Basin coal (half the element carbon by weight) will be about $180 per ton of that coal, and higher-carbon coal would incur a proportionately higher burden, up to around $320 per ton.)
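The quoted burden can be reproduced from the $95/ton-CO2 capture cost and the CO2-to-carbon mass ratio of 44/12; a sketch, with the carbon fractions as stated in the comment:

```python
# Rough check of the quoted CCS burden per ton of coal.
# Assumption: capture cost of $95 per ton of CO2 (the EPA figure cited above).
def burden_per_ton_coal(carbon_fraction, cost_per_ton_co2=95.0):
    """Dollar burden per ton of coal at a given carbon mass fraction."""
    co2_per_ton = carbon_fraction * (44.01 / 12.01)  # tons CO2 per ton coal
    return co2_per_ton * cost_per_ton_co2

# Powder River Basin coal, half carbon by weight: ~$174/ton,
# consistent with the "about $180" above
print(burden_per_ton_coal(0.5))

# High-carbon coal (~90% carbon assumed): ~$313/ton,
# consistent with "up to around $320"
print(burden_per_ton_coal(0.9))
```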
Hank Roberts, you say that NAWAPA was done in detail, so it is an old idea, suggesting of course that there is no reason to think on it again in present circumstances 50 years later, but if I could show that new forests could be established with new water that would add something new.
So were I to step up and point out that orchards, having characteristics similar to trees since they are in fact trees, stretch as far as the eye can see in the California Central Valley lining the California Aqueduct, you could associate my discussion with NAWAPA, whatever the heck that was?
Then Whammo, along comes Ray Ladbury who says NAWAPA involved nukes? That is a slick way to argue against anything you don’t like.
Why do I feel like Charlie Brown?
Actually, I discuss using a large labor force to create the needed water system, and this force would be drawn from our underemployed work force, thus relieving another problem situation. No nukes needed!
Not being inclined to see conspiracies around every corner, I am sure this had no coordinated intent.
As the election results roll in, it is abundantly clear that the United States will do absolutely nothing to curb greenhouse gas emissions. The rest of the world must take action: place a 100% tariff on all US goods and services until the USA reduces carbon emissions to 20% of 1990 levels.
Hello all, I hope I am not hijacking the thread (I wish RC had a general question thread), but my good buddy Wally over at Climate Skeptic has done the following critique of Gavin’s paper “Spurious correlations between recent warming and indices of local economic activity,” which is itself a response to McKitrick, R.R. and P.J. Michaels (2007), Quantifying the influence of anthropogenic surface processes and inhomogeneities on gridded global climate data.
Does anyone know these papers, and could anyone shed some insight on Wally’s somewhat one-sided critique?
First, he talks about the de Laat and Maurellis 2006 paper (referred to as dLM06). His basic criticism seems to stem from a belief that this result doesn’t break some sort of confidence threshold and is subject to several biases. The main bias he talks about is starting point. Look at Figure 2. He started the same analysis at 6 different time points, every 10 years from 1940 to 1980. What he found was strong negative temp correlations to CO2 output in one case, small negative in one, small positive in another, somewhat medium positive in one, and strong positive in two.
This was an odd way to approach this problem. It would seem far better to simply take 1940 to 2003 as one large data set instead of breaking it up into 6 overlapping 23-year windows. If Gavin’s real motivation was to prove what is going on, I suspect this is what he would have done. Instead, it appears his motivation is to simply discredit the other work and say “see, I can do this 5 other times and get 4 different results.” That’s nice Gavin, but what I really want is the truth. But it does seem the overall trend between the 6 runs is positive (two strong +, one mild +, one weak +, one weak -, one strong - would seem to add up to one strong + plus one mild +). Given the overlapping nature of the 6 runs, however, it’s pretty much impossible to figure out just how much. As comes up later with the other paper, these tests were non-independent, Gavin….
Also, he makes no attempt to actually test how well this method rises above whatever significance threshold he cares to use. In figure one, he shows how noisy the data is. Great, it’s noisy, we know that. It’s climate data. But that doesn’t mean you can’t find a significant change in the mean. So yeah, the noise (here shown as 1 SD in gray) goes from 0 to .6 C/decade in figure one, but the mean is still up at ~.4 C/decade. Well Gavin, what’s the confidence in that mean being significantly different from zero, or from the raw, un-UHI-adjusted temp record?
This was a half-cooked criticism that with a little more time might have actually proven something useful. It does make me wonder, given Gavin’s obvious biases from a strong personal investment in one particular side of the debate on a private blog, whether the more in-depth analysis showed something he didn’t like and he stopped halfway.
Now our MM07 paper criticisms by Gavin:
1) The lower troposphere will naturally have fewer outliers than the surface because it is better mixed, thus you might get random correlation with human activity and outliers.
No duh Gavin, this is stat 101 and why you create a large sample size and control group. MM07 did that. You don’t have anything here.
2) Adjacent areas are not fully independent with regard to temp.
Again no duh, but that’s the point. The adjacent areas are that “control group” that are supposed to make for decent proxies for what unbiased temps might be. There is a little to be said that maybe you still see an effect in adjacent boxes from human activity in the other box, but if you go farther out, you lessen the ability to find what the temp “should be” in the UHI box in question. Really don’t see anything of consequence here either.
And I like his little quote: “Some indication of this is given by the fact that the largest ‘contamination’ deduced from their methodology are in very remote polar regions such as Svalbard or the South Orkneys, hardly sites of significant industry”
That’s rather a throwaway line. Maybe human activity has a larger effect in polar regions for any number of reasons, like say replacing highly reflective white snow/ice with asphalt? Point not made, Gavin.
3) He reran the analysis with slightly different methods with the updated CRUv3 (MM07 used CRUv2). He states:
“More interestingly, the significance of correlations to population, income and GDP growth disappear, pointing clearly to the fragility of these relationships.”
I would argue population as well as income growth and GDP growth are poor markers anyway. Population is a poor proxy for the type of land use, economic output and general infrastructure of a city. However, population did not “go away”; it was reduced by about 3/5ths (from .38 to .15). GDP growth, being a derivative, is going to swing greatly from positive to negative based on any number of factors. So UHI effect might not actually change in, say, New York, but the GDP estimate might go down, or stay flat, or go up. Seems silly to use this. Similarly with income growth: I went from making a dollar a day to 3 dollars per day, so I tripled my income! But here in the US, you went from $45K/year to $50K/year and you only went up 11%. Similarly silly. However, here again the correlation did not “go away”; it went from .409 to .266.
Gavin is clearly overstating his case. "Significance" never goes away, since "significance" is a completely arbitrary term. One might think a correlation below .5 isn't significant, and another might think a correlation below .25 isn't significant, and neither person is right or wrong. It's better to just tell us what the values are, and we can judge for ourselves. So the correlation dropped here, but a correlation of ~.25 is nothing to thumb your nose at. Maybe he's got a case with .15. But you have to understand these are imperfect predictors of actual land usage and types of economic activity, and all have some sort of faults. Thus getting correlations from them of this scale might actually be considered pretty strong.
"The economic indices ‘g’ (Total GDP), ‘e’ (education) and ‘c’ (coal use) do, however, remain nominally significant (under the MM07 assumptions)."
Right, some of the better markers remain quite strong.
4) He talks about the degrees of freedom in the multiple regressions that MM07 ran. I'd agree. GDP is not going to be independent from education or coal usage, for example. But it doesn't really matter. We got a very strong correlation coefficient from probably the best marker we can use, GDP, at .44-.55 depending on the data/method used.
5) Gavin apparently then wishes to run this analysis against his GISS model? Not sure what kind of BS he thinks will come from that, but it's plainly stupid. His model is not real data.
So Gavin brought up the occasional useful point, but I do not believe anything he said materially affects the general result of either paper. It might be possible the UHI was slightly overestimated (say small factors from non-independent variables, or using a larger data set in dLM06), but Gavin hardly makes a convincing case of that, and I see absolutely ZERO data, analysis and results that actually support his statement in the abstract that “there is no compelling evidence from these correlations of any large-scale contamination.” MM07 and dLM06 both make cases that there is a significant effect on the temp records from the UHI effect, and nothing Gavin said leads me to believe they made any mistakes or oversights that would change that general result.
[Response: I’m on travel this week and so I can’t provide any kind of in-depth response, but he has clearly not understood what the point of the paper was. The fact of the matter is that for both dLM06 and MM07, correlations that are just as significant appear when you do the same analyses with model output that has no ‘contamination’ – and this goes for the various attempts that McKitrick has made to rescue his original result. The correlations are spurious and do not stand up to scrutiny (remember too that dLM06 also found an effect in the satellite data). However this topic is OT on this thread. No more please. – gavin]
*sigh* No discussion of albedo? Dr. Joel Norris of the Scripps Institute of Oceanography gave an excellent colloquium presentation on this topic to scientists at Fermilab National Laboratory, please see the video here:
BTW, Dr. John David Archer just gave his “Long Thaw” talk to Fermilab, and I met him & obtained his signature in my copy of the book! Not bad, eh?
In his book, he seems to agree with me on the importance of oceanic acidification as an immediate concern, as does Dr. Norris & the folks at Scripps. I wish this would be discussed more on RC, we are seeing some severe impacts in the oceans, particularly in the high latitudes.
Claiming that a 60,000 square mile forest would make a difference is ludicrous. It’s the kind of thing one would normally not bother to respond to. But what I’d like to see, since you keep posting about this in one thread after another and since the moderators keep humoring you, is how you figure it would offset emissions from coal plants, quantitatively. How did you come up with a C uptake of about 40 t/ha/yr? And how long is that supposed to last before you need a newer, larger forest to offset growing emissions?
The reason burying logs (or biochar but that would involve more work) has more potential is that burying (or sinking) adds up over the long run if it’s done over and over again with biomass from the same forest while letting the forest stand only captures CO2 once (and only as long as the forest survives). Of course letting the forest stand is going to remove more atmospheric CO2 in the short run but forests are not a short run thing anyway. The only thing which can possibly work to keep atmospheric carbon in check in the short run is unrealistic emission cuts. By long run I mean century-scale as opposed to decade-scale.
Comment by Anonymous Coward — 3 Nov 2010 @ 9:13 AM
The North American Water And Power Alliance was a program to bring water to the Great Plains. One proposed project was to use nukes to block and then reverse the flow of the Yukon. It was part of the program to find peaceful uses for nukes.
The idea of ocean acidification due to increased atmospheric levels of CO2 has been greatly overblown. The theoretical change in pH from 350 ppm CO2 to 700 ppm would be ~0.15 pH units. However, that would be in pure water. The oceans are anything but pure. The excess calcium in the seas would bind with some of the CO2 to form limestone (CaCO3). This also ignores the biological action of the seas, which tends to pull H+ out of the water. The buffering capabilities of the world’s oceans have been largely underestimated.
[Response: So you’re an ocean chemist then? References for those numbers? First of all, if you take 280 ppm as the pre-ind. baseline (as you should), there’s already been a pH decrease of ~ 0.1 unit, which, since the scale is logarithmic is roughly about a ~30% increase in [H+], and which would further increase to something like 100 to 150% at CO2 doubling of 560 ppm. At your 700 ppm this would be an ~ 200% increase, which is to say, ~ 300% of the pre-industrial [H+]. Sorry to say, you haven’t just nullified the findings of hundreds of ocean scientists over the last century plus with your claims–Jim]
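The [H+] arithmetic in the response follows directly from pH being -log10[H+]. A sketch; the pH decreases plugged in below (~0.1 so far, ~0.3 at doubling, ~0.45 at 700 ppm) are rough values assumed to reproduce the quoted percentages, not published figures:

```python
# pH is -log10[H+], so a pH drop of d multiplies [H+] by 10**d.
def hplus_increase(ph_drop):
    """Percent increase in [H+] for a given decrease in pH."""
    return (10 ** ph_drop - 1) * 100

print(f"pH -0.1 : +{hplus_increase(0.1):.0f}% [H+]")   # ~26%, i.e. roughly 30%
print(f"pH -0.3 : +{hplus_increase(0.3):.0f}% [H+]")   # ~100% (a doubling)
print(f"pH -0.45: +{hplus_increase(0.45):.0f}% [H+]")  # ~180%, approaching 200%
```

The point of the logarithmic scale is that a seemingly small pH change corresponds to a large change in hydrogen ion concentration.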
@Comment by Dan H. — 3 November 2010 @ 12:24 PM, and others on RC:
I’m a biochemist at University of Illinois, performing research on algae biofuel. I’ve studied atmospheric carbon deposition since 1979, focussing on biomethane capture and destruction from manure systems.
This is the problem – the dissolution of carbon dioxide occurs at the interface of air and ocean water, which is also where the great majority of oceanic photosynthesis occurs (uppermost 10 feet or so). So, the production of carbonic acid throughout the water column is irrelevant, since the impact is most felt in the uppermost layer.
Calcifying algae such as coccolithophores are severely impacted at this acidified interface, and these guys are the foundation of the food chain as well as one of the primary methods by which carbon dioxide is converted to carbonates (in the form of exterior carbonate scales). Algae are either consumed or die, sinking to the deep ocean floor & carrying the carbonate with them. Dr. David (!) Archer seems to be in agreement with this from what I’ve read.
At a present CO2 concentration of about 390 ppm, these effects are being observed (refer to the EPA database). The mistake everyone keeps making is to look at atmospheric CO2 dissolved into the ENTIRE volume of the ocean!! Just worry about the interface. There’s a bit of recent news about this, please see: http://www.abc.net.au/news/stories/2010/10/13/3037651.htm
The focus upon temperature alone is short-sighted. Those of us who want controls on atmospheric carbon deposition must advocate broadly on a variety of environmental changes. The public is getting testy about long-range forecasts, but we are facing collapse of oceanic food chains in the not so distant future. If that happens, we better get used to eating jellyfish!! (a line from one of my colleagues)
The whole problem of ocean acidification affecting the base of the food chain is something I find really disturbing, to put it mildly. Here are a few other articles that have high rates of being cited. Some I’ve read, some I have not yet, so I can’t vouch for them, but they’re good starting places for someone who is interested in the subject.
You seem to find it objectionable that I keep discussing the standing forest as a solution, but you continue to illustrate that we are not adequately communicating. You also help to bring out details, and though it sounds from your tone that you find that distasteful, thanks anyway. I might add that I have not intended to blurt out a full project plan, but am really putting details on as I go. I started with a general concept that looked interesting.
I would include selective harvesting from a mature forest as a long term part of my suggested plan. I would seek to make this an economically viable harvest, where wood was put to good use as much as possible. Biochar would be a cleanup level activity if it was found to be meaningful.
You would include large scale harvest and burying of logs. And you seem to think this would not be so labor intensive. I think you are wrong about that. Yes, getting the logs to some body of water where bacterial action is not going to happen would also serve to capture CO2, and the long term process could go on as long as there was sufficient water storage. I can see that your approach would require less acreage, but I reiterate, it has not the fundamental economic feature of being self supporting. Buried logs are not a product.
And yes, there would be a need for continuous expansion of the mature forest since the greater capture rate is as you say, during earlier growth phases. I am not thinking much beyond the 50 to 100 year time frame, during which it would be appropriate to develop other means of power generation than coal.
I particularly look to natural gas, though not in the central power plant system that we now have, rather in a cogeneration arrangement, mostly using micro sized systems at households. Micro systems in households would enable use of heat for a variety of purposes, including obvious space heating and so forth, but also for refrigeration and air conditioning using vapor diffusion methods. And of course, nuclear done right, is still something that should not be ignored. I also have a bizarre hope of utilizing engines of small hybrid vehicles in the cogeneration function. That seems not to be something that will happen immediately, but it could as time goes on.
Even on the 50 to 100 year time frame it becomes reasonable to expect much better insulation in buildings, since that is something like the useful time frame of buildings, and it is much more cost effective to build with good insulation than it is to remodel old construction.
So feel free to stomp around with indignation, and maybe we will get somewhere.
It sounds like Canadians understand these things quite well, and this shows that getting involvement of Canadians would require offering them a serious degree of participation on the productivity side of things.
Extracting from a Wall Street Journal article of Nov 2, 2010:
“- – – British Columbia has spent the better part of a decade lobbying Chinese officials to adopt a “Wood First” program to mitigate the effects of global warming. Touting timber as an alternative to steel, glass, and especially concrete, Canadians have sent teams of engineers and lumber experts to persuade officials in cities like Shanghai and Chengdu to adapt municipal building codes to allow for more wood construction in multistory structures.
In some cities, Canadians are building demonstration models of six-story apartment houses and other public buildings to show what wood can do. —“
“The California Air Resources Board developed … 70 measures, such as a low-carbon fuel standard, an increase in electricity generated from renewable resources to 33 percent, and a cap-and-trade program for 360 utilities, refineries and other emitting industries. The state released its more than 3,000-page rules for that greenhouse gas permit trading program on October 29….
… The majority of the efforts are … standards … such as a new limit on energy usage for televisions with more than 42-inch screens….
… The low-carbon fuel standard … final rule is expected in early 2011 … the board is struggling to understand the greenhouse gas emissions … directly and indirectly impacted by growing crops for biofuels. ‘It really is the single issue holding back the low-carbon fuel standards’ ….”
“The issues surrounding tree planting to offset deforestation are discussed ….
… reforestation of cleared areas is not always successful, because of drought and invasion by competing species (native brush, plants used for erosion control, invasive exotics, etc.). In some areas, natural succession may require years or decades to reestablish tree cover, and climate change may prevent such “normal” recovery.
… Naturally regrown (second-growth) forests in the tropics have been shown to contain less biological diversity and less total biomass (carbon) than intact forests, and forest plantations in the tropics have far less diversity and biomass than second-growth forests. Thus, under many circumstances, deforestation in tropical forests emits substantial quantities of carbon that cannot be adequately compensated by reforestation except in the very long term (several decades to centuries).”
“It sounds like Canadians understand these things quite well”
What the BC government and the BC forestry industry understand well is marketing ploys. They desperately need to find a market for their clearcut lumber now that the US housing industry has collapsed and US clearcutters have gotten protectionist. BC has been trying to get a share of the Chinese market since well before climate change became a trade tool, using any hype they can think of, but the purpose is definitely not to “mitigate the effects of global warming”; it is to maintain transnational corporate profits. Most of the major BC forestry companies are American owned, and none of them have an interest in climate change beyond how it might be used to enhance their bottom line.
In your 50-100 year timeframe, your 60,000-square-mile super-forest would have to fix 2-4 kt C/ha to offset the emissions from coal (assuming you would grow the forest beyond its original size to keep up with emissions growth). If I wasn’t stomping in indignation, I would be rolling on the floor!
As to making use of the wood, as two of you have suggested: again, the numbers don’t work. We wouldn’t know what to do with so much wood. Have you figured out how many six-story wooden buildings would need to be built in China every year to sequester the emissions from their coal plants?
Comment by Anonymous Coward — 3 Nov 2010 @ 6:28 PM
It seems like the Canadians are demonstrating how government and corporations can work in the common interest.
I did not mean to suggest that corporations exist for any purpose other than economic gain. Non-profits might be different, but even there it often turns out that the organization exists for the purpose of remunerating those that work there. I expect no morality from corporations and when there is a pretense of it, I take a strongly skeptical attitude.
However, where a corporate activity is in concert with public interests, it is appropriate that they brag about it and even seek ways to work for whatever common cause they can find.
Also, since I expect no morality, I realize the limits of what we should expect from corporations, and I know that government has to step up to fulfill its purpose as the creator and, especially, as regulator of the corporate entity.
However, the corporation serves a vital purpose in organizing projects that we all benefit from, both from the corporate operations such as providing electric power, and from providing an investment mechanism on which much of our future financial security is based.
This all leads to the need to keep a massive forest project in the public domain, literally, such that it serves the common good. This does not exclude contractual arrangements of whatever sort are appropriate.
You need to be careful not to stomp on your own toes. The magnitude you scoff at here is the same magnitude of trees that you suggested could be stored in a lake at oxygen-free depth. Note that national land holdings vastly exceed available lake storage capacity.
But since we are getting to numbers more precise than any I have offered yet, we should first clarify what 2-4 kt C means to you. Are we using EPA nomenclature, where C refers to CO2? If so, ignoring water content (that is, estimating wood mass as if it were dry), we get 1-2 kt per ha that needs to be captured over 50 years and held permanently. Using round numbers of 100 large trees per ha, that means 10-20 tons per tree. A fully loaded logging truck might carry 20 tons (the max for a CA 18-wheeler is 80,000 lb, with 25,000 lb for the tractor and 15,000 lb for the trailer, leaving 20 tons for payload), and one large tree can be something like a load. So maybe we are in the ballpark? This all depends on tree type and so forth.
I would say we are close enough to have something to work with here.
And once again, I count success in varying ways, not necessarily including 100% capture of all CO2 from all existing and future coal power plants. The EPA is not thinking of any such wholesale implementation of CCS either. And I repeat, this is not a solution to forever burning of coal in central power plants as we do it today.
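The per-tree figures traded back and forth above can be cross-checked with a few lines of arithmetic. This is only a sketch under the assumptions stated in the comments: that “C” in the 2-4 kt figure means CO2, that dry wood is roughly half elemental carbon by mass, and that there are 100 large trees per hectare.

```python
# Cross-check of the per-hectare and per-tree wood mass implied by
# offsetting 2,000-4,000 t of CO2 per hectare (assumptions as above).

C_PER_CO2 = 12.0 / 44.0   # mass fraction of carbon in CO2

for co2_t in (2000, 4000):
    carbon_t = co2_t * C_PER_CO2   # t of elemental carbon per ha
    wood_t = carbon_t / 0.5        # t of dry wood per ha (wood ~50% C)
    per_tree = wood_t / 100        # t of dry wood per tree
    print(f"{co2_t} t CO2/ha -> {wood_t:.0f} t dry wood/ha, "
          f"{per_tree:.1f} t per tree")
```

This reproduces the 1-2 kt of dry wood per hectare and roughly 11-22 tons per tree discussed above, so the truck-load comparison is at least in the right ballpark.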
“It seems like the Canadians are demonstrating how government and corporations can work in the common interest.” The common interest? Economics.
“This all leads to the need to keep a massive forest project in the public domain, literally, such that it serves the common good.” See the lead picture of clip 2 for a good example of a public forest in BC: definitely no morality involved, corporate or regulatory.
I have been trying to steer clear of the horrors of deforestation all over the world, and instead, simply construct a completely new system. This does not mean that there is not a lot to learn from those working to avoid deforestation.
I guess it can be said that the tree project I bring up could be measured against the need to balance deforestation, but that is a battle I can not win. Thus, I leave it to others, and simply relate to ‘carbon’ capture being planned by the EPA. Maybe someone can think bigger than me on this, but for now, it seems a big enough big-think for me.
And let me repeat, this is only a matter of me bringing up some old stuff and relating it to a different problem, in a slightly different way. And I see no way to make any money on it.
Thanks for that link. Canadian corporations and USA corporations seem equally adept at misusing resources.
I think the mistake is in believing anything said by a corporation. Anywhere, of any kind, always.
And still we need this form of business entity.
Government has a job and the only way it will do it is if people really understand what is going on and demand that governments step up to their job. If that does not happen, capitalism will certainly fail.
I try to operate on the assumption that some sanity will prevail. So far, there is not a lot to support this. I have long believed that the reason capitalism has looked so good is that there have been vast resources to plunder.
“I have long believed that the reason capitalism has looked so good is that there have been vast resources to plunder.”
And you think the way to “fix” this is to further plunder ever-diminishing resources in order to maintain that capitalist economy for an increasingly unsustainable population in the face of the disaster it has wrought? Incomprehensible.
Well, I guess you see using water otherwise draining into the ocean as plunder. Or is it plunder to plant trees on land that otherwise is not doing much? These uses of resources do not seem like plundering to me.
I argue for controlling corporate actions in a way that keeps their actions within limits such that they serve the common purpose of us all. But I do put our common purpose in context of our present developed world standard of living. Of course, we could give up on that, but I would not want to try to sell 18th century lifestyles.
I see plundering as the waste that comes about from our system of central power plants, where much more energy in heat is wasted than is converted to electricity. I look to fixing that over time, but do not see that it can be done except on a gradual, long term plan.
“the United States will do absolutely nothing to curb greenhouse gas emissions.”
Obama’s EPA is shutting down Mountaintop Removal coal mining for “environmental” reasons [water pollution].
Obama’s Department of Labor is shutting down an underground mine because of safety violations and deaths of miners.
Nothing was done for the previous 30 or more years on either of these situations.
Obama’s EPA is considering changing the rules so that coal ash will be treated as hazardous waste. It is more correct to say that coal ash is low level radioactive waste, but it must happen one step at a time. Disruptions must be limited to a manageable level. Treating coal ash as hazardous waste puts coal ash within the jurisdiction of the EPA without having to change any laws.
The engine making the CO2 can be shut off either at the input end or at the exhaust end. Coal ash is the exhaust end. If coal ash cannot be disposed of cheaply or at a profit, burning coal becomes impossible. Electric utilities will be liable for cleaning up the hazardous coal ash that they have already created. WHEN coal ash is admitted to be radioactive waste, burning coal will be impossible and a great controversy will ensue. http://www.ornl.gov/ORNLReview/rev26-34/text/coalmain.html
Conclusion: The Obama Administration is taking action at the maximum bureaucratic speed to curb CO2 emissions in the absence of congressional action.
All 3 of the above actions raise the cost of coal. These 3 actions are the “stealth” strategy for limiting CO2 emissions. They make the alternatives more attractive. Coal is the source of energy that makes the most CO2. In all 3 cases, old laws are enforced for the first time. New laws are not required. All that is required is a president who is determined to do something about CO2.
We could even say that the Cap & Trade bill was the camouflage or sacrificial distraction or feint attack.
Obama is very cleverly doing quite a bit to curb greenhouse gas emissions. WHEN coal ash is admitted to be radioactive waste, our CO2 emissions will drop 40%. There will be a great emotional reaction nationwide and worldwide. Coal will become “radioactive” in the emotional sense. This will be a psychological change equal to the change caused by 9/11.
Never again say “the United States will do absolutely nothing to curb greenhouse gas emissions.”
“I try to operate on the assumption that some sanity will prevail. So far, there is not a lot to support this. I have long believed that the reason capitalism has looked so good is that there have been vast resources to plunder.”
“Well, I guess you see using water otherwise draining into the ocean as plunder. Or is it plunder to plant trees on land that otherwise is not doing much? These uses of resources do not seem like plundering to me.”
What makes you the judge of whether any given “vast resource” is being or has been “plundered”, or should be “used” for your geoengineering schemes to “serve the common purpose”? The way you define plunder obviously doesn’t include any understanding of reality.
Why don’t you determine for yourself how much carbon your super-forest would have to fix to offset emissions from coal plants? You could assume that I mean C when I write C, but it only takes a couple of minutes to run the numbers for yourself thanks to the web. Maybe you’ll catch a mistake I made. Then take a look at the material Hank referenced to see whether it would be realistic to expect your super-forest to achieve that goal. There’s no need to estimate how forests work by way of trucks!
If you’re not aiming at offsetting 100% of emissions from coal plants, then what percentage are you talking about? You claimed this was an alternative to raising the price of electricity, remember? The main point of raising the price of electricity is not to finance CCS but to move away as quickly as practical from the wasteful use of coal for baseload electricity generation.
I talked about the potential of biomass burial/sinking and of biomass/biogas CCS to reduce the amount of CO2 in the atmosphere elsewhere on RC. But their potential is limited. I am certainly not suggesting one could offset the current level of emission that way. Emissions would have to be reduced by at least 80% from current levels to begin with (ideally they would be reduced to a much smaller fraction of course). This would of necessity involve raising the relative price of electricity in many US regions (as well as in some other countries).
Could you disclose your numbers for “national land holdings vastly exceed available lake storage capacity” by the way? I’m not sure what you mean but the numbers should make it clear.
Comment by Anonymous Coward — 4 Nov 2010 @ 2:20 PM
Humans are interfering with nature’s way of keeping forests healthy – forest fires. They kill off far more beetles and diseased trees than healthy trees.
I just noticed your comment elsewhere (Joe Romm site),
#25. David B. Benson says:
April 26, 2009 at 5:25 pm
Jim Bullis — Use 100,000 km^2 = 100,000,000 ha to grow Miscanthus at 12 t C/ha/yr. Biochar will be about 40% yield (pyrolysis oil about the same), that’s 480,000,000 t biochar, enough for almost half of US coal consumption. Could (if there were water) do this close to here and then UP & BNSF would be happy to move the stuff to the power plants.
I am sure you must be pleased to have contributed to the present effort to establish a Water and Trees Project.
1 km2 is 100 ha, not 1000 ha. Being an order of magnitude off will ruin the best plan. The US coal consumption figure is off in the other direction, but not by nearly as much so it doesn’t offset the conversion error.
Comment by Anonymous Coward — 7 Nov 2010 @ 2:29 PM
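The unit-conversion correction above is easy to verify. A quick check using standard conversions and the figures quoted in the Benson comment (12 t C/ha/yr Miscanthus yield, 40% biochar yield):

```python
# 1 km^2 = 100 ha, so 100,000 km^2 is 1e7 ha -- not the 1e8 ha
# used in the quoted comment.

HA_PER_KM2 = 100

area_ha = 100_000 * HA_PER_KM2   # 1e7 ha
carbon = area_ha * 12            # t C/yr fixed by Miscanthus at 12 t C/ha/yr
biochar = carbon * 0.40          # t biochar/yr at a 40% yield

print(f"{area_ha:.0e} ha -> {biochar:.2g} t biochar per year")
# 4.8e7 t/yr, an order of magnitude below the quoted 4.8e8 t.
```

At roughly 48 million tons per year, the biochar output is nearer 5% of US coal consumption (about 1e9 t/yr) than “almost half”, which is exactly the order-of-magnitude problem flagged above.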
The Amazon is a tidal river (5 m), and there once were huge tidal rivers in northwestern Australia.
A BETTER CLIMATE: more energy, food, land and water. Use the mighty power of nature. In northwestern Australia we have huge tides, huge evaporation, and huge dry rivers and lakes. Tides are up to 12 m. Evaporation is up to 4 m per year and can be increased. Huge 12 m tidal erosion can revive old, dry, dormant, once-mighty paleo rivers, creeks and lakes, desalinate the country, and change deserts to rain forests to provide more rain across Australia. World population is growing rapidly and we need more energy, food, land and water.
See: Mitic CLIMATE ENGINEERING http://www.climatechange.gov.au/en/submissions/cprs-green-paper/~/media/submissions/greenpaper/0929-mitic.ashx
This will change the deserts and the whole continent for a better climate and environment, and provide hydro energy, permanently.
Key fact you probably already know, assuming you did some reading on this, but if not, consider what a difference this makes in your idea:
“… hundreds of thousands of contiguous acres–of DRY grass. 8 feet tall.
[Switchgrass] has to be big, mature, and dry …. if you cut it when it’s green, you seriously weaken the roots–no crop next year. And if you cut it when it’s green–you’ll either have to use it right now, or spend energy drying it, so it won’t rot….”
Off the top of my head, the process of harvesting that you describe loses the advantage of storing carbon compounds that standing tree mass provides, unless there is significance to the root mass that survives.
Still, fuel generated to replace other fuels could be worth it. It could be particularly worthwhile as the trees first start growing.
If we get water to land that is not already good cropland, this is a winner in any case. I still vote for big trees but this all needs to be carefully studied.
A question. Why do sea level rise predictions stop at around 2100?
[Response: Simple — 2100 is the number most of the work has been focussed on, I suppose with the idea that it doesn’t sound ‘too far off’. Of course, we’ve written about this, and how it leads to a sense of complacency, since sea level just gets higher the longer you look forward.–eric]
I hope the scientists don’t let the denialists claim the religious “high ground.” I always ask Cuccinelli’s deputy W. Russell if the Holy Father is a greedy liar, but he doesn’t answer that question, either. Many Christians don’t realize that the leaders of their churches do accept the science of global warming and believe it is a social justice issue.
“The scientific evidence for global warming and for humanity’s role in the increase of greenhouse gasses becomes ever more unimpeachable, as the IPCC findings are going to suggest; and such activity has a profound relevance, not just for the environment, but in ethical, economic, social and political terms as well. The consequences of climate change are being felt not only in the environment, but in the entire socio-economic system and, as seen in the findings of numerous reports already available, they will impact first and foremost the poorest and weakest who, even if they are among the least responsible for global warming, are the most vulnerable because they have limited resources or live in areas at greater risk…Many of the most vulnerable societies, already facing energy problems, rely upon agriculture, the very sector most likely to suffer from climatic shifts.”—Permanent Observer of the Holy See to the United Nations, H.E. MSGR. Celestino Migliore (5-10-07)
Jim, one more time then I’ll give up.
If you would
— listen to yourself first
— type your assumptions into a search box
— adjust what you believe based on what you find
— cite your sources when you decide, after thinking about that, to post
Then you would not post your ideas and have other people telling you they have already been thought through by others. If you show you’ve studied the area yourself, people will pay more attention.
This led me to post this suggestion again, one last time. You wrote:
> If we get water to land that is not already good
> cropland, this is a winner in any case.
I’ve looked through several papers, and I can’t find this. What is the predicted range for the annual rate of sea level rise in 2100, when sea level is expected to have risen 75 cm to 190 cm?
It has to be a whole bunch more than 3.2 mm per year.
Under BAU, assuming the rate stays in its normal relationship with temperature increase, will the rate of change in the annual rate be about the same in the 22nd century as it is predicted to be in the 21st? If not, why?
Hank – I’ve already read those. If what I am asking about is in them, I’m too dense to see it. Rahmstorf 2009 finds scenario B1 is 81 cm to 131 cm above 1990 sea level by 2100. A1FI is 113 cm to 179 cm. The other scenarios lie between them.
What I’m trying to find is the predicted range for the annual rate of sea level rise in 2100. Right now the annual rate is 3.2 mm. In 90 years, that’s not 81 cm to 179 cm of sea level rise. I can guess, but I don’t think the public, which is me, should have to guess.
To be honest, when lay people read “linear”, and they see the historic rate, they tend to extend that historic rate 100 years out and shrug their shoulders in “hey, no big deal.” Saying total sea level rise over 1990 is likely going to be 1.14 meters in 2100 gives them little sense of how rapidly it’s happening in 2100, and no clue where it will be in 2101; hence, a lot of silly talk about building simple seawall solutions, etc. The Netherlands did not subdue an ocean that was rising by more than one meter a century. Nobody has ever done that.
The 22nd Century is essentially top secret as far as lay people are concerned.
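One way to get a rough feel for the 2100 rate asked about above: assume the rate rises from today’s ~3.2 mm/yr with constant acceleration, so the average rate over 1990-2100 is the mean of the start and end rates. This is purely an illustrative kinematic sketch using the scenario totals quoted upthread, not a model result:

```python
# Implied end-of-century SLR rate under constant acceleration:
#   total = (r0 + r_end)/2 * T   =>   r_end = 2*total/T - r0
# r0 is the present rate (~3.2 mm/yr); totals are the 1990-2100
# scenario figures quoted upthread (Vermeer & Rahmstorf 2009).

R0 = 3.2   # mm/yr, current rate of sea level rise
T = 110    # years, 1990 to 2100

for total_cm in (81, 114, 179):   # low, middling, and high totals
    r_end = 2 * (total_cm * 10) / T - R0   # cm -> mm, solve for end rate
    print(f"{total_cm} cm total -> ~{r_end:.0f} mm/yr in 2100")
```

For the quoted 81-179 cm range this works out to very roughly 12-29 mm/yr in 2100, i.e. several times today’s rate, which is the point the lay reader never gets to see.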
#477 – Seems to add up; the rate for A1FI is 9.7 mm/yr. Call it 10 for simplicity (surely allowable given the error bars), and 90 years × 10 mm = 0.9 meters. That is the highest estimate, but of course it is not really the upper bound, due to excluded ice-dynamics effects such as Hank links to in #478, and uncertainties in the estimates themselves.
It’s an illuminating reference in many ways, though it doesn’t really address the question JCH had about changes in SLR over time.
The 1.9 m rise was from work subsequent to AR4–the “Copenhagen Diagnosis” report, IIRC, which did consider the effects of ice dynamics.
“The 1.9 m rise was from work subsequent to AR4–the “Copenhagen Diagnosis” report, IIRC, which did consider the effects of ice dynamics. …”
I had not read that one, Kevin, so thank you for that tip as it clears up much of my muddle. It shows an upper bound (German Advisory Council on Global Change, WBGU, 2006) of 5 meters (3.1 meters most likely?) by 2300, so now the 22nd Century exists with respect to SLR.
Does Vermeer-Rahmstorf 2009, 75 cm to 179 cm by 2100, include ice dynamics in a complete sense, or is there still something being left out because of uncertainty?
My take on the whole (very complex) matter is that the rate of SLR will likely be constantly accelerating (with some pauses, possibly as long as a few decades) for more than a hundred years.
There’s no secret (that I know of). If you want a figure, no one is able to make a halfway reliable prediction unfortunately. Depending on how sure you want your bet to be, you might have to plan for a very large range of SLR by whatever 22nd century date you’re concerned about.
You ask about Vermeer & Rahmstorf, 2009. They are straightforward: “highly nonlinear responses of ice flow may become increasingly important during the 21st century. These are likely to make our linear approach an underestimate.”
They also state “To limit global sea-level rise to a maximum of 1 m in the long run (i.e., beyond 2100), as proposed recently as a policy goal, deep emissions reductions will be required. Likely they would have to be deeper than those needed to limit global warming to 2 °C, the policy goal now supported by many countries. Our analysis further suggests that emissions reductions need to come early in this century to be effective.”
This ties in to carbon sequestration through biomass actually. Check the old thread if you’re curious.
Be careful with alarmist arguments based on “no one has ever done that”. Before the Dutch achieved what they did, no one else had!
Building a simple seawall for a 1-2m/century SLR isn’t much of a problem, assuming current infrastructure and resources. There are some largish anti-flooding works being built at great expense right now. The problem is that simple seawalls keep water inside the wall from flowing out. And more complex schemes can only deal with so much SLR without creating huge flooding risks.
Comment by Anonymous Coward — 8 Nov 2010 @ 1:20 PM
AC – yes, but is V-R-09 still an exclusively linear approach? It sounds like it is.
If so, Kevin said they used dynamic changes for the Copenhagen Diagnosis. The Copenhagen Diagnosis appears to use V-R-09 through 2100, then cites two other research sources for 22nd century prediction that apparently include dynamic changes in land-based ice. Does that sound right or wrong? Linear to 2100, both linear and nonlinear to 2300?
For a long time I followed SLR each week, but the news pace was so slow I quit. Now I’m like 59 cm underwater.
On seawalls, what you are saying is interesting. Galveston just flooded (Ike). Had the inhabitants stayed in place, the loss of life might have been pretty high by modern standards, but nothing like 1900. I’ve read varying reports, but it sounds like most of the water entered from the unprotected backside, and now they want to seawall that. Few Texans believe in climate change, so, unless the Texas State Climatologist intervenes (and good luck to him on that), there is no reason why a new seawall in Texas has to be much taller than the post-1900 seawall. Our seawall for the Wall Street of the South is going to be a lot cheaper than the AGW-based seawall they’re planning for Wall Street.
When it comes to a scale similar to US shorelines, we can’t, fortunately, even build a border fence, and the illegal aliens are not getting 10 to 50 mm taller each year. But I agree with your point. Thanks.
You seem to not read what you advise me to read. The general subject you focus on seems to be the problem of irrigation of shallow root crops in areas where evaporation is ‘2.5 times’ that of downward drainage. Or perhaps I am not adequately describing the big picture.
These and similar problems are not new subjects to me; I was familiar many years ago with tools to ‘break up hardpan’ to avoid trapping water at the surface. I do not mean to suggest that I am up to date on this.
Trees however, should have a very different situation. Here, the assumption is deep root systems where arrangement for that drainage would be part of the ground preparation for planting.
Generally speaking, there are many problems that I do not think it necessary to examine deeply, when it is apparent from ongoing agricultural success that they are not insurmountable. At this stage of the discussion I depend on existing ‘best practices’ in agriculture that have been demonstrated to be successful. Does that mean that there are no problems? Of course not.
Perhaps I have not been clear about the order of things that I am suggesting, meaning that standing forests are first order, shallow root crops would be secondary as a way of paying for the tree investment, and biochar would be a third order mop-up activity to minimize waste.
Yes, the comment box with strung out information is not the best format for writing, but it seems to be good for getting reactions.
A couple of miscellaneous items; I was wondering if there are venues or publications where scientists who’ve, in the past, predicted problems and tried to get them dealt with — successfully or not — have discussed the experience.
“…. Judith Kurland has pointed out that in the health and environment arenas, the PP is better received and more consistently implemented in Europe, where it was formalized into environmental policy in the 1970s, than it is in the U.S. In our country the manufacturers of tobacco, lead paint, petroleum, pharmaceuticals, asbestos, among others, have at times sought to withhold, stall, or doctor vital epidemiological and biostatistical information while simultaneously arguing that “all the data are not in” (Kurland, 2002: 499).
“Kurland rightly contends that action in the face of “informed” uncertainty is the cornerstone of the PP – and I add of good-faith –(as opposed to faith-based) public health. We are touching upon the fault line of a narrow versus a broad conception of the public health mission – just think of our current debate in this country about climate change, also known as global warming. Is it a public health issue? ….”
So why would you pronounce it futile to do things that would avoid the catastrophe your reference predicts if we do not do those very things?
I would agree that much of the popular stuff like windmills and solar panels are probably trivial measures.
I suggest govt. encourage conversion from electric appliances to natural gas, which would actually matter quite a lot, but that gets nary a nod. Our PG&E (Northern CA electric utility) would rather bamboozle around with smart meters and rate fixing tricks. SDGE (Southern CA) has been requesting a program where people sign up to not use air conditioning 14 days a year, those being the days they really need them.
By the way, something is really wrong here, where PG&E workers were “stuck in traffic” for 90 minutes while people were dying and a whole neighborhood was burning because nobody was around to shut off the natural gas pipeline. Was there a ripple of criticism? Only from me, it seems, and that is not much of a ripple.
It’s more complicated than a linear/nonlinear dichotomy. Long-term SLR will likely be dominated by a relatively small number of fairly unique processes (due to geographic differences) that no one understands well enough to predict (given a temperature scenario) with any confidence.
What is to be done, then? To the extent that quantitative methods are used to come up with figures, they are usually extrapolations from the SL record, or simply assumptions, rather than modelling based on physics; they do not capture the nonlinear properties of the systems involved.
The Dutch paper you’re talking about tries to guesstimate figures for some plausible dynamic changes in the 21st century, and justifies that by presenting them as low-probability/high-impact scenarios, but they end up with a figure lower than V&R, which doesn’t include low-probability events. This goes to show how much we are in the dark.
Then the Dutch paper largely extrapolates the 22nd century from the 2100 SLR rate. Whether this really captures the low-probability risks is left as an exercise for the reader. I think the evidence presented in the paper could justify less conservative figures, but the take-home point is really that they don’t think many key processes are understood well enough to be quantified.
I have no idea what the German paper is about, but that figure for 2300 must be more about their assumptions than anything else.
You keep presenting grandiose plans based on unproven methods which are already an order of magnitude off or more on paper as alternatives or solutions. Nothing we can do with biomass is an alternative to seawalls. And nothing we do with biomass could prevent potential climate catastrophes such as NCAR’s global drought without deep emissions cuts. If you think I’m wrong, show your numbers!
What appliances do you want to convert to natgas? And in California of all places? Come on… Have you looked at costs and efficiencies? Then compare the potential savings to the electricity wasted by the commercial sector in California (especially at peak load)…
Comment by Anonymous Coward — 9 Nov 2010 @ 5:41 AM
JB 488: I would agree that much of the popular stuff like windmills and solar panels are probably trivial measures.
BPL: Of course you would. You want people to use fossil fuels with the cogen equipment you sell. Same motivation as Exxon-Mobil, really.
“there is no reason why a new seawall in Texas has to be much taller than the post-1900 seawall.”
No “scare-quotes” for “no reason?”
Anyway, the same “thought-process” was apparently in place for the construction of the replacement levees in New Orleans. I guess the facts that multiple billions of dollars were being invested, and that failure of the previous system killed a couple of thousand people, weren’t enough to induce designers to look at the big picture.
I’m starting to warm up to the idea that “insulating blanket” is a poor analogy for the general thermal effect of our atmosphere. What do you guys think of “integrating blanket” or “homogenizing blanket” instead?
Can you think of an even better analogy?
#485–Sorry, Jim, you’ve still not presented anything (unless I missed something, which is always possible, of course) to convince me that you’re appreciating the challenges of your scheme to even the correct order of magnitude. No offense intended, but that’s where I’m at.
I still say grow trees where trees like to grow. There, scale is less of a problem: akin to the difficulties encountered when raising lots of weeds or varmints, which even I can do. Why sign up to paddle a hardwood forest upriver for a 1000 years?
Jim Bullis wrote: “I would agree that much of the popular stuff like windmills and solar panels are probably trivial measures.”
I would agree that windmills are a somewhat trivial measure, since the CO2 emissions from grinding grain are a relatively minor part of the problem.
Wind turbines are another story.
Concentrating solar thermal power plants on only five percent of the USA’s desert lands could generate more electricity than the entire country uses — and with integrated thermal storage can provide 24×7 baseload generation.
The commercially exploitable on-shore wind energy resources of only four midwestern states could likewise generate more electricity than the entire country uses.
Then there is the huge potential for distributed PV. Then there is the vast offshore wind energy resource.
Wind and solar are not “trivial”. They can easily provide far more energy than we currently use, in perpetuity.
“Easily” becomes a bit harder if real costs are involved. Maybe, these will come down a lot, and then we can talk some more.
In the meantime, take a drive over the Altamont Pass in Northern CA and Tehachapi Pass in Southern CA and explain these bogosities that were put in place 30 years ago, mostly as a mining project (mining, that is, of public money).
Then lay out the details of any of the larger projects. Pickens figured out it was a no-go. There is a project in Oregon where the developer said he never would have done it without a large subsidy that was foisted on the public, and just last year they were realizing they had been had. Sorry, no detailed references, but the Portland newspaper carried the exposé on the Oregon trickery around Halloween last year.
These exercises should tell us that such enterprises will not scale up to non-trivial size.
489 anon anon anon, Kevin McKenny, Secular Animist, Hank Roberts, David B. Benson, BPL (and many others)
You appropriately ask for detail. So reviewing a bit, I found that I let Brian Dodge misframe the discussion by not adequately dissecting his post. He set the requirement that the new forests would have to capture and sequester the entire coal CO2 output.
I began with the premise that forests would be superior to the EPA planned ‘carbon’ capture scheme. This was to be imposed on newly permitted power plants, and then somewhat on modernizing projects for old systems. This translates somewhat vaguely to a requirement for forests. But it certainly does not imply capturing the entire CO2 output from burning coal.
I should realize that there are many who would have us ban coal outright, as California appears to have done. So I should realize that this is not an equivalent to what I am trying to accomplish, and I should make that clear.
My suggestion involves a 3000 mile long aqueduct with 10 mile branches to each side, sufficient to irrigate 60,000 sq. miles of forests. This is 1.5e6 ha. Estimating 100 trees per ha gives us 1.5e8 trees, and each being of dry forest mass in 25 years of 10 tons, we are now at 1.5e9 tons total dry wood mass. About half of this is elemental carbon, so that is 8e8 tons of carbon, which is about 3e9 tons of CO2. 3e9 tons of CO2 captured over 25 years gives us 1.2e8 tons CO2 captured per year.
Using 2e9 tons CO2 from 2e9 MWh of coal-fired power per year, according to Brian Dodge (#389 of the Cuccinelli post),
we can conclude that 6% of CO2 from coal in power plants could be captured. That would appear to be enough to make the CCS planning by the EPA unnecessary.
To make a fair comparison with renewables, consider the cost of providing 6% of 2e9 MWh of power from renewables. The forest plan here roughly outlined would make power from coal equivalent to renewables, in that amount.
I should have noted that, though it is obvious just on the basis of cost.
However, I am thinking about clothes dryers and cooking appliances, which are still very substantial heat users. And conversion hardware is readily available. But this conversion would mostly benefit the CO2 situation, and is not quite so apparent in the cost decision.
489 anon anon anon, Kevin McKenny, Secular Animist, Hank Roberts, David B. Benson, BPL (and many others)
(I slipped on an exponent in my previous reconstruction of original calculations. Now it appears that there is a fair amount of margin, as I had previously thought.)
My suggestion involves a 3000 mile long aqueduct with 10 mile branches to each side, sufficient to irrigate 60,000 sq. miles of forests. This is 1.5e7 ha. Estimating 100 trees per ha gives us 1.5e9 trees, and each being of dry forest mass in 25 years of 10 tons, we are now at 1.5e10 tons total dry wood mass. About half of this is elemental carbon, so that is 8e9 tons of carbon, which is about 3e10 tons of CO2. 3e10 tons of CO2 captured over 25 years gives us 1.2e9 tons CO2 captured per year.
Using 2e9 tons CO2 from 2e9 MWh of coal-fired power per year, according to Brian Dodge (#389 of the Cuccinelli post),
we can conclude that 60% of CO2 from coal in power plants could be captured. That would appear to be more than enough to make the CCS planning by the EPA unnecessary.
To make a fair comparison with renewables, consider the cost of providing 60% of 2e9 MWh of power from renewables. The forest plan here roughly outlined would make power from coal equivalent to renewables, in that amount.
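The corrected arithmetic is easy to reproduce in a few lines (a sketch only; the 100 trees/ha, 10 t/tree, and 2e9 t CO2/yr coal figures are the assumptions quoted above, not measured data):

```python
# Reproduce the corrected back-of-envelope sequestration estimate.
# All inputs are the assumptions stated in the comment, not measured data.
HA_PER_SQMI = 259                      # hectares per square mile
area_ha = 60_000 * HA_PER_SQMI         # ~1.5e7 ha of irrigated forest
trees = area_ha * 100                  # 100 trees per hectare (assumed)
dry_mass_t = trees * 10                # 10 t dry wood per tree after 25 yr (assumed)
carbon_t = dry_mass_t * 0.5            # roughly half of dry wood is carbon
co2_t = carbon_t * 44 / 12             # molecular-weight ratio of CO2 to C
co2_per_year = co2_t / 25              # averaged over the 25-year growth period
coal_co2_per_year = 2e9                # t CO2/yr from coal power (Brian Dodge's figure)
fraction = co2_per_year / coal_co2_per_year
print(f"{co2_per_year:.1e} t CO2/yr captured, about {fraction:.0%} of coal emissions")
```

With these assumptions the script lands near the 60% figure; small changes to the per-tree mass or tree density move the result proportionally.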
Jim Bullis, Miastrada Co. — Now add in the cost of cutting and moving the wood to a burner. I hazard this is much more expensive than simply building and running nuclear power plants, for which LCOE (busbar) of US$0.065/kWh appears obtainable.
You keep posting preposterous figures for your super-forest, which would absorb an eye-popping average of 40 tC/ha/yr in addition to the carbon which ends up in the forest floor, debris and such. After a mere 25 years, your super-trees would contain more carbon in their wood than the whole biomass of a similarly-sized centuries-old forest. You are an order of magnitude off… at best. People have studied forests, you know. There’s plenty of material readily available online. Hank has posted references to a few papers.
How can offsetting 6% of the emissions from coal possibly be sufficient anyway? You just spoke of forests as an alternative to seawalls!
I’m amazed that people still use electric water heaters in a state which has so much trouble with electricity. Electricity is obviously too cheap. But why convert to natgas? Isn’t much of the state’s housing stock ideally suited for solar water heating? That’s a small fraction of electricity consumption anyway. California has a cooling problem, not a heating problem.
Wind and solar are not close to providing even a tenth of the energy we currently use. But science tells us it would be prudent to cut emissions yesterday. Orders of magnitude again…
By being unserious about wind and solar, you encourage people to be equally unserious about schemes with even less merit like turning deserts into forests.
Comment by Anonymous Coward — 9 Nov 2010 @ 6:42 PM
Jim Bullis wrote: “These exercises should tell us that such enterprises will not scale up to non-trivial size.”
You really should get out more, and learn about the numerous “non-trivial” utility-scale wind power and solar thermal power plants that are being built now.
I would be interested in comparing coal plants to nuclear plants after coal plants would be made into overall clean systems by combination with standing forests. This would be a low cost, no cost, or maybe even profit making adjunct to coal using systems, that would be very favorably comparable to very expensive ‘carbon’ capture as being planned by the EPA, with cost ‘up to $95 per ton of CO2’. I do not see the forest system adding much to the cost of coal operations, so the present advantage of coal over nuclear would continue.
Once a forest is mature, there would be selective minimal harvesting which would be to keep the forest healthy and secondarily yield wood products of value. Wood harvesting costs would include mop up of waste by turning it into biochar if possible. That would be a charge against the wood products. I think there still would be a net gain. Remember though, biochar would be a third order activity. But there is an ongoing forest management burden. In the end there might be a need for a modest charge to the coal burning power producers.
Still, the bigger economic gain would be from ordinary agricultural crops that would be enabled by the availability of water along the way.
Nuclear power costs need to be quoted in terms of capital investment and cost of ongoing operations. I am not sure what your number includes. The same rule applies to the forest concept, but I have not gotten that far with cost analysis. My intuitive level thinking is that the upfront costs would be recoverable from water charges over the long term, or from sales of whatever harvested products.
Do you dispute the 100 trees per ha, the 10 tons per tree estimate, or what?
Do you know the mass of a century old forest? At a certain mature age, it does not go up much, so being centuries old does not necessarily mean that much.
I look at all references offered. Some are applicable. Some not. Much of the talk is about capture rates for mature forests, since that is the main concern where deforestation is being addressed.
By the way, I did not count below ground storage of carbon compounds. Please include that for the old forest. My estimate is about 11 times higher, per ha, than Pres. Hu quoted for their 15 year project plan. I am talking about 25 years though, and I do not know how his forests would be arranged and managed, including the harvest cycles and watering intensity.
Sorry about the 6% but the correction as shown in #503 makes it 60%, which is more exciting. In fact, if we double forest mass by counting root structures, we could take care of the entire coal problem.
Many houses here had rooftop solar water heaters 30 years ago and most have been torn down. I don’t know anybody who would want an electric water heater, but they still sell them at the hardware store.
Show me the project that is valid without public subsidy, and is expected to make it as a competitive source, where the feed-in tariff is based on the lowest cost generating option having available capacity. A system honestly competing on that real basis would be exciting. Until that kind of system is offered, we are just being self-delusional.
Re 505 Anonymous Coward – “By being unserious about wind and solar,”
No, I’m quite sure he’s serious. When you say “Wind and solar are not close to providing even a tenth of the energy we currently use”, are you referring to the present supply of energy from those sources, the present manufacturing rate, or the present price? Obviously the first is a condition that can be changed, at a rate that depends on how fast the second can be ramped up; the third is important to how easily those can be done, but the third also depends on those things (increasing experience with the technology and economy of scale).
Your oblique wording merits some thought in, “By being unserious about wind and solar, you encourage people to be equally unserious about schemes with even less merit like turning deserts into forests.”
Does this mean I have to endorse silly stuff in order to get people to be serious about forests? My thinking is that all this silly stuff is a distraction from real work. Promoters discovered these gold mines in the 1970s and not enough has changed to bring real merit to the unmitigated bogosity of the origins of such games. But someday, something might advance to a convincing technology.
The word ‘deserts’ was never mine. If I remember right, that was a strawman hostile assumption by Brian Dodge (#389 Cuccinelli), though I was ok with it as a relative comparison. He based his analysis on sequestration rates reported for mature Southern forests, which I believe sequester very little, and that led him to an error of an order of magnitude in land requirement. He also required water from irrigation in an amount equal to rainfall.
Comment by David B. Benson — 9 Nov 2010 @ 10:54 PM
Anonymous Coward wrote: “Wind and solar are not close to providing even a tenth of the energy we currently use.”
Say, are you the same guy who once wrote “Cell phones are not close to providing even a tenth of phone service”?
Or maybe, “Personal computers are not close to providing even a tenth of the world’s computing capacity”?
Perhaps you scoffed at the idea that a few clattering automobiles could have a significant impact on a horse-based transportation system?
The “serious” fact is that we have a vast and never-ending supply of solar and wind energy.
And we have mature, powerful technology to harvest it.
And that technology is already being rapidly deployed world-wide at every scale from giant utility-scale wind farms and CSP power plants to off-grid PV systems for rural villages in Africa and India (who desperately need electricity and have no prospect of ever being connected to a utility grid). In fact, wind and solar are the fastest growing sources of new energy generation in the world, and have been for years, and are growing at record-breaking double-digit rates year after year.
And the technology is improving rapidly, from wind turbine designs to ultra-cheap, high-efficiency PV materials that can be incorporated into everything from windows to the surface of automobiles to clothing.
And very simple policies are already well-known that can accelerate the rapid deployment of wind and solar even more — from feed-in tariffs to tax credits.
Those are the “serious” facts.
We absolutely CAN transition to an entirely solar & wind (and geothermal and biomass and hydro) powered energy economy, and we can do it much more easily, much cheaper, and much faster than most people realize.
Whether we WILL do so depends on whether we get “serious” about it. And I am very serious about it.
One of the great things about renewables today is how the technology advances, the economics improve, and deployment grows even as various folks on blogs pontificate about how it’s a scam, can’t be relied upon, is negligible, and yada yada yada. My perception is that the doers are making fools of the naysayers.
Particularly interesting is the case of China; a couple of years ago, this humble deponent was trying to point out to the “whatever we do, China and India will negate, hence let’s do nothing” crowd that China was transparently aiming to become a world leader in the manufacture and export of renewable energy technology. Others, more knowledgeable than me, were warning that “China will eat our lunch” in the renewable industries.
Four or five years ago, I was just learning about the existence of flywheel energy storage. It sounded crazy to me–surely the energy capacity couldn’t be sufficient, I thought. Today, they are being deployed commercially by Beacon Power of Massachusetts in 20-megawatt modules–though Beacon is still, admittedly, losing money at a pretty brisk clip. But then, they have to build capacity, and that is going to cost money. They don’t seem to have trouble obtaining investors.
You can access some nice pics of the construction of their 20 MW Stephentown, NY plant–the corporate white hope, it looks like; also the quarterly report for 3rd quarter 2010, for financial types.
Yes, there is an order of magnitude to go. But, unlike the case with overall CO2 mitigation per se, movement in the right direction is not only discernible but spectacular. In the aftermath of an election which has the potential to set US GW policy back a decade or more, it’s worth remembering that.
I think the advance in computing power is a great analogy to use for people who insist that we “can’t” produce enough power from renewables. Just look at the cost, size, weight and the power requirements of a computer of just 10 years ago that would be equivalent to a standard laptop today. Let alone 15 or more years ago.
The advance in computer power owes to Moore’s law–which states that transistor density doubles about every 18-24 months. This guarantees that you will see exponential growth in computing power, and after 30 years, the result is indeed impressive.
When Moore proposed his law, it had a technological basis–scaling of CMOS technology. There was a recipe for building smaller transistors. The CMOS scaling recipe basically failed in the ’90s. However, by that point, it was an economic necessity. The entire electronics industry was predicated on such growth. Since about 1995, Moore’s law has been sustained by new technologies, new materials and new designs, but it has been sustained nonetheless. There is an unprecedented consortium of semiconductor manufacturers that produces the International Technology Roadmap for Semiconductors (ITRS), which tackles common problems for the industry.
In terms of energy, there is no nice recipe like CMOS scaling. However, there is an exponential relation–Rosenfeld’s Law–which says that energy per $1 of GDP decreases about 1% a year. This pace is much slower. Maybe what is needed is an International Technology Roadmap for Energy (ITRE).
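The gap between the two exponentials is easy to put in numbers (a sketch using the ~18-month doubling time and 1%/yr figures quoted above):

```python
# Contrast the two exponentials: Moore's-law doubling of transistor density
# (every ~18 months) vs. Rosenfeld's-law ~1%/yr decline in energy intensity.
years = 30
moore_growth = 2 ** (years / 1.5)   # one doubling every 18 months
rosenfeld_decline = 0.99 ** years   # energy per $ of GDP, -1% per year
print(f"after {years} yr: transistor density x{moore_growth:,.0f}, "
      f"energy intensity x{rosenfeld_decline:.2f}")
```

Over 30 years, one exponential yields a factor of about a million; the other shaves off roughly a quarter. That is the pace difference the comment is pointing at.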
” Pickens figured out it was no go. …” Jim Bullis, Miastrada Co.
What did he figure out?
He’s still building two farms: one in Canada and one in Minnesota. As for the canceled farm in the Texas Panhandle, he says he will build it once the transmission lines are wherever he wants to be between planned and built.
You are absolutely right. The public pays. The question is how much.
If we are careful to consider the difference between nameplate capacity of wind systems versus the actual yield that is about 20% to 30% of peak, and include stand-by generating capacity using relatively inefficient natural gas peaking type generators, and then put a reasonable depreciation factor on the wind towers themselves, things do not look so cheap. All things considered, the long term costs of wind power do not compete at all with the cost of coal power.
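The nameplate-versus-yield point can be put in numbers for a hypothetical farm (a sketch; the farm size is made up, and the 20–30% capacity-factor range is the one cited above):

```python
# Nameplate rating vs. actual yield for a hypothetical 100 MW wind farm,
# using the 20-30% capacity-factor range cited in the comment.
nameplate_mw = 100          # hypothetical farm size (assumed)
hours_per_year = 8760
for cf in (0.20, 0.30):
    mwh = nameplate_mw * hours_per_year * cf
    print(f"capacity factor {cf:.0%}: {mwh:,.0f} MWh/yr "
          f"of {nameplate_mw * hours_per_year:,} MWh/yr possible")
```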
Irrigated forests cost would require upfront money also, but this would be recoverable by products sold from the forest, as well as crops from the ancillary agriculture that would also be enabled by the irrigation system. Assuming this to be true, upfront costs would be recovered so the cost to the public for power would stay the same as it is, that being the cost of operating the coal fired power system.
The Springer Desert project was badly constructed, though it provides interesting comparative data. For openers, the $1 tax per gallon that they proposed is a political non-starter.
I see no potential in growing forests in Saharan deserts or anywhere that does not have existing fresh water supplies to draw from; making fresh water is nowhere close to being efficient and cheap. Were freshwater supplies possible at a reasonable cost, there would be no need to dry up Arizona to enable vast supplies to Los Angeles. It simply is not cost-effective for irrigation, even where there is a good paying crop to be gained from water availability.
They brushed aside the idea of forests in temperate regions on the basis of albedo effects, though this would only apply in the initial years of growth. No logic is offered to explain why this would not be an issue in the sub-tropical zones they favor.
Then they use a tree density of 1000 trees per ha, where I planned only 100, so the whole system is different. They would harvest and mostly burn the whole crop after a “few decades” where I would preserve the sequestered CO2 in carbon compounds permanently in standing forests, harvesting from mature forests only to the extent needed for effective forest management. And then the objective would be to maintain wood mass in permanent forms; and ultimately the scrap would go into biochar which also accomplishes permanent CO2 sequestration.
Their selection of eucalyptus as the tree of choice is an obvious flaw known to Californians, who regard dense eucalyptus as comparable to an open can of gasoline. We also know that the intended use of that wood for railroad ties turned out to be unworkable, or at least inferior to the oak that is still mainly used. It is even unpopular as firewood, since it produces excessive pitch in chimneys. But the main issue is the short harvest cycle, whereby the sequestration is made void by consumption of the wood mass in ways that turn it back to CO2.
On the biochar subject, you pointed out that there would be a transportation cost in the biochar production operation. It might be worth looking at using the aquaduct as a canal for low cost transportation of scrap wood to central burning points where the biochar process would be carried out. Still, I do not see this as a high volume operation.
You are quite right about digital electronics for computational purposes being different from energy devices.
There have been slow to barely perceptible advances in silicon power transistors over the last 50 years, where large masses of silicon and big heat sinks determine sizes of devices.
Once we created digital logic circuits, we enabled a crop of people who would never be required to think about thermodynamics. That is an impediment to the wonders of Silicon Valley leading us into green bliss.
Rosenfeld’s attempt to relate use of energy to GDP is a little lightheaded. It is also lightheaded to point at California for having made great strides in reducing energy use. In energy efficiency it is unclear whether he did more harm than good.
From 1950 to 1960, the building codes were lax, energy in the form of natural gas was almost free, and cheap construction practices dictated unbelievably sloppy home construction. I know; I live in a house built in 1960. In the rest of the country, a greater degree of sanity prevailed, so California had a long way to go just to catch up with normal. Efficiency improvements were thus low-hanging fruit, and the gains made by California looked good in comparison to colder-climate states where they had always paid some attention to insulation.
As we moved into the 1970s building codes were tightening up, and we stopped seeing houses with about 50% glass windows, single paned. Energy prices were going up so things started shaping up even more in California.
In the 1990s the price of natural gas started upwards and energy use became a lot more of an issue.
Thus, we have obvious reasons why energy use was going down, at the same time that GDP was going up. Noting a relationship seems to have turned into a ‘law’, and this is rather hard to understand.
I wonder if Rosenfeld was asked if this ‘law’ could be named after him.
We want to recognize the push for appliance efficiency which seems to have been a Rosenfeld credit. However, it is perplexing how a physicist could have missed the mark in characterizing coefficient of performance of refrigeration systems by ignoring the energy actually consumed in generating electricity. The upshot of this is that vapor diffusion devices were wiped out of existence, though they are actually superior in energy efficiency, when correctly defined. Not only is the efficiency advantage lost, the CO2 impact of coal as the heat source rather than natural gas is also significant.
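The primary-energy point in the paragraph above can be made explicit (a sketch; the COP and plant-efficiency values are illustrative assumptions, and which device comes out ahead depends entirely on the values chosen):

```python
# Source-energy comparison of an electric (vapor-compression) refrigerator
# with a gas-fired absorption unit. All numbers are illustrative assumptions.
electric_cop = 3.0        # cooling out / electricity in, measured at the plug
grid_efficiency = 0.33    # primary fuel -> delivered electricity (assumed)
gas_cop = 0.7             # cooling out / fuel heat in, fuel burned on site (assumed)
electric_primary_cop = electric_cop * grid_efficiency  # COP counted at the fuel
print(f"electric, counted at the fuel: {electric_primary_cop:.2f}")
print(f"gas absorption, at the fuel:   {gas_cop:.2f}")
```

The point is only that counting generation losses shrinks the electric device's apparent advantage; whether the gas device actually wins depends on the real COPs and the fuel mix behind the grid.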
Kevin, they don’t say, “it’s a scam, can’t be relied upon, is negligible.” They say things like: the reliability (and availability) is a major problem and inhibitor; some say it is miles away from getting beyond negligible; only outliers say “scam.” These things are pointed out to put a bit of realism into what is often wild rose-colored-glasses exaggerated optimism. Like the flywheel technology. Recovering from a few seconds of cloud cover for a small- or even medium-sized PV system is light years away from 24/7 grid availability.
I’m not meaning to throw cold water on your optimism. Optimism is good and can’t hurt the effort. But the engineering/scientific assessments ought to carry some degree of realism, at least in these circles.
The IEEE is stepping up its involvement in energy-related technologies with its latest working groups.
The engineering organization announced today the creation of a standards efforts for grid storage and another for measuring greenhouse gas reductions from renewable energy and energy-efficient products.
The grid storage effort, part of the IEEE’s P2030 smart grid interoperability working group, will create guidelines for defining the technical interfaces between energy storage systems, such as communications to other smart-grid equipment.
The effort, expected to take about two years, will focus on hybrid storage systems that combine different technologies, such as batteries, ultracapacitors, and flywheels.
“There are [known] ways to get around the weaknesses of each storage technology to combine them into something useful,” Dick DeBlasio, a P2030 member and chief engineer at the U.S. Department of Energy’s National Renewable Energy Lab, told EETimes.
Grid storage is increasingly viewed as an important technology to make the electric grid more reliable and clean. There are already several pumped hydro facilities on the grid, but utilities are now starting to use large batteries or flywheels as a replacement for natural gas plants to stabilize the grid. Several companies are pursuing bulk storage for wind and solar plants, too.
See also the photo gallery at Energy storage on grid heats up which illustrates a variety of grid storage technologies, including batteries, flywheels, compressed air, pumped hydro, and thermal storage with molten salts.
There is a huge amount of work being done in this field that most people who think that large-scale grid storage is “light years away” don’t know about.
Guys, guys. I’m not thinking that they’re the same thing.
I have in mind discussing the issue with the unimaginative who would’ve said the same kind of things about computers – What, me? Lug around one of those things? Are you mad? (And now they fit into handbags.)
SecularAnimist, the IEEE formed a working committee to study interface standards among source, storage, and transmission lines??? This will get national grid-level storage up and working by when, would you guess?
A large PV farm in CO delivers about 50 MWh to about 1500 homes over a useful 6-plus-hour day from an 8.3 MW system. If the off-peak requirements averaged 25% (might be lower, I suppose; I’m just doing back-of-the-envelope ballpark) delivered by batteries, they would have to provide about 38 MWh each day (less if the 25% figure is too high). This is pretty close to today’s leading edge of battery utility for 0.0004% of the nation’s electric output. (Sounds piddly on the surface, but in context is actually pretty good, IMO.) The odd thing here is that about 3/4 of the PV output now has to go to recharge the batteries, putting a monkey wrench into the analysis. Flywheels are pretty helpful for smoothing a couple of seconds of transmission anomaly, but are virtually useless for high-energy long-term storage. We are a long, long way from national grid-level storage that is available and reliable with sufficient useful capacity. More like light years than just around the corner.
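The back-of-envelope figures above can be reproduced directly (a sketch using the comment's own rough numbers):

```python
# The comment's own rough figures: what share of the PV farm's daily output
# would go to recharging the off-peak batteries.
pv_mwh_per_day = 8.3 * 6     # ~50 MWh over a useful 6-hour day
battery_mwh = 38             # the comment's off-peak battery estimate
recharge_fraction = battery_mwh / pv_mwh_per_day
print(f"about {recharge_fraction:.0%} of PV output goes to recharging")
```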
Again, I am not pooh-poohing any of the renewable/storage efforts — I think they’re great. Nor do I fault the scientists and engineers involved for being optimistic and even hyperbolic; I think that’s the way they should be — helps their motivation and success. But in 3rd-party discussions on the objective technical level, like in RC (but maybe not so much in political discussions…) I think the hyperbole is not helpful and even the optimism should be tempered a bit.
Rod said: ‘Kevin, they don’t say, “it’s a scam, can’t be relied upon, is negligible.”’
Uh, yes, they do Rod–all the time. I know, it falls to me (and others like me) to try and counter the disinformation. And disinformation it is; the contention is not that proponents are being unrealistically optimistic, the contention is that these technologies are a total waste of time and money.
Not that you’re saying that. But there’s lots that are. Trust me.
JB 512: all this silly stuff is a distraction from real work. Promoters discovered these gold mines in the 1970s and not enough has changed to bring real merit to the unmitigated bogosity of the origins of such games. But someday, something might advance to a convincing technology.
BPL: Denmark–20% of electricity from wind turbines, 50% by 2020. Portugal–45% of electricity from renewables. Germany–photovoltaic coming on-line so fast they’re worried it will disrupt the grid. Need I go on? Get your head out of that dark space.
My son is in medical school. The area is covered with wind farms. There is a parts depot just outside the city: blades, turbines, housings, towers, etc. spread out over about an 1/8th section. Trucks with parts coming in; trucks with parts going out, never ending. T. Bone better get his butt in gear, or there isn’t going to be room for him.
Between Sweetwater and Post, they put them up on some beautiful ridges, cap rock I guess.. Butt ugly. Jim should read about the history of Post, Texas. Paddling a big idea up river. Not easy.
With respect to wind power, the 1970s was the “Armature Hour” when compared with what is happening now.
Thanks for the information. It’s sad they don’t have a complete team.
I’m an American, Dan. NASA and NOAA – they’re my side. We’re winning because we have a point guard who can drain it from way up inside the Arctic Circle. His shots are coming in hot now. Back in 1998, our Arctic Circle gunner was ice cold.
Moved from #399 by Adelady, on Science, Narrative, and Heresy
The Great Artesian Basin is very interesting to the scheme of a sensible water distribution. Perhaps it would be possible to use this basin as the water paths needed.
Yes, there are critical issues, which are probably being worried about as we speak, as to preventing salt water incursion. This happens more when there is dire need for water.
We have the same problem on a much smaller scale in the Santa Clara Valley. (Yup, some people think there is Silicon down there.) The San Francisco Bay delivers salt water right to the edge of the aquifer that runs from the Bay to the mountains. Over 50 years of water management has been directed at keeping the aquifer charged, so we have reservoirs in the mountains that deliver water to percolation ponds throughout this region. As a result, we continue to get drinking water from wells almost to the Bay, and the threat to tree root systems is also held back.
It is difficult to get over shock and horror (not awe) when confronted with significant thoughts about action to solve problems. But I would be interested in thoughts were you to actually want to fix the CO2 problem.
RE your #416 on Science, narrative, and heresy, CM
No, the Labrador Sea data is not a comprehensive proof. I have variously pointed to vertical mixing by hurricanes, storm surges, etc. Massive events such as tsunamis heave much deep water upwards. Whatever the mechanisms, the fundamental fact is that there is a massive place for heat to go, and it does not take much temperature increase in the deep ocean to erase a lot of temperature increase in the atmosphere.
I am strongly interested in a comprehensive analysis of vertical mixing in the oceans, and the Argo float system seems to be a path to getting this done. I am surprised it has not yet appeared.
In my #390 (from Science, Narrative, and Heresy) you might note that I quoted the inline comment by Gavin from April 2008 to point out that the models do not seem to have an adequate interface with deep ocean effects. Thus, I argue that predictions could be overstating the atmospheric temperature. And I say over and over, this does not mean that the oceans will not show a reaction through rising sea level.
We shouldn’t be afraid of ideas just because they are difficult or expensive.
However, Jim Bullis – you tend to underestimate ideas in various different ways. You think nobody has considered such ideas before, or you ignore good reasons for rejecting the idea, or you gloss over the practical difficulties.
Be pragmatic. Accept your limitations, and try to collaborate in improving and filtering ideas, rather than being a single, lone, failed wannabe visionary. Don’t be an instant expert. Seek out experts, and ask them stuff. Stand on the shoulders of giants, and get used to ditching ideas that don’t work out.
Australia’s already on top of this one Jim. In my view we should leave the basin alone to do its millions-years-cycle thing …… but there are a couple of spots on the extreme southern coasts where millions years old water literally gushes into the sea. Using some of that to replace what we take from dying rivers or to replenish the basin itself, because it’s been over-extracted for a very long time, would be a good idea. But only if hydrologists and oceanographers can assure us that we won’t create worse problems at the shore and in the southern ocean.
Comment by David B. Benson — 15 Nov 2010 @ 12:22 AM
Glad to hear all is in hand in Australia, though I wonder if you can get enough extra water to grow new forests.
Finding old water gushing into the sea seems like quite a breakthrough though. Are you talking about ‘worse problems’ in comparison to global warming?
Will the basin deliver up water for the rest of East Australia for a million years, even though the land above is bone dry? Well then, there should be some to spare for about 100 years to plant new forests all over the place, huh?
Grow new forests? We never _had_ anything you might call “forests” except along the Great Dividing Range, the temperate rainforests of southern Victoria and Tassie and the very impressive jarrah stands in southern WA.
http://en.wikipedia.org/wiki/File:Australia_satellite_plane.jpg This picture is an excellent display of the fact that most of Australia’s 7.5 million sq km is either desert or semi-arid. In fact it vastly overstates the ‘green’ area of South Australia; I suspect it was taken in late winter or early spring. That’s the only explanation for all that green on Eyre Peninsula. And the white patch in the middle? That’s the salt flats of Lake Eyre.
“… Currently, predictions emphasize one or the other of two contrasted alternatives: abrupt cooling caused by a shutdown of the thermohaline circulation (the “ocean conveyor”) or abrupt warming caused by copious outgassing of methane. Both arguments (the former from oceanographers and the latter from geophysicists) are equally persuasive, and I have chosen to explore the methane alternative, because I am familiar with an area (the Beaufort Sea and Mackenzie Delta) where outgassing has recently (2007) been detected and is happening now: in the Arctic Ocean and the Canadian Arctic Archipelago, where disappearance of the ice will affect currents, temperature, thermocline, salinity, upwelling, and nutrients, with consequent effects on the zooplankton….”
Since Jim’s discussion continues in more than one thread, don’t miss Gavin’s inline response to Jim:
[Response: You have misunderstood the comment completely. I was referring to a specific calculation which requires a fully equilibrated simulation. Most experiments – and this includes all transients for the next few decades – do not require this, and so coupled models are routinely used. – gavin]
Jim, please, get a blog.
If you do that, people will be more patient and helpful, at greater length, in one coherent discussion.
Scattering this through various RC topics is giving people the wrong impression about your thinking.
Getting it together — would be a good move.
> vertical mixing by hurricanes, storm surges, etc
There seems to be interesting work being done on this: Might tropical cyclones, by cooling the surface and mixing heat down, be a major driver of poleward heat transport, and could this be a positive feedback to global warming? E.g.:
Sriver, Ryan L., and Matthew Huber. 2007. Observational evidence for an ocean heat pump induced by tropical cyclones. Nature 447, no. 7144 (May 31): 577-580. doi:10.1038/nature05785.
Herweijer, Celine, Richard Seager, Michael Winton, and Amy Clement. 2005. Why ocean heat transport warms the global mean climate. Tellus A 57, no. 4 (8): 662-675. doi:10.1111/j.1600-0870.2005.00121.x.
Here’s a recent one:
Sriver, Ryan L., Marlos Goes, Michael E. Mann, and Klaus Keller. 2010. Climate response to tropical cyclone-induced ocean mixing in an Earth system model of intermediate complexity. Journal of Geophysical Research C: Oceans 115 (October 20): 9 pp. doi:10.1029/2010JC006106.
I think you are using the term “positive feedback” differently than the control system sense, where ‘positive’ means things go crazy.
Cooling the surface and mixing heat downward is how a negative feedback works to keep things from going crazy.
The terminology is completely crazy, since your sense of the word ‘positive’ fits with normal people’s use of the word, not the way engineers defined it for their purposes. I think the climate scientists grudgingly go with the engineers on this one, but only because Bode was at Bell Labs, and that has to count for something. (See Chris Colose’s definitions a few weeks ago.)
Jim Bullis, Miastrada Co. @556 — Positive feedback means a system which amplifies the input (forcing): http://en.wikipedia.org/wiki/Positive_feedback
“Things go crazy” only if the positive feedback is too large, which is obviously not the case for Terra’s climate.
Comment by David B. Benson — 16 Nov 2010 @ 5:22 PM
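[The distinction the two commenters are circling can be made concrete with a toy linear feedback loop (a generic illustration, not a climate model or either commenter’s example). Iterating x ← forcing + f·x converges to forcing / (1 − f) whenever |f| < 1, so a modest positive f amplifies the forcing without any runaway; only f ≥ 1 “goes crazy”.]

```python
# Toy linear feedback loop: x <- forcing + f*x converges to
# forcing / (1 - f) when |f| < 1. A positive f amplifies the
# forcing but stays finite; only f >= 1 produces runaway growth.

def equilibrium_response(forcing, f, steps=200):
    """Iterate the loop and return the near-equilibrium response."""
    x = 0.0
    for _ in range(steps):
        x = forcing + f * x
    return x

print(equilibrium_response(1.0, 0.5))   # positive feedback: ~2.0, amplified but stable
print(equilibrium_response(1.0, -0.5))  # negative feedback: ~0.667, damped
```

This is the sense in which climate scientists call a feedback “positive” without implying instability: the loop gain is below one, so the response is amplified, not unbounded.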
558 David B. Benson,
At least you have the direction right.
In control system theory, all inputs are amplified, but the negative feedback serves to bring the summing point to near zero. In general, positive feedback results in instability since the amplifier provides a high gain amplification of whatever is at the summing point.
This being the origin of the term, it seems to me that definitions otherwise are a bit like 6th grade science.
True, nobody owns words, but when a supposed science adapts well-established terminology from another field, it is not very good science to use it inconsistently. A definition is never wrong, but when it is used inconsistently with the way others use it, especially when those others might be important to bring into the club, that definition can be called ‘not useful’. I think the real scientists understand how important it is to communicate with the people who bring solutions to the table.
When you provide advice, please remind yourself that you have no idea of my background. That does not mean I do not appreciate it when you provide useful information.
I’m using “positive feedback” in the ordinary climate science sense, as discussed most recently on Chris Colose’s threads. The first reference I gave surmises that increased tropical cyclone activity, by mixing down heat, may lead to increased poleward ocean heat transport, which may lead to a rise in global mean temperature. This is based on the argument in the second reference, that ocean heat transport moistens the subtropical atmosphere, reduces subtropical and mid-latitude low cloud, and melts high-latitude sea-ice, thus amplifying global warming by enhancing water vapor and albedo feedbacks. Hence although the tropical sea cools locally in the wake of a cyclone, globally the planet warms. (The third reference finds the main impacts to be confined to the Pacific Ocean.) This may not be a very precise precis — I haven’t had the time to read these carefully.
The first reference says, “Our results indicate that tropical cyclones are responsible for significant cooling and vertical mixing of the surface ocean in tropical regions. Assuming that all the heat that is mixed downwards is balanced by heat transport towards the poles, we calculate that approximately 15 per cent of peak ocean heat transport may be associated with the vertical mixing induced by tropical cyclones. Furthermore, our analyses show that the magnitude of this mixing is strongly related to sea surface temperature, indicating that future changes in tropical sea surface temperatures may have significant effects on ocean circulation and ocean heat transport that are not currently accounted for in climate models.”
I see nothing here about ‘may lead to a rise in global mean temperature’. That seems to be your conjecture.
I see this as a process where vertical mixing is described such that ‘heat is mixed downwards’. It is my reasoning that heat mixed downward is a way that heat is removed from the atmosphere.
The authors announce cooling and vertical mixing, but then make unwarranted suppositions on the implications of that. I look for observed data and rational discussions of the implications. What goes beyond this I tend to ignore.
Heat mixed downwards will affect global heat transport according to how changes in deep temperatures impact the global circulation system. Whether it ultimately adds to heat held deep in the ocean, and therefore stays out of the climate, is not addressed by the authors. I say this is the interesting question.
The use of the term “positive feedback”, as on Chris Colose’s fine threads, is exactly the same as in a regenerative amplifier. Which doesn’t go crazy, to use your term.
Comment by David B. Benson — 17 Nov 2010 @ 5:12 PM
In short form, ‘going crazy’ equates to ‘becoming unstable and starting to oscillate’.
If the climate system gets to the point that it acts unstably and starts to oscillate, we all would need to run and hide.
I did not need to read the Wikipedia explanation to know this. Actually, I built a regenerative short wave radio receiver in the 8th grade. But in developing amplifiers for control systems, that was considered an utter failure of design.
Jim Bullis, Miastrada Co. @564 — Too bad nature is an utter failure then, because that is approximately how it works. And no, nature is not a control system. And yes, it has been oscillating for the last 2.58 million years, due to changes in orbital elements, not due to internal instabilities per se.
However, both ENSO and AMO are quasi-periodic oscillations said to be internal variability. There are more examples.
Comment by David B. Benson — 17 Nov 2010 @ 10:27 PM
Next, I’ll look at Wegman et al’s “reproduction” of McIntyre and McKitrick’s simulation of Mann et al’s PCA methodology, published in the pair’s 2005 Geophysical Research Letters article, “Hockey sticks, principal components, and spurious significance”. It turns out that the sample leading principal components (PC1s) shown in two key Wegman et al figures were in fact rendered directly from McIntyre and McKitrick’s original archive of simulated “hockey stick” PC1s. Even worse, though, is the astonishing fact that this special collection of “hockey sticks” is not even a random sample of the 10,000 pseudo-proxy PC1s originally produced in the GRL study. Rather, it expressly contains the very top 100 – one percent – having the most pronounced upward blade. Thus, McIntyre and McKitrick’s original Fig 1-1, mechanically reproduced by Wegman et al, shows a carefully selected “sample” from the top 1% of simulated “hockey sticks”. And Wegman’s Fig 4-4, which falsely claimed to show “hockey sticks” mined from low-order, low-autocorrelation “red noise”, contains another 12 from that same 1%!
Finally, I’ll return to the central claim of Wegman et al – that McIntyre and McKitrick had shown that Michael Mann’s “short-centred” principal component analysis would mine “hockey sticks”, even from low-order, low-correlation “red noise” proxies. But both the source code and the hard-wired “hockey stick” figures clearly confirm what physicist David Ritson pointed out more than four years ago, namely that McIntyre and McKitrick’s “compelling” result was in fact based on a highly questionable procedure that generated null proxies with very high auto-correlation and persistence. All these facts are clear from even a cursory examination of McIntyre’s source code, demonstrating once and for all the incompetence and lack of due diligence exhibited by the Wegman report authors.
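[The selection effect described above is easy to demonstrate in miniature. A toy sketch follows — my own illustration, not McIntyre and McKitrick’s actual code, and it does not implement short-centred PCA; it simply scores AR(1) “red noise” series by a crude hockey-stick index and compares the top 1% against a genuinely random sample.]

```python
# Toy demonstration of cherry-picking the top 1% of noise series by a
# "hockey stick index" (closing-period mean minus full-series mean, in
# standard deviations). Hypothetical parameters throughout.
import random
import statistics

random.seed(0)

def ar1_series(n=600, rho=0.2):
    """Low-autocorrelation AR(1) 'red noise' pseudo-proxy."""
    x, out = 0.0, []
    for _ in range(n):
        x = rho * x + random.gauss(0.0, 1.0)
        out.append(x)
    return out

def hockey_stick_index(series, blade=50):
    """How far the final 'blade' period rises above the series mean."""
    mu = statistics.mean(series)
    sd = statistics.stdev(series)
    return (statistics.mean(series[-blade:]) - mu) / sd

scores = sorted(hockey_stick_index(ar1_series()) for _ in range(1000))
top_one_percent = scores[-10:]              # the cherry-picked "sample"
random_sample = random.sample(scores, 10)   # an honest random sample

# The cherry-picked set shows a far more pronounced upward blade than
# the population it was drawn from:
print(statistics.mean(top_one_percent), statistics.mean(scores))
```

The point of the sketch: even pure noise yields some apparent “hockey sticks”, and displaying only the most extreme 1% of them makes the effect look far stronger than a fair sample would.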
(sigh) Well, I read the whole thing, summarized the part of their conclusions that I thought had a particular bearing on your argument and that you might find surprising, and explained how it fits with the paper you thought contradicted it. Here is the concluding paragraph in full:
Our analysis suggests that changes in global cyclone frequency, duration and/or intensity are closely related to the amount of heat pumped into—and available to be subsequently transported by the oceans. This relationship may have implications for changes in heat transport associated with past and future climate change. Extrapolation of our results suggests that future increases in tropical temperatures may result in increased dissipation, mixing, heat storage, and eventually heat transport. Moreover, this positive response in transport might feed back on climate by redistributing heat poleward, diminishing the Equator-to-pole temperature gradient, and raising global mean temperature. We have provided some evidence that cyclone-induced mixing is a fundamental physical mechanism that may act to stabilize tropical temperatures, mix the upper ocean, and cause polar amplification of climate change. It is not included in the current conceptual or numerical models of the climate system. Better representation of cyclone winds and the associated mixing in climate models may help to explain the still-vexing questions posed by past climates.
In a nutshell, these people are saying, like you, that tropical cyclones mix down a lot of heat. Like you, they’re giving thought to the effects this might have on global warming, given that warming oceans are expected to make more intense cyclones. But they’re arguing that if anything (big if so far), this would tend to amplify global mean warming, not reduce it.
You should be interested in their reasons for thinking so, since unlike you (and me), they are experts doing active research in the field. They may be wrong, but they’re likely to have thought this through a lot more carefully than you have. For instance, they have thought about where the heat would go after it’s mixed down below the mixed layer and before it has any chance of reaching the deep ocean.
But sure, if you like, please feel free to ignore anything in a paper that strikes you as counter-intuitive. I’ll feel free to treat your opinions on science accordingly.
Authors of report say, “We have provided some evidence that cyclone-induced mixing is a fundamental physical mechanism ”
This is the actual result of their work. And it is interesting.
However, their statement that this “may act to stabilize tropical temperatures, mix the upper ocean” includes the word ‘may’, though it is probably so, since it is a relatively obvious result of a mixing mechanism.
But then, “cause polar amplification of climate change” is a huge leap since the equatorial mixing is not even quantified as to depth, so that the coupling to the thermohaline circulation is not even a topic of the paper. So we have no idea how much heat gets to the poles due to this mixing. And we have no reason to think this would couple to the atmosphere if it did get there. What about, if it simply caused ice to melt?
They absolutely have not, from their own words, shown evidence that this would “–cause polar amplification of climate change.”
However, I spot with interest that they say about this mechanism, “–It is not included in the current conceptual or numerical models of the climate system.” So the fact that these people are active in the field is relevant to their knowledge of such things, or would seem to be.
But further it indicates they have no idea what they are talking about when they make the seemingly obligatory conclusion about climate change.
I hasten to add, I did not read the whole report, but simply go on the abstract, but here I only analyze the words you quoted.
But then you, CM, tell me that the authors thought about where the heat would go once it got below the mixed layer. Getting mixed below the mixed layer is a little perplexing, sort of self-contradictory, wouldn’t you say? But you don’t tell me what they thought.
I did not opt to pay $26 to read the rest of this paper since it did not sound particularly useful beyond what was in the abstract. Do you recommend I do that?
Sigh yourself. There has to be some discrimination in what we choose to study in depth. There should be a rule that people providing references should read them first.
Jim Bullis, Miastrada Co. @568 — Almost always you can find a copy of published papers freely available on the authors’ websites.
Comment by David B. Benson — 18 Nov 2010 @ 9:55 PM
One of the most common fallacies held sacred by denialists is that CO2 concentrations of 384 ppm are too low to have any impact on the climate.
The caffeine content of popular energy drinks, such as Monster and Rockstar, is 200 ppm or less. The average human body holds 10 gallons of water, so consuming a 12 oz energy drink would result in a caffeine concentration of approximately 2 ppm in the body.
Therefore, according to denialist logic, consuming about 200 of those 12 oz energy drinks couldn’t possibly have an effect on the human body, because the resulting caffeine concentration of roughly 384 ppm would be too low.
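[A quick back-of-the-envelope check of the analogy’s arithmetic, using the same round figures quoted above (200 ppm caffeine by mass, a 12 fl oz drink, 10 US gallons of body water) — the unit conversions are standard, everything else is the commenter’s assumed numbers:]

```python
# Dilution check: how many ppm of caffeine one drink contributes
# to the body's water. Round figures from the analogy above.
FL_OZ_G = 29.57      # one US fluid ounce of water, in grams
GALLON_G = 3785.4    # one US gallon of water, in grams

drink_g = 12 * FL_OZ_G          # ~355 g of drink
caffeine_g = 200e-6 * drink_g   # 200 ppm by mass -> ~0.07 g (~70 mg)
body_g = 10 * GALLON_G          # ~37,850 g of body water

ppm_in_body = caffeine_g / body_g * 1e6
print(round(ppm_in_body, 1))    # ~1.9 ppm per drink
```

So each drink adds roughly 2 ppm, and around 200 of them (some 14 g of caffeine, well past a lethal dose) would be needed to reach the “harmlessly low” 384 ppm of the analogy.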