Coauthors are invited for a paper on a topic at the bleeding edge of radiative heat transfer and animal physiology: the Underground Greenhouse Effect.
As some small furry creatures snooze through the winter in sealed burrows containing up to 13% exhaled CO2 and as little as 4% O2, how does the IR opacity of atmospheres enriched in CO2 by exhalation figure in the rate of heat loss in such environments as burrows, warrens, and, for that matter, sleeping bags?
Are any extant codes applicable to such problems?
I would be interested in anyone’s insights into a (maybe) hypothetical situation:
According to Shakhova (March 2011), the methane coming from the East Siberian Arctic Shelf is about 8 million tons per year. For simplicity I will round this up to 10 megatons. If we multiply this by the 105-times global warming potential of methane (Shindell 2006), we get about one gigaton of CO2-equivalent from this source per year (compared to about 30 gigatons of CO2 from all direct human activity).
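The arithmetic above can be sketched in a few lines; the figures (10 Mt CH4, a GWP of 105, 30 Gt of human CO2) are the comment's own round numbers, taken as given rather than independently checked:

```python
# Round numbers from the comment above: ~10 Mt CH4/yr from the East Siberian
# Arctic Shelf, a methane GWP of 105 (Shindell 2006), ~30 Gt CO2/yr from
# direct human activity. None of these are independently verified here.
ch4_megatons_per_year = 10.0
gwp_ch4 = 105.0
human_co2_gigatons = 30.0

co2e_gigatons = ch4_megatons_per_year * gwp_ch4 / 1000.0  # Mt -> Gt
fraction_of_human = co2e_gigatons / human_co2_gigatons

print(f"ESAS methane as CO2-equivalent: {co2e_gigatons:.2f} Gt/yr")
print(f"Relative to direct human CO2:   {fraction_of_human:.1%}")
```

So a tenfold "dramatic increase" would put this one source at roughly a third of the direct human CO2 forcing, which is why the question matters.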
Reports earlier this year were that there was a 'dramatic increase' in methane release from the Arctic, which prompted a sudden mission by US and Russian researchers, put together 'on short notice,' to investigate.
My question is, if ‘dramatic’ here ends up being an increase of an order of magnitude, how long would it take for this forcing to significantly affect temperatures, particularly in the Northern Hemisphere?
The answer involves at least two considerations that I can identify:
How fast would the methane mix into the broader atmosphere?
How long would it take for the heat to build up that would be held in the troposphere by this new quantity?
(I suppose that whether this is a one-time emission or a new and increasing level of regular annual emissions from the Arctic would also have an effect on the answer.)
Thanks ahead of time for any insights or speculation.
Russell, thinking about the CO2 column density within the burrow: thirteen percent is roughly 300 times the atmospheric concentration. So if the animal-to-burrow-wall distance is 10 centimeters, that means there is as much CO2 per unit area as in roughly 30 meters of free air. If we made our atmosphere have a uniform density it would be maybe 5 kilometers deep, so we have less than 1% as much CO2 opacity within the burrow as within the atmosphere. However, CO2 absorption lines are saturated near line centers, so we would still get some absorption/reradiation within the burrow's air. Assuming the air in the burrow is warmer than the walls, there would be some insulative effect, although I don't think it is too great.
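A minimal numeric sketch of the column-density estimate above; the 390 ppm free-air value is my assumption (the comment only says "roughly 300 times"):

```python
# Burrow vs. free-air CO2 column, per the estimate above. The 390 ppm
# ambient concentration is an assumed value, not from the comment.
burrow_co2 = 0.13              # 13% CO2 in the sealed burrow
ambient_co2 = 390e-6           # ~390 ppm in free air (assumption)
enrichment = burrow_co2 / ambient_co2          # roughly 330x

path_m = 0.10                  # animal-to-wall distance, 10 cm
equivalent_air_m = path_m * enrichment         # roughly 33 m of free air

uniform_atmosphere_m = 5000.0  # atmosphere at uniform density, ~5 km deep
column_fraction = equivalent_air_m / uniform_atmosphere_m

print(f"enrichment ~ {enrichment:.0f}x")
print(f"equivalent free-air path ~ {equivalent_air_m:.0f} m")
print(f"fraction of atmospheric CO2 column ~ {column_fraction:.2%}")
```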
With respect to #386 Unforced Variations: Oct 2011
I halved the concentration of the water/sugar mix to less than 270 ppm. There is still a net temperature increase of +0.1 C over the plain-water warm-up. Done twice with exact water weights set to 174 grams. This is as far as rudimentary equipment can go. Trying other solutions to mimic the greenhouse effect is OK as long as the experiment is repeatable; the solution must be the same before and after microwaving. David suggested unstable carbonated water, which changes in concentration with every heat spell.
For those following this little project, set room temperature to +23 C. A fever thermometer is ideal for this after a 20-second microwave run. The water temperature should increase by +16.5 C, making the reading the same as human body temperature, readable with a maximum thermometer, which is exactly what a medical thermometer is.
The idea of this is to prove physically to any contrarian who claims that trace constituents don't do much to the temperature record. The analogy is imperfect: water is homogeneous, as opposed to the hydrostatic atmosphere, where CO2 affects some layers more than others, unlike well-mixed sugar in water. However, the similarities are meant to be as close as possible without going to a lab. The beauty of this experiment is that it's repeatable in many ways. Do you doubt that trace constituents affect temperature? Try it out, learn for yourself.
Unfortunately, the current estimates of gas hydrate storage in the Arctic region are very poor and non-existent for Antarctica, and we don’t really know the sensitivity of hydrates to future warming. So, it is currently impossible to say with high confidence how methane responses will feedback onto future global warming.
The saturation is indeed interesting because, as hibernating critters both radiate and absorb in the 10-micron thermal band, the gas trapped around them may add radiative thermal gain to the suppression of advection and convection by their fur and the air-trapping materials lining their nests.
I agree the line saturation is important, but it seems an open quantitative question, as the ratio of the thermal radiative gain from trapped CO2 to the conductive loss arising from the gas mixture depends empirically on the combined advective and convective "R" value afforded by nest materials as well as the animal's fur.
I have a guy who's been telling me that, because the accuracy of weather station thermometers is generally recorded in integer degrees, it's not reasonable to express the global temperature to a precision greater than an integer degree. To quote him:
“Worse, however, is the real life debate that “over a decade, the temperature will rise 1/10 of F”. If the data was collected with accuracy of +- 0.5F, then you can see the spread in your (modified) example far exceeds the stated conclusion. If we measure 63.5F on a certain occasion, should we become alarmed at an increase of 1/10th F? It is smaller than the standard deviation, and this becomes obvious if more accurate instrumentation reveals that the average is actually 63.7 F and not 63.1 F. In this hypothetical, we’re actually seeing temperatures drop. It might not fit the desired model, but that’s how it would play out and a faithful scientist would allow the facts to prevail, or would refrain from faulty or premature conclusions unsupported by the data or the precision of the data.”
In other words, he's saying it's not reasonable to express the average global temperature with any decimal places because the measurements are only accurate to the degree.
I don’t have the statistical chops to refute this convincingly so I was hoping someone could help me with this. What can I tell this guy to convince him it’s reasonable to express the global temperature anomaly in tenths of a degree when the individual measurements are only accurate to 1 degree?
Remember that all models are wrong; the practical question
is how wrong do they have to be to not be useful.
— George E.P. Box & Norman R. Draper
EMPIRICAL MODEL-BUILDING AND RESPONSE SURFACES 74 (1987).
That’s a wake-up call, all right. And too close to home for this reader’s comfort. But, a pet peeve if I may? If the researchers want to be taken seriously in the drying Southeast Europe, they’d better update their maps. It’s been 20 years since Yugoslavia went the way of the dodo, and people in the successor states tend to have strong opinions about political borders.
Can someone provide me with the name of the purported law, which I have seen referenced in comment threads here, which states (to paraphrase) that no parody of fundamentalism can be so over-the-top that it will not be taken at face value by someone, unless it is accompanied by a disclaimer. Thanks
Hmm. Less-than-pleasant personal experience suggests that the ‘anti-greenhouse’ effects of exhaled water vapor on bag insulation outweigh anything the CO2 can do–even at levels well short of actual ‘saturation’. . .
[Response: Hence synthetic fill, hood drawstrings and semi-permeable membrane shells--Jim]
I would refer that person to the fact that the current long-term temperature rise is 0.3 degrees Fahrenheit per decade, not 0.1, that water vapor is an exponential function of temperature, and specifically on your question: that standard error is small through the use of multiple stations and multiple observations (divide by square root of number of observations).
Clearly your guy doesn’t have the statistical chops either.
Is he a sports fan? Maybe a fan of U.S. football? Ask him, "What's Adrian Peterson's average yards per carry?" When he says "4.8", ask him how that's possible, given that the yardage on any single run is only recorded to the nearest whole yard.
Ask him, "What was Ted Williams's lifetime batting average?" When he says ".344", ask him how that's possible, given that hits are either zero or one, only recorded to the nearest whole hit.
And don't bother to explain anything else to him or argue further. Just let him chew on it.
Hi Dave (in comment #8), asking about accuracy of weather station thermometers, generally recorded in integer degrees, and implications for accuracy of a global temperature.
Your friend is wrong, of course; it is basic statistics that when you have a lot of measurements, the standard error generally scales down by the square root of the number of measurements. Lots of measurements together can therefore give much greater accuracy than any individual measurement.
There’s a minor technical point to watch. People don’t define an “average global temperature”. They define an “average global anomaly”. This is the average CHANGE in temperature. Simplified, you take a whole bunch of temperature readings, at many different locations. Then you do it again, several years later. At each location, you calculate the change. Then you average the changes. The physical reasons for this have been explained here at realclimate several years ago, I think, but I can’t find a link. (Actually, you calculate a “norm” at each location, giving average temperatures AT THAT LOCATION ONLY for any given month. Then you convert any reading to the “anomaly”, or the difference from the norm. Then you average anomalies over the whole globe.)
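The norm-and-anomaly bookkeeping just described can be illustrated with two invented stations (all numbers hypothetical):

```python
# Toy version of the anomaly procedure: compute a per-station "norm"
# (long-term mean for that month at that location), subtract it from the
# current reading to get an anomaly, then average anomalies across stations.
# All numbers are invented for illustration.
norms = {"station_A": 18.2, "station_B": 25.7}     # long-term July means (C)
readings = {"station_A": 18.9, "station_B": 26.1}  # this July's readings (C)

anomalies = {s: readings[s] - norms[s] for s in norms}
mean_anomaly = sum(anomalies.values()) / len(anomalies)

print(f"per-station anomalies: {anomalies}")
print(f"mean anomaly: {mean_anomaly:+.2f} C")
```

Note that the stations can sit at very different absolute temperatures; only the changes are averaged, which is the point of working with anomalies.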
Here is a very simple example to show why your friend is incorrect. Suppose we have N pairs of numbers, x[i] and y[i]. Conceptually you can think of them as a reading of a temperature in two different years. Suppose there’s a normally distributed error in each value, with standard deviation S. The standard error in x[i]-y[i] is sqrt(2)*S.
Then add all the pairs of differences.
The standard error is now sqrt(2N)*S.
To get the average difference, you divide this sum by N. The standard error in your result is S*sqrt(2N)/N, or S*sqrt(2/N).
If N is around 20000, you get an error of S/100. That means the “global anomaly” in this simple case has two additional figures of accuracy over the individual measurements.
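A quick Monte Carlo check of the S*sqrt(2/N) scaling, with N = 20000 as above (stdlib only; S = 0.5 and the trial count are arbitrary choices of mine):

```python
import random

# Empirically verify that the standard error of the mean of N paired
# differences, where each reading carries Gaussian error of s.d. S, is
# S * sqrt(2/N), as derived in the comment above.
random.seed(42)
S, N, TRIALS = 0.5, 20000, 50

means = []
for _ in range(TRIALS):
    total = 0.0
    for _ in range(N):
        x = 10.0 + random.gauss(0.0, S)  # "year 1" reading with error
        y = 10.0 + random.gauss(0.0, S)  # "year 2" reading; true change is 0
        total += x - y
    means.append(total / N)

avg = sum(means) / TRIALS
empirical_se = (sum((m - avg) ** 2 for m in means) / TRIALS) ** 0.5
predicted_se = S * (2.0 / N) ** 0.5      # = 0.005 for these values

print(f"empirical SE {empirical_se:.5f} vs predicted {predicted_se:.5f}")
```

The mean difference comes out known to about two decimal places more than any single half-degree-accurate reading, exactly as the argument says.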
The actual global anomaly calculation is much more complex than this, and so too is the error analysis. The real problems show up with finding and addressing more systematic sources of error. A lot of work has been done on this; the most recent being the Berkeley group (BEST), which with much fanfare got the same basic result everyone else has obtained before them. There's some heavy statistical work going on to get an idea of the errors in calculated global anomaly figures.
Rounding errors on reading instruments are not really a problem, and although his objection sounds superficially plausible, it's a failure to understand introductory high-school-level statistics.
I used to debate with someone like this once. It sounds a lot like someone with initials GM. But in any case, you are most unlikely to be able to convince him to see the error. The best bet, in my experience, is to explain as simply and as clearly as possible, for any onlookers, the real mathematics of how rounding errors impact averages. In practice, the accuracy of global anomalies is much more strongly limited by systematic errors than by simple rounding errors. The final global anomaly, like pretty much any other average, is known to greater accuracy than any individual measurement.
Wayne, I sent you an e-mail to the address on your web-site. By the way, I used to know Andy Young who did the green flash page. I didn’t know he was still active. When I calculate the ppm for your recipe I get it coming in an order of magnitude lower. Your recipe is 0.1 grams of sugar in 174 grams of water.
The molecular weight of sucrose according to Wikipedia is 342 grams/mol, so 0.1 grams of sucrose = 2.9 × 10^-4 mol of sucrose. The molecular weight of water is 18 g/mol, so 174 grams of H2O = 9.7 mol of H2O. The ratio gives 3 × 10^-5 molecules of sugar for each H2O molecule, or 30 ppm. So this effect is due to a trace constituent present at a concentration a factor of 10 smaller than atmospheric CO2.
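For anyone who wants to re-run the mole arithmetic, here it is in a few lines (same masses and molecular weights as above):

```python
# Mole ratio of sucrose to water in the recipe: 0.1 g sucrose in 174 g water.
m_sucrose, M_sucrose = 0.1, 342.0   # grams, g/mol (sucrose, C12H22O11)
m_water, M_water = 174.0, 18.0      # grams, g/mol (water)

mol_sucrose = m_sucrose / M_sucrose
mol_water = m_water / M_water
ppm_sucrose = mol_sucrose / mol_water * 1e6  # sugar molecules per 1e6 waters

print(f"~{ppm_sucrose:.0f} ppm sucrose, molecule for molecule")
```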
My crucet (my biggest pot) only holds 7 quarts so I added a teaspoon of sugar to 7 quarts of water. I filled another pot with as much water as it could hold.
Then I did 5 measurements each. For some reason the pure water pot stayed a little warmer than the other pot; I don't know why. I filled identical plastic cups with water from a 1-cup measuring cup (spillage was probably my biggest source of variability). I measured the temp of the water in a cup, then heated that cup for 30 seconds and measured the final temp. Then I did that again and again for pure water and sugar water. I think it's pretty convincing.
Oh yes. My microwave has a little circle indentation in the bottom in the back left corner just slightly larger than the cup I used. I placed the cup in the center of that indentation each time (this is essentially what Wayne suggested). This is important and likely another source of variability. You’re in the near-field inside a microwave oven so you should expect strong variations in the amount of heating you get as you move stuff around.
The 'Suite101' guy has taken one of the four pictures from the satellite information (he's using Northern Hemisphere summertime, with forests in Siberia and Canada growing and soaking up CO2) and pretended it's the overall picture.
He links to a video at NHK that has the full set of four showing midway in the video.
Yesterday the GOSAT/IBUKI page had a tiny thumbnail image showing four different seasonal images, the same four you can see in the video linked from the 'Suite101' page. Today it has a single image.
Replication may now proceed with anyone who wants to try. I must thank Mr Pearson for an important job well done; I will rewrite the recipe for those who don't have a proper 0.1-gram-capable scale. This experiment is similar to our atmosphere, miniaturized in a plastic coffee cup.
Andy Young is surely very much the same great astronomer as he ever was, just a little older.
Re: JAXA, GOSAT, annual carbon cycle, and that utter moron O’Sullivan cherry-picking the NH growing season, and (to boot) suggesting that revisions to our understanding of regional CO2 fluxes somehow call into question the warming effect of CO2…
Anyway, apparently the GOSAT folks have some newsworthy findings. Would be nice to see the actual paper. The JAXA site indeed refers to a paper published Oct 29, but that is this paper (Takagi et al.) — which is relevant, but what it discusses is the uncertainty reduction achieved, and that’s what the color codings on their map represent, not the CO2 fluxes themselves. That paper does not show the four quarterly maps shown on the television news story that O’Sullivan garbled.
[Response: Correct, thank you. It's all about the ability of the GOSAT spectrometer to reduce the uncertainty in the surface-based flux measurements. Basically, it's most useful where those measurements are the sparsest. Possibly more on this later if time allows.--Jim]
I used two different cups and two different measuring cups to avoid transferring sugar water into the non-sugar water and vice versa. Wayne suggested using a turkey baster to transfer the water but I couldn't find ours; I am pretty sure that would've reduced my variability a lot. I wasn't super careful in pouring the water from the measuring cup into the plastic cup, and a little water probably always spilled back into the pot. In retrospect it would've been worth the small additional effort to be more careful when transferring the water to the cup in order to reduce variability from that source.

This set of 10 measurements took a surprisingly long time to perform (about an hour). It was late and I was tired when I started, and occasionally I discovered that I'd forgotten to take the initial temperature, etc., which contributed to the time it took. I think this would be an ideal experiment to perform with a child of the appropriate age, say 8.

I'm not sure why the sugar-free water stayed cooler than the sugar water. It might be that it simply came out of the tap cooler. I thought I had let it run long enough that I was getting the steady-state temperature, but maybe not. I did fill the tap water pot last.
Comment by John E. Pearson — 1 Nov 2011 @ 11:59 AM
The report from the Nobel Prize-winning Intergovernmental Panel on Climate Change will be issued in a few weeks, after a meeting in Uganda. It says there is at least a two-in-three probability that weather extremes have already worsened because of man-made greenhouse gases.
Get the gate-du-jour debunking bookmarks ready. “Himalayagate” will probably increase tenfold on Google in 3 weeks ;)
The IR bands of water vapor presumably contribute to reducing radiative transfer as well.
There was a polar bear rug in my grandmother's house that, though I was spared infant photography, I occasionally slept on as a child. As I was on the outboard side of the hide, it proved surprisingly chilly with a bare floor beneath.
I’m interested in the big picture of scientific methodology, particularly when it comes to climate models and computer simulations.
I had this nice picture of what we were doing with data assimilation when it comes to computer models and data retrievals. The picture went something like this: the reason we simulate is to help us get better measurements, and the reason why we measure is to make better simulations. We are learning about the state of the climate in one case (data retrieval) and we are learning about the causal structure of the climate in another (simulation).
My reasoning was that in the simulation context, data assimilation provides us with a nice way of constraining the physics of our model. At every analysis point (depending on the method of assimilation you are using), the trajectory of the model would be adjusted to fit actual observations in that area, thus preventing drift and providing accurate initial conditions for the next analysis interval. The expected values given the model trajectory are weighted against the observational values to provide us with the most statistically likely value given our background knowledge.
In turn the results of our model are then used to produce better measurements, particularly when it comes to data retrievals from satellites. In those data retrievals that employ inverse theory, we use model outputs to set our a priori values for the retrieval, and then compare the values provided by our spectral analysis to the background. Again we get the statistically most likely values.
This picture was however complicated when, speaking with an expert in atmospheric physics, I was told that modelers frequently won’t use the retrieval data in models because they don’t trust the error statistics, and thus do not want to introduce correlations between the assimilated data and their background assumptions (since the mathematics of the model assumes they are not correlated).
How much does this complicate my big picture? It does seem like the better our models, the better our retrieval data will be. Since this represents a large portion of our data globally – we are very dependent on our models for this kind of measurement. But it might seem that our models are only coarsely constrained by our satellite observations (more so with in situ observations).
Does this sound right? How do scientists see these practices (and others) fitting together to create a rational form of investigation?
Comment by JustaPhilosopher — 1 Nov 2011 @ 2:12 PM
The link to the John O'Sullivan article at suite101 now brings one to the website's main page. The article is no longer listed among his works there, either; the last article they have of his now is from May 2011. That's odd.
Twitter posters are reporting that the court has granted the motion for climate scientist Michael Mann to intervene in the UVA email case. I don’t know if this claim is true, but the hearing was today.
Tuning isn’t exactly the same thing as data assimilation (at least, I don’t think of them as the same, someone can correct me). I think of tuning as the comparison of model output to known data sets – and changing the parameters/physics in the model to ensure that the output matches the data set. The model, autonomously, will hit the data points.
Data assimilation is something different. It's a system that runs alongside the physics. It uses statistics to compare the modeled values to observations, and weighs them against each other to determine a likely value somewhere within the expected region. This likely value serves as the initial condition for the next time step in the simulation. The model is forced to take up values close to the observations (in some sense).
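The "weighs them against each other" step can be caricatured as a one-variable optimal-interpolation (scalar Kalman) update; real systems do this over enormous state vectors with full error covariance matrices, and all numbers here are invented:

```python
# Scalar analysis update: blend a model background with an observation in
# inverse proportion to their error variances. Numbers are invented.
def analysis(background, obs, var_bg, var_obs):
    """Minimum-variance blend of background and observation."""
    gain = var_bg / (var_bg + var_obs)   # weight given to the observation
    return background + gain * (obs - background)

x_bg, y_obs = 287.0, 288.0               # model vs observed temperature (K)
x_an = analysis(x_bg, y_obs, var_bg=1.0, var_obs=0.25)
print(f"analysis = {x_an:.2f} K")        # lands nearer the (more trusted) obs
```

Because the observation variance here is a quarter of the background variance, the analysis ends up much closer to the observation than to the model, which is the "forced toward the observations" behavior described above.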
That used to be true, but recently 3dvar and 4dvar data assimilation systems have been incorporated into global climate models.
[Response: Not when doing standard climate model runs. If you want to run a climate model in weather forecasting mode for debugging purposes, that makes sense, but you are doing a weather forecast, not a climate prediction/projection. - gavin]
Comment by JustaPhilosopher — 1 Nov 2011 @ 4:30 PM
Chris Colose said at #5 “Unfortunately, the current estimates of gas hydrate storage in the Arctic region are very poor and non-existent for Antarctica, and we don’t really know the sensitivity of hydrates to future warming.”
Thanks. Do you have a reference for the most recent and most reliable estimates for Arctic hydrate storage?
“So, it is currently impossible to say with high confidence how methane responses will feedback onto future global warming.”
Because of this uncertainty, has it been completely left out of all models?
(Thanks ahead of time for any light anyone can cast in my generally dim direction’-)
There has been some blog excitement recently about modellers rejecting 20th Century hindcast runs that they don’t like. As far as I can work out it appears to stem from this paragraph in Gent et al. 2011:
Next, a twentieth-century run from 1850 to 2005 is completed, and a decision on whether this is acceptable is made based almost exclusively on two comparisons against observations. They are the globally averaged surface temperature against the historical reconstruction and the September Arctic sea ice extent from 1979 to 2005 against the satellite era observations. This second comparison is the reason that the sea ice albedos are allowed to change after the components are coupled. If these comparisons are deemed unacceptable, then the process of setting up the preindustrial control run would be repeated.
Is this a common practice across all modelling groups? What might constitute an unacceptable run, and is there nothing that could be learned from those runs?
[Response: No this is not common practice. GISS for instance assesses the models based on the pre-industrial control, and all the 20th C transient simulations have been submitted to CMIP5. - gavin]
I should clarify for everyone else that Gent et al. 2011 is the model description paper for NCAR’s CCSM4 GCM (more acronym soup).
Could there be a good justification for rejecting certain model runs? It sounds like a practice that would leave the way open for confirmation bias but then it’s not made clear why there is an a priori belief that some runs might be unacceptable or what criteria are used for rejection.
The estimates are rather wide…for example Brook et al 2008 (in Abrupt Climate Change Final Report) suggested there is ~7.5 to 400 Gt C stored as methane in Arctic permafrost, a rather large uncertainty, and the sensitivity to warming is not clear either. There are also a lot of hydrates that are located at depths in soils and ocean sediments where anthropogenic warming will take place over millennia rather than decades.
The Shakhova paper is important for understanding the atmospheric methane budget, but it is a negligible contribution to the total global emissions, and the importance of permafrost seems far secondary to that of wetlands, agriculture, animal husbandry, etc. Furthermore, measuring emissions is not the same thing as *changing emissions,* much less attributing those changes to recent warming.
Other people (gavin for example) will have better insight as to whether the latest generation of models going into the AR5 have interactive permafrost and methane responses (there’s been a lot of development in dynamic vegetation and carbon cycle modeling, and without knowing the details I will still say I don’t really think any methane feedbacks should be believable at this point); this has generally been neglected to date, or at least is only the focus of specialized modeling studies (see e.g., Archer, 2004) without much meaning for AR4 21st century projections. But these generally agree that there is no evidence for any “catastrophic” methane threat in the near future. Warming of the deep ocean waters might be important for these sorts of feedbacks on multi-millennial timescales, and there’s been suggestions of the relevance of all this at the PETM (~ 55 million years ago) but this is a science still in its infancy.
NOAA’s global weather forecast systems have not used satellite retrievals for data assimilation for over a decade. Instead they use the raw satellite radiances and perform the forward model (atmospheric temperature to infrared radiances at top of atmosphere) and its adjoint during the convergence iteration steps of the data assimilation. So for weather models, satellite radiances are certainly used to great effect, as evidenced by the skill in the Southern Hemisphere now being nearly as good as in the Northern Hemisphere where there is much more conventional data, but inverse model satellite retrievals are not. See Derber, J. C. and W.-S. Wu, 1998: The use of TOVS cloud-cleared radiances in the NCEP SSI analysis system. Mon. Wea. Rev., 126, 2287 – 2299.
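The radiances-rather-than-retrievals idea can be caricatured in one variable: minimize a variational cost that compares the observed radiance with a forward model H(T) of the state, instead of inverting the radiance to a temperature first. The forward model, variances, and all numbers below are invented for illustration, not NCEP's actual operators:

```python
# Toy 1D-Var: find the temperature T minimizing
#   J(T) = (T - T_b)^2 / (2B) + (y - H(T))^2 / (2R)
# where y is an observed radiance and H maps state to radiance.
def H(T):
    return 1.0e-3 * T ** 2        # invented forward model: state -> radiance

def dH(T):
    return 2.0e-3 * T             # its derivative (the "adjoint" in 1-D)

T_b, B = 285.0, 4.0               # background temperature (K) and variance
y, R = 83.0, 0.5                  # observed radiance and its variance

T = T_b
for _ in range(200):              # plain gradient descent on J
    grad = (T - T_b) / B - (y - H(T)) / R * dH(T)
    T -= 0.5 * grad

print(f"analysis T = {T:.2f} K (background was {T_b} K)")
```

The observed radiance exceeds what the background state implies, so the minimization pulls the analysis temperature upward, with no intermediate "retrieved temperature" product ever being formed.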
“… the end-Permian extinction was a physiological crisis, selecting against genera with poorly buffered respiratory physiology and calcareous shells. Genera with unbuffered physiology also fared poorly in the Guadalupian extinction, consistent with recognition of a pronounced crisis only among protists and reef-builders and implying similar respiratory physiological stresses. Despite sharing a similar trigger, the end-Permian extinction was considerably more severe than the Guadalupian or other Phanerozoic physiological crises. Its magnitude may have resulted from a larger environmental perturbation, although the combination of warming, hypercapnia, ocean acidification, and hypoxia during the end-Permian extinction likely exacerbated the crisis because of the multiplicative effects of those stresses. Although ocean carbon cycle and evolutionary changes have reduced the sensitivity of modern ecosystems to physiological stresses, extant marine invertebrates face the same synergistic effects of multiple stressors that were so severe during the end-Permian extinction.”
They seem to be describing the process they used for tuning the two parameters (sea ice albedo and RH threshold for low cloud formation) in coupled mode to get acceptable reference runs. It doesn't seem like a scandal; more like a normal part of model development.
Comment by Rattus Norvegicus — 1 Nov 2011 @ 8:58 PM
Apologies, but I'm not sure exactly how to bring up the recent lecture (October 31st) by Matthew Ridley at the University of Edinburgh. It is not entirely on the side of the AGW proponents, but it argues a logical case and uses appropriate referencing. It also spends some time discussing confirmation bias. I'm sure there are points raised that you will be able to confirm or refute. I'm fairly sure you will have read it, but I draw your attention to it on the slim off chance you have not. It is often said that scientists don't really know how to communicate with non-scientists. Matt Ridley is a scientist who certainly does.
“The ‘Suite101′ guy is none other than John O’Sullivan, serial science distorter.” – 27
John O'Sullivan CBE (born April 25, 1942) is a leading British conservative political commentator and journalist, and currently Vice President and executive editor of Radio Free Europe/Radio Liberty. During the 1980s he was a senior policywriter and speechwriter in 10 Downing Street for Margaret Thatcher when she was British prime minister, and he remains close to Thatcher to this day.
O’Sullivan is Editor-at-Large of the opinion magazine National Review and a Senior Fellow at the Hudson Institute. Prior to this, he was the Editor-in-Chief of United Press International, Editor-in-Chief of the international affairs magazine, The National Interest, and a Special Adviser to British Prime Minister Margaret Thatcher. He was made a Commander of the British Empire (CBE) in the 1991 New Year’s Honours List.
He is the founder and co-chairman of the New Atlantic Initiative, an international organization dedicated to reinvigorating and expanding the Atlantic community of democracies. The organization was created at the Congress of Prague in May 1996 by President Václav Havel and Lady Margaret Thatcher.
He is known for O’Sullivan’s First Law (a.k.a. O’Sullivan’s Law), paraphrased by George Will as stating that any institution that is not libertarian and classically liberal will, over time, become collectivist and statist.
O’Sullivan currently resides in Decatur, Alabama with his wife Melissa and stepdaughters – Katherine, who is 23, and Amanda, 17.
Comment by vendicar decarian — 1 Nov 2011 @ 9:20 PM
Ian @ 49, thanks for the tip, is there a link? Matt Ridley is not a scientist. He is a writer and conservative businessman associated with the right wing anti-science group that pretentiously calls itself the Global Warming Policy Foundation.
David, the greatest care possible must be given to achieving the exact mass of water; the bigger the flask, the likelier the flasks differ in weight and volume as well. Tap water is also not preferred. Good for you for trying! I suggest repeating your samples after cool-down to room temp. This would be revealing.
Gavin, or a physical chemist: in terms of carbon, what would the 30 ppm mole ratio of sugar mixed in water (#22) work out to, given that CO2 has 11 fewer carbon atoms than sucrose? … I don't think 30 ppm sucrose in water is the same as 360 ppm CO2 in air, although it would be nice… Thanks in advance. Need it for the recipe introduction.
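For what it's worth, the carbon-for-carbon version of this question is just a mole ratio (using the ~30 ppm sucrose figure computed earlier in the thread; this is pure bookkeeping and says nothing about radiative equivalence of sugar in water versus CO2 in air):

```python
# Carbon-equivalent concentration: sucrose (C12H22O11) carries 12 carbon
# atoms per molecule and CO2 carries 1, so 30 ppm sucrose holds as much
# carbon as 360 ppm CO2. A bookkeeping identity, not a physical equivalence.
sucrose_ppm = 30
carbons_per_sucrose = 12
carbons_per_co2 = 1

carbon_equivalent_ppm = sucrose_ppm * carbons_per_sucrose / carbons_per_co2
print(f"carbon-equivalent: {carbon_equivalent_ppm:.0f} ppm")
```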
Thank you to the folks who answered my question. I now have some solid stuff to tell him, particularly about standard error. And I’m kind of embarrassed that I didn’t come up with the batting average example on my own.
Spencer’s Discover page seems to be unreliable, even compared to UAH’s own published satellite temperature timeseries (here: http://vortex.nsstc.uah.edu/data/msu/t2lt/tltglhmam_5.4). The good Dr. does not entertain correspondence on the matter, but it presumably reflects some processing deficiency in the real-time product.
For the record, the UAH LT timeseries currently plots slightly above trend (and very close to GISTEMP when both are re-based to the 1979-2008 means). Available UAH monthlies for this year are:
No, the O’Sullivan concerned above would presumably be the Slaying the Sky Dragon author O’Sullivan (b. 1961). The ex-Thatcher-advisor, RFE/RL editor and Hudson Institute fellow O’Sullivan (b. 1942) is a different person, despite vaguely related leanings on climate issues.
There is a story in this, though; the younger John O’Sullivan appears to be padding his resume like crazy, including passing himself off as a National Review columnist on the strength of his elder namesake’s publications. Intelligent conservatives might think that bad form. Personally, as a Buffy fan, I find his pretensions to ‘Slayer’ status even funnier.
“I should clarify for everyone else that Gent et al. 2011 is the model description paper for NCAR’s CCSM4 GCM (more acronym soup).”
Note that they used this procedure only with the 2-deg version of CCSM4, which wasn’t used in the CMIP5 experiments (and Gent et al. 2011 say they did it “once early on”, which suggests they noticed a large model drift). Even NCAR doesn’t have enough computational resources to repeat multi-century runs just to find the ones that appear “acceptable”!
Which of Mr Ridley’s points did you find particularly cogent? Addressing all of them would take a long time (and they have been addressed, ad nauseam).
Pete #51, “Matt Ridley is not a scientist, he is a writer”,
True, and it shows in the Galileo/hockey shtick talking points of the Edinburgh talk that Ian (#49) brought up (Bishop Hill has text). But he is a writer of some fairly bestselling popular science books, with a D. Phil. in zoology. Climate change is way out of his usual beat (animal and human behavior), and his understanding of the issue appears to be largely based on a book by some accountant (Montford), based in turn on the blog of a mining consultant (McIntyre). Still, he might be mistaken for someone who knows what he’s talking about, and cheap as his rhetoric is, I imagine it may be effective with audiences.
I write to announce my employment with my publishers, Suite101 was terminated today without prior notice or explanation and all my articles published over a two-year period with them are now removed from the Internet. [...]
He believes this may be in retaliation for his latest post about the “satellite data that contradicts carbon dioxide climate theory”. I would say, a spectacular bit of personal insight. In the blog reply CM mentioned, O’Sullivan also adds:
Perhaps a Japanese translation of the heading to the graphic would throw some light on this matter.
But perhaps also some verification of what a map says before claiming anything would avoid the need to throw any light in the first place. In any case all the heading states is a date in Japanese format: “Heisei era, 21 July (summer)”, Heisei being the current era.
Thanks Chris and Hank for your comments and links.
Chris wrote: “Warming of the deep ocean waters might be important for these sorts of feedbacks on multi-millennial timescales”
I realize that warming of the deep oceans would take millennia–though I (oddly?) worry about these time scales, too.
But my more immediate concern is the vast shallow ocean areas such as the East Siberian Arctic Shelf, some 2 million km² in area and averaging only 50 m in depth. How much methane is down there, and how easily could the warming and increasingly ice-free surface waters melt the frozen methane hydrates on the ocean floor, which, as I understand it, cap a much larger reservoir of free methane gas, and which Shakhova has said is destabilizing? Even if it is a relatively minor contributor at this point, how rapidly might feedbacks in these shallow waters amplify?
And to return to the original hypothetical (I hope) question:
If such destabilization does happen at high rates, say an increase by ten fold, how long would it take for the gas to work its way to the rest of the global (or at least Northern Hemisphere) atmosphere, and then how long would it take for this added forcing to make itself fully realized in global temperature changes?
If that is too much to ask, perhaps someone could help me with the broader question:
How long does it take global temperatures to ‘catch up’ to the forcing of ghgs? Are we seeing now the full forcing of CO2 (and other ghgs) emitted up to the point of five years ago? Ten? Twenty?…
> CM says: 2 Nov 2011 at 5:25 AM
> Vendicar #50, re: John O’Sullivan,
> No, the O’Sullivan concerned above would presumably be the
> Slaying the Sky Dragon author O’Sullivan (b. 1961). The …
> O’Sullivan (b. 1942) is a different person, despite vaguely
> related leanings on climate issues.
Folks, watch ‘Vendicar’ closely. He can’t be ignored, he’s a mixer. Posting a location and family details of “O’Sullivan” in #50 was bad. Posting the wrong family and location is abysmal. VD creates trouble for other people.
A lot of people worry about the very distant timescales, and in fact this was the subject of a recent NAS report. It is a very good read and gives a comprehensive review of the various timescales relevant for future global change.
Solomon S, Battisti D, Doney S, Hayhoe K, Held I, Lettenmaier D, Lobell D, Matthews D, Pierrehumbert RT, Raphael M, Richels R, Root T, Steffen K, Tebaldi C and Yohe G 2010: Climate Stabilization Targets: Emissions, Concentrations and Impacts over Decades to Millennia. National Academy Press:Washington 190pp. Publisher Link http://www.nap.edu/catalog.php?record_id=12877
I would read the Brook et al paper I cited in my last comment for a review of methane storage understanding, but my take is that your questions are too uncertain at this point for useful answers. As for timescales, once methane is injected above a shallow ground layer, I see no reason why it can’t make it to the atmosphere rapidly (not necessarily true for deep ocean hydrates), and the methane lifetime in the atmosphere is on the order of ~10 years, which is itself slower than the interhemispheric mixing time (allowing CH4 to be rather well-mixed globally).
As far as global temperatures go, they respond rapidly to increases in greenhouse gases, but the system does not relax to an equilibrium point for at least decades, with some sluggishness persisting on the timescale of centuries. The equilibrium timescale itself also depends on what the “true” climate sensitivity is, which is itself uncertain.
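A toy illustration of that lag (my own sketch, not from any of the papers cited): treat the surface as a single heat reservoir with sensitivity parameter lambda and response time tau, so a step forcing dF produces T(t) = lambda * dF * (1 - exp(-t/tau)). Both parameter values below are assumptions chosen only for illustration.

```python
import math

def temp_response(t_years, dF=1.0, lam=0.8, tau=30.0):
    """Temperature response (K) to a step forcing dF (W/m^2).

    lam: equilibrium sensitivity parameter (K per W/m^2) -- assumed value
    tau: e-folding response time (years) -- assumed value
    """
    return lam * dF * (1.0 - math.exp(-t_years / tau))

# Fraction of the eventual equilibrium warming realized after 10, 30, 100 years:
for t in (10, 30, 100):
    frac = temp_response(t) / (0.8 * 1.0)
    print(f"after {t:3d} yr: {frac:.0%} of equilibrium warming")
```

With a 30-year response time, only about a quarter of the equilibrium warming shows up in the first decade, which is the sense in which today’s temperatures have not yet “caught up” with today’s forcing.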
Everyday Fractals vs Extreme Red Herrings and the Climate of the Past
What’s happening? The old climate system is gone. All current weather is produced by the current climate system. The current climate system features much more ocean heat content, warmer sea surface temperatures, warmer lakes, expanded Hadley circulation, less Arctic ice cover, and bunched precipitation (rain and snow more compacted into shorter events). Drought and flood with bunched precipitation in between times and places overall amount to bunched precipitation on a global scale with almost a fractal pattern. The focus on extreme events alone rather than the global pattern is a distraction and something of a red herring isn’t it?
Now consider the statement “The chances are two out of three that climate change has already caused more extreme events.” Only two out of three? Estimate this: what is the chance that an average year or decade under our current climate system would not have more flooding and drought than a period of the same length, on average, during 1951-1980? Pending a long model run, if “slim to none” does not come to mind, hit me with the clue stick quickly!
V.D., who posted #50, is wrong. He has the wrong O’Sullivan. The denialist O’Sullivan has nothing to do with Radio Free Europe/Radio Liberty. John O’Sullivan the denialist is a nitwit who (like Lord Monckton and Pat Michaels) has appeared on the Kremlin’s English-language satellite channel “Russia Today.” People who work for RFE/RL tend not to appear on Kremlin mouthpieces. The denialist O’Sullivan seems to be British. He’s some nobody who just makes stuff up.
RFE/RL are very informed journalists and would never have someone like the denialist John O’Sullivan.
RFE/RL writes about climate change and how it affects Russia and the former Soviet Union. Sometimes I give them little tips about climate change issues in Russia.
I got them to write about some Russian “scholar” named Andrei Areshev who claimed that American climate scientists were beaming secret climate weapons at “some countries” (ie Russia) and CAUSING global warming. That happened during the fires.
Areshev claimed that American scientists are manipulating the climate, but the Russian Academy of Sciences is having a conference right now that seems to be exploring the idea of putting aerosols in the sky to cool the planet. They seem to be looking at the possibility of creating a little nuclear winter.
A really ignorant FBI white paper is giving credibility to a KGB defector who claimed in a book called Comrade J that nuclear winter was a KGB hoax. I told an FBI guy that I am having a little spat with that if the Russians didn’t believe in nuclear winter, they wouldn’t be trying to make one.
Usually I don’t criticise the FBI, but they are being pigheaded.
Realclimate should really cover this Russian Academy of Sciences conference. Hopefully the Russians will decide not to put aerosols in the atmosphere to cool the planet. For one thing, it won’t fix the acidification of the ocean. Alan Robock is going to the conference. Hopefully he will convince the Russians that making a little nuclear winter is probably not a solution to global warming. His talk is called “Smoke and Mirrors.”
Within an hour or so of reading the latest comments on this thread, I notice that Matthew Ridley’s lecture and John O’Sullivan’s misrepresentation of the Japanese satellite project are being vociferously copied-and-pasted by deniers on blogs everywhere as the latest evidence that the great global warming hoax has been exposed.
Referring to Dave Werth’s post about temperature being measured in integral degrees. The statistics of this have been explained by several posters, and are familiar to those of us who actually do measurements. My concern with this comment is the fact that an adult could conceive that an entire community of scientists could miss something so obvious. This expresses a contempt for expertise that is absolutely amazing. Also, it expresses a remarkable intellectual laziness; it would be very easy for this person to look into the way averages affect the errors of measurement. One might actually get to the point of exploring systematic errors, and how standards are maintained in a technological society. If one takes the time to dig a bit, one sees that there are lots of extremely honest people doing all this work, and these people have a level of competence in their disciplines that should be remarkable to the average non-scientist.
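The averaging point can be made concrete with a small simulation (my own sketch, with made-up numbers): readings rounded to whole degrees still pin down the underlying mean far more precisely than the one-degree quantization might suggest, because the natural scatter in the readings dithers the rounding.

```python
import random

random.seed(42)
true_mean = 15.37   # underlying "true" temperature (deg C) -- assumed value
n = 10000           # number of independent rounded readings

# Each reading has ~0.5 deg C of natural scatter, then is rounded to an integer.
readings = [round(random.gauss(true_mean, 0.5)) for _ in range(n)]
avg = sum(readings) / n
print(f"average of {n} integer readings: {avg:.3f} (true mean {true_mean})")
# The error of the average shrinks roughly as 1/sqrt(n), well below one degree.
```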
I’ve just read the text of the lecture. For me the most important part is that Matthew Ridley is underestimating climate sensitivity and overstating uncertainties in water vapour and cloud feedback, while conflating these issues with (what I consider to be) baseless concerns about very high sensitivity. The Hockey Stick debate is tired and boring, and concentration on the Hockey Stick was not a factor that persuaded me that AGW is likely to be a serious threat.
He is correct that nobody knows for sure whether we’re headed for a disaster if we continue to emit CO2. And he is correct that some overstate the risks. However he is not correct to say it is only models that give cause for concern. Since ditching my scepticism some years back I’ve been left with reading the science as a hobby. I’ve centred my reading on the Arctic as it’s the most exciting area of the science. The potential threats from the Arctic alone are:
*Methane release from permafrost and ocean clathrates.
*Increased mass loss from Greenland’s northern flanks
*Rapid changes in weather and atmosphere/ocean circulation (i.e. climate shifts) due to loss of Arctic sea-ice.
All of these matters are real issues as shown by science based on observations.
If people chose to take the risk and do nothing about CO2 emissions that is their choice. People are free to vote and spend as they see fit (virtually the only countries making what paltry efforts have been made are free-market democracies).
However, in making such decisions one must bear in mind the consequences of being wrong. Those who chose to take the risk of doing nothing will have to live with their decision if they are wrong, as indeed will we all.
All I’m saying is, personally I chose not to take the risk.
There is another point from the lecture that’s worth noting at this point – the unfounded alarmism of suggesting we’ll jeopardise our economy over action against CO2 emissions. There is no evidence that any government would seriously consider doing that – it would be political suicide.
Chris R wrote re: Matthew Ridley’s lecture: “He is correct that nobody knows for sure whether we’re headed for a disaster if we continue to emit CO2.”
No, Ridley is not correct. We are already experiencing destructive and costly disasters caused by the warming that has already occurred from the CO2 that we have already emitted.
And this is exactly why the denialist propaganda machine is going into overdrive to denounce, ridicule and attack anyone who talks about the connection between AGW and the ongoing escalation of unprecedented “weather of mass destruction”.
Chris R wrote: “And he is correct that some overstate the risks.”
Who are these “some” and exactly how are they “overstating” the risks?
And for that matter, considering that the plausible risks of continuing on our present course of ever-increasing GHG emissions include mass extinction of most life on Earth, how exactly can the risks even be “overstated”?
As far as I can tell, Ridley is just another denier spouting whatever talking points he can scrape up to argue against doing anything to reduce fossil fuel consumption. Given his track record in the British banking industry, I can’t imagine why anyone would listen to anything the man has to say.
I’ve skimmed over the PDF file of Matt Ridley’s talk only briefly, concentrating primarily on the physical science points of his presentation. He is very confused, and virtually everything in that section was wrong (which is presumably why the WUWT crowd is eating it up).
Just a few of the many problems
- He confuses the equilibrium climate response with the (expected) transient climate response to the current increase in CO2.
- He improperly dismisses the role of aerosols in offsetting some of the greenhouse warming. He states that the NH warming faster than the SH is evidence against aerosol impacts, since aerosols are concentrated in the North. There are many other factors at play in the N-S asymmetry, however, primarily the different land-ocean area contrast between the hemispheres.
- The rest is full of irrelevant distraction arguments. Climategate, the hockey stick, the irresponsibility of Pachauri on the glacier issue, etc are all interesting to some people, and the skeptical concerns about them have varying degrees of merit, but atmospheric physics does not care about any of this.
Climate sensitivity depends on a lot of things, but whether or not e-mails were hacked into is almost certainly not one of them.
[Response: ok, that's it for this topic--no further posts on Matt Ridley, etc. -moderator]
Chris R. says, “If people chose to take the risk and do nothing about CO2 emissions that is their choice. People are free to vote and spend as they see fit (virtually the only countries making what paltry efforts have been made are free-market democracies).”
Let us say, Chris that you own a gas station, and as you are pumping gas, I choose to light up a cigarette. Or let us say that we are in a flimsy rubber lifeboat on the open sea, and I choose to light a fire to warm myself. These actions are my choice to make, after all. Why should you object?
The climate doesn’t care who screws up. The CO2 could be from the US or China or Timbuktou. Our progeny will all suffer just the same.
Harvey @ #73,
The guy is a true believer. Among other things when I mentioned the Isthmus of Panama is around 3 million years old he said geologists have no clue about how old it is and radiometric dating was bogus. I didn’t bother arguing with him about that but the precision of statistics thing was so basic that I wanted to give him a solid answer. I haven’t heard back from him yet.
Are you being your usual condescending self, or do you believe that stratospheric influences on the entire troposphere have greatly increased in the past month pushing temperatures lower than previously observed?
#61, doskonaleszare – ‘Note that they used this procedure only with the 2deg version…’
Ah, yes, you’re right. I hadn’t connected those two dots before.
I didn’t think NCAR would be repeatedly performing runs for weeks or months on end until they found one they were happy with, which is why I’m not sure about Rattus’ description of it as a tuning exercise.
Still curious what would be deemed unacceptable though. The paper suggests they were prepared to reject a 1deg run if it was unacceptable, but it just didn’t happen.
RE: “[Response: People are more than welcome to discuss the effect of climate on crop yields in the new open thread as long as they actually stick to the topic, provide legitimate support for their statements that others can check, and steer clear of insults. Believe me, I'm as interested in the topic as anyone.--Jim]”
A climate ‘debate’ without the insults? Certainly an interesting theory, Jim. Shall we test it?
Earlier it was noted that the BEST study was showing about (arguably, to a degree – literally) 2C warming since the 1810′s. Whether it’s 2C or less (or more) doesn’t really matter. There has been (arguably) unprecedented climate change. I was asking whether or not now is seen as ‘better’ climate-wise than the early 1800′s because it’s warmer now. This led to ‘who decides what’s better’, and from there the ‘debate’ began on whether or not (weather or not?) climate change has already had – and continues to have – an adverse effect on crop yields. Specifically, corn yields in the U.S. and Canada.
I posted a link to the U.S. Dept of Agriculture’s web site. I chose the parameters of Total Bushels and Bu/Acre. I chose only those 2 so that historical/yearly production could easily be seen, and thereby increases and decreases in yields just as easily seen and compared. (One can choose from quite a wide range of stats, including all or just one of particular interest.) Looking at that data (1860′s – 2011), it clearly shows a steady and significant increase in yearly corn yields. No question. No debate. There is no other way to look at or interpret that data. Corn yields have increased steadily and significantly. And climate change is apparently to blame.
What I mean is: there is no evidence that climate change has been a ‘bad thing’ as far as U.S. corn production goes (any other crop?). Why are there still those who will insist that not only has it been a bad thing, it has been a ‘really’ bad thing, and include the inevitable ‘it’s worse than we thought’ scenarios?
Here’s a link to the USDA. This one shows yields (Bu/acre) over acres planted. You can see how total acres planted have gone down slightly since the 20′s but the yields have dramatically increased.
True or false: Climate change has been, thus far, beneficial to corn production in the U.S.
[Response: Fair enough, and appreciate you outlining how the discussion proceeded from your vantage point. Yields of most major agronomic crops have increased greatly over the last century, no question, and corn is certainly one of the textbook cases. Because corn is a C4 grass, it tolerates high temperatures better than those that are not (e.g. barley, wheat, oats). However, the mistake you are making is in attributing the increases to climate change alone. It is very well known that much of this increase is due to breeding (the development of hybrid corn lines in particular), and management (fertilization and irrigation in particular). These increases cannot be continuously linear indefinitely, because you will eventually run into a biological yield ceiling (asymptote) which makes each increment of increase harder to achieve than the previous one. You can't just look at what has happened to date and extrapolate it into the future; that's a big mistake in a biological system. That was Ray's point. Also, in order to defend your viewpoint, you will have to provide evidence that climatic changes have been the dominant driver of increases in US corn production--Jim.]
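A toy numerical illustration of the attribution problem described in the response above (entirely synthetic numbers, not USDA data): if yields are driven mostly by a steady breeding/management trend plus a small climate term, a naive fit of yield against time cannot by itself separate the two drivers.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1950, 2011)
t = years - years[0]

tech = 0.9 * t                      # breeding/management trend (bu/acre/yr), assumed
climate = 2.0 * np.sin(t / 8.0)     # small climate-driven wiggle, assumed
yield_bu = 40 + tech + climate + rng.normal(0, 1.5, t.size)

# A naive trend fit attributes everything to "time", lumping the drivers together:
slope = np.polyfit(t, yield_bu, 1)[0]
print(f"fitted trend: {slope:.2f} bu/acre/yr")
```

The fitted slope lands near the technology term alone, even though the data also contain a climate signal; distinguishing them requires modeling each driver explicitly, which is exactly the evidence requested above.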
Your analysis can be extended to most other crops. Thus far, everything has been working to increase yields: a longer growing season (fewer frosts), greater precipitation, and higher atmospheric CO2 concentrations. The only natural ingredient missing was fertilization, which farmers generously applied.
Similar results have been found for Aspen trees in the experiment in Rhinelander, WI.
[Response: No, your statement is highly simplistic. You've ignored the enormous impact of breeding and biotech for one. And not all climatic changes have been beneficial. Don't make statements without backing them up--I'll delete them.--Jim]
Barn E. Rubble,
To answer your question we would have to assess whether other parameters have remained the same–ceteris paribus. I think you will find the ceteris ain’t paribus. I would also recommend that we look at those areas where the effects of climate change have been notable–e.g. in TX.
barn E. rubble: “Looking at that data (1860′s – 2011) it clearly shows a steady and significant increase in yearly corn yields … Corn yields have increased steadily and significantly. And climate change is apparently to blame.”
Right. Because NO other factors except climate change have had ANY impact on corn yields during the last 150 years. Increased irrigation, increased fertilizer use, plant breeding, and mechanization have NOTHING to do with it.
barn E. rubble wrote: “there is no evidence that climate change has been a ‘bad thing’ as far as U.S. corn production goes (any other crop?)”
Any other crop? It takes really determined willful ignorance to be unaware of the impact of the Texas drought and the midwestern floods on agricultural production.
It’s really hard to refrain from “insults” when
[edit: I know it's hard. Refrain anyway]
[Response: Dan, the subject at hand is how climate changes (and by extension of discussion, other growth factors), have influenced crop yields, with particular emphasis on corn. You linked to an article about the effects of CO2 fert and ozone on aspen in a controlled experiment in Wisconsin. They're not related. Please stick specifically to the topic of the moment, backed with relevant evidence. CO2 fertilization is a topic I know something about, but it's not what we are discussing here.--Jim]
You contend that: ‘We are already experiencing destructive and costly disasters caused by the warming that has already occurred from the CO2 that we have already emitted.’
From my reading of the science and dabbling with tools like the NCEP/NCAR reanalysis, I think the following. It was only in the mid-1970s that the CO2-driven warming rose from natural variability to become the dominant factor in the progression of global average temperature (GAT). Since then there has been some indication of an increase in weather disasters, although reporting biases remain a problem with this. In some regions (e.g. the Mediterranean and Australia’s Murray-Darling Basin) the pattern of impacts from AGW is already rising from the envelope of natural variability. However, I do not see a global pattern that is strong enough to call a direct link to temperatures and thus conclude that we are seeing the start of a persistent trend of adverse impacts (changes due to modes such as the NAO/AO/PNA/ENSO may not persist under continued AGW).
That is my view. I am not in denial about AGW I just don’t think the evidence is as strong as you claim – not yet. To be clear I expect the signal of adverse AGW caused weather disasters to continue to rise above the noise of past and current weather. But I am far from convinced that this is happening now to a degree that it is overwhelmingly convincing.
No need to state the bleedin’ obvious. I am aware that we’re all in it together. The fact is that individuals are free to make choices, it is by making such choices that change will be effected by impacting the political and economic worlds. Do you have another way of making the necessary changes?
ChrisR, other specific events have been linked to global CC as being very unlikely to have occurred without this forcing: Arctic melt, European deadly heat wave of ’03, last years heat wave and fires in Russia. Do you have reason to say these were not made more likely by GW? How many of these attributions does it take before you would consider that a global effect on weather is underway?
Arctic melt? Click on my name, read my blog. Changes in the Arctic are the subject of most of the climate science I read and post on. The Arctic is a special case – regional factors are massively amplifying AGW.
For a start: Can you cite research that shows the 2003 or 2010 European Heatwaves were outside of the envelope of natural variability? Perhaps you can educate me.
The question seems to me to be whether there is a worldwide pattern of extreme weather impacts that are outside the range of pre-AGW variability. In the case of the post 1975 warming – that is clearly outside of natural variability and has no other explanation than human causation – on that issue case closed. Those I consider to be denialists are the ones who play the boring game of ‘it’s not warming’ ‘OK it is, but not so much’ ‘ and anyway it’s because of the sun/natural cycles/whaddeva delete as appropriate.
I do not consider this to be the case for weather impacts globally. The case is not as strong as the one for the post-1975 global warming. Even the impacts I have listed are still only just becoming exceptional, and the main reason for suspecting AGW is the set of factors involved in those weather events, e.g. Australia’s Big Dry.
The problem with your demand that I refute weather events as having an AGW cause is three-fold. Firstly, for most of these events we hear “the most severe in 50 years” or “the wettest in 100 years”, implying that similar events were happening well before the AGW signal emerged from natural variability in the global average temperature. Secondly, all weather now has a component of AGW, even the unremarkable, because the impacts of AGW, from thermosphere/mesosphere cooling down to surface warming and the increase in atmospheric humidity, are ever-present factors. Thirdly, it is for those making an assertion to support it with evidence and argument, not to demand I refute it a priori.
Taking the opposite approach, do you have any reason to believe that they were made more likely? These types of events have occurred throughout history. What makes you think they are more widespread now?
Today’s TNYT has an article on the (under-reported) flooding in Cambodia, actually much more serious than in neighboring (and richer) Thailand. The flooding in Cambodia is reported as greater than ever in living memory.
That’s not a very precise estimate of the recurrence time. Does anybody have a decent estimate?
Comment by David B. Benson — 3 Nov 2011 @ 10:36 PM
Not really, but the Guardian says “Like other countries in the region, Cambodia has been hit by heavy monsoon rains that have overwhelmed swollen rivers, dams and canals, causing the worst flooding in decades.”
There is little documentation about the extent of flooding in this area. However, I was able to uncover this report from Thailand. Drought and flooding in Thailand are highly tied to ENSO events (droughts during El Niños and flooding during La Niñas), and Thailand experienced an increase in droughts and a decrease in floods after the PDO shift in 1976.
The recent kerfuffle about AMO somehow got me thinking about the small difference in trends between surface and TLT records. Since it appears to be a mainstream position that some of the recent warming may have been driven by multidecadal ocean circulation changes it occurred to me that there may be a spatial difference in response between the near-surface and lower troposphere.
To make it a more succinct question: Has there been anything published on spatial atmospheric temperature responses to ocean circulation changes, such as the thermohaline circulation? I’m thinking something along the lines of Figure 9.1 in AR4.
I’m confused as to what you think the alternatives are! If you can’t persuade people to change their behaviour, with perhaps some help from increasing fuel prices (demand destruction), what else is there? I see no public appetite for a ‘war footing’ response, virtually everyone I know drives and flies, their response to rising fuel prices is anger – not a feeling that it’s for the best.
FWIW I think that if we crack AGW it will be because of new technology, in particular, success at implementation of fusion.
Chris R “if we crack AGW it will be because of new technology…”
…… or maybe extensive deployment will make current technology really, seriously cheap.
Where I live, roof renovation companies are offering solar PV as giveaway incentives for clients replacing/ repairing roofs. (Not so long ago, they were offering big plasma screens. A vast improvement.)
How much more will it rain? A starting point is to see how much more water vapor might be in the air. In the spirit of Ray Pierrehumbert’s “Principles of Planetary Climate”, wherein subsection 6.3.2 is an appropriate place to begin, let us first consider the http://en.wikipedia.org/wiki/Clausius%E2%80%93Clapeyron_relation
to use the August-Roche-Magnus approximation for the saturation pressure, sat(T), which is of the form
sat(T) = c * exp[-k(1/T - 1/T0)]
for constants c and k and where T0 is a convenient reference temperature. Think of T0 being about (273+15) kelvin. The temperature T is only slightly different from T0,
T = T0 + dT
for some increment in temperature on the order of 1–3 K. Then the exponent is
-k(-dT/[(T0+dT)T0]) ~ (k/T0^2)dT
demonstrating an (approximate) exponential increase with a dT increase in temperature; the various resulting constants are given explicitly by the August-Roche-Magnus approximation on the Wikipedia page for T0 = 273 K.
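As a numerical check (my own sketch, using the Alduchov-Eskridge constants for the August-Roche-Magnus form given on that Wikipedia page):

```python
import math

def sat_vapor_pressure_hPa(t_celsius):
    """August-Roche-Magnus approximation (Alduchov & Eskridge constants)."""
    return 6.1094 * math.exp(17.625 * t_celsius / (t_celsius + 243.04))

e15 = sat_vapor_pressure_hPa(15.0)
e16 = sat_vapor_pressure_hPa(16.0)
print(f"e_s(15 C) = {e15:.2f} hPa")
print(f"fractional increase per K near 15 C: {e16 / e15 - 1:.1%}")  # ~6-7 %/K
```

This is the familiar result that saturation vapor pressure grows roughly 6-7% per kelvin of warming near typical surface temperatures, matching the (k/T0^2)·dT exponent derived above.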
There certainly has been a great quantity of rain in the past couple of years; enough so that the sea level has detectably dropped. I read today with sadness about the flash flooding in Genova, Italy, of which I have fond memories from the previous century.
Chris R., you seem to think technology just “happens”. I can assure you from personal experience that this is not true. Humans have to make it happen. It takes investment and it takes time. Reduced consumption is not “the answer”. It is essential, but only because it buys time–time, I would note, that must be bought now because we did nothing for 20 years, despite the near certainty that such a new energy infrastructure would be essential for the continuance of human civilization.
Keep in mind that Clausius-Clapeyron gives only a constraint on the amount of water vapor that can build up in the atmosphere for a given temperature, before it begins to condense. It is a poor indicator of the way evaporation (and thus precipitation) changes in a warming climate, which depends primarily on constraints imposed by the energy balance at the surface (it can also be thought of in terms of the tropospheric energy budget, which is between latent heat release and net radiative cooling– there are some various camps of thought on this). Global precipitation actually goes up much slower than Clausius-Clapeyron, and is actually a rather useless variable due to the large spatial heterogeneity in where places get wetter/drier.
Chris Colose @114 — Thanks, and I suppose I should have written more qualifications. Professor Pierrehumbert’s section 6.4 does well in explaining the total energy balance, but it’s good to know there are ‘various camps’ on this. Later in “Principles of Planetary Climate” there is mention that GCMs don’t give water vapor increasing as fast as CC implies, but Professor Pierrehumbert still uses a percentage increase, i.e., still exponential with temperature.
Now, of course, one has to go and actually measure the precipitation in each locality. Various studies have been performed, the satellite data being especially useful. The studies I located certainly indicate increased precipitation at mid and high latitudes over the periods of those studies; obviously total precipitation has gone way up in the last two years. What remains a puzzle for me is that the same satellite data, globally averaged up to about 2008, shows no statistically significant trend. Putting these studies together, ipso facto, rain in the tropics, globally averaged, has declined.
Despite the imperfections of attempting to derive trends in precipitation from first principles (and so leaving lots out), I still think it fair to state that in localities which are not in the process of drying up, precipitation will increase approximately exponentially with increasing temperature.
This has an important implication for the use of rain intensity statistics in forecasting extreme events in each separate catchment. If you have a suggestion which you think superior to an exponential growth in rain intensity please do make and defend it. To be as clear as possible, I am only interested in the shape of the function for temperature dependence with the thought of using the statistics for each rain gauge or catchment to determine the constants for that locality.
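Determining the per-locality constants from the gauge record can be sketched in a few lines: fit I = I0 * exp(k*T) by ordinary least squares on log(I). The (temperature, intensity) pairs below are made up for illustration; real values would come from the gauge or catchment in question.

```python
import numpy as np

# Hypothetical (T, intensity) pairs for one rain gauge; real data would
# come from the gauge record for that catchment.
T = np.array([8.0, 10.0, 12.0, 14.0, 16.0, 18.0])   # deg C
I = np.array([5.1, 5.9, 6.7, 7.8, 9.0, 10.3])       # mm/h

# Fit I = I0 * exp(k*T) by least squares on log(I);
# polyfit returns [slope, intercept]
k, logI0 = np.polyfit(T, np.log(I), 1)
print(f"fitted rate k = {k:.3f} per K  (~{100*(np.exp(k)-1):.1f} %/K)")
```

The fitted k is the shape constant for that locality; comparing it against the Clausius-Clapeyron rate is then straightforward.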
Comment by David B. Benson — 4 Nov 2011 @ 10:36 PM
#113 Ray Ladbury,
I note you haven’t answered my question about your proposed solutions…
I work in electronics – I am all too aware that technology doesn’t just happen; as a metrology technician I support the people who make it happen. You may notice that in the Keystone XL thread I have linked to a paper by Armour & Roe about committed warming (which I have previously read) and some interesting slides by Roe (which I came across before posting) – I am all too aware of the time issue, but don’t see any easy answers to this.
Without long-distance transmission (e.g. from North Africa / Mediterranean) PV isn’t as attractive here in the UK and in the rest of Northern Europe. Local solutions are part of the answer, but I think if fusion can be cracked there is the prospect of undermining the cost benefits of fossil fuels.
David B. Benson,
I too watched the news footage of a square in Genoa (BBC) crammed with smashed vehicles. When I see such news I can’t help but wonder about AGW and if I’m seeing an AGW related disaster. But the sceptic in me demands more than suspicion before drawing a link I’ll argue for.
Assuming I can get past the evil ReCaptcha this time…
I could use help with a radiative-convective model. I keep getting a too-hot stratosphere. I’m using conventional composition for the stratosphere, plus the ozone profile from the 1976 US Standard Atmosphere. I’m taking O3 absorption coefficients from Chou et al. and O2 from Ditchburn 1962 and Inn & Tanaka 1953. I don’t understand what I’m missing.
Figures for 20 levels of atmosphere and the ground follow. Top is layer 1, bottom (ground) is 21.
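Not a diagnosis, but a sanity check that may help: a gray-gas radiative-equilibrium sketch (with an assumed global-mean OLR of 240 W/m², not the actual model above) gives the “skin” temperature that a purely longwave-driven upper atmosphere should approach. A stratosphere much warmer than this, beyond what realistic O3 shortwave heating supports, often points to an absorption-coefficient or band-averaging problem.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
OLR = 240.0       # assumed global-mean outgoing longwave flux, W m^-2

# Gray-gas radiative equilibrium: T^4(tau) = (OLR / (2 sigma)) * (1 + tau),
# where tau is longwave optical depth measured down from the top.
def T_eq(tau):
    return ((OLR / (2.0 * SIGMA)) * (1.0 + tau)) ** 0.25

T_skin = T_eq(0.0)  # tau -> 0: the "skin" temperature
print(f"skin temperature: {T_skin:.1f} K")  # ~214 K
# A pure-longwave stratosphere should relax toward this value; a result
# much warmer suggests spurious shortwave heating or mis-weighted
# absorption coefficients in the band averaging.
```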
Chris R @116 — What is the recurrence time for such rainfall events in Liguria? Just in Genova? Has that recurrence time become shorter in the past 50 years? If the answer is yes, then one can fairly attribute the shorter recurrence time to global warming; no individual rain events can be so attributed, of course.
The amount of damage to lives and property is disproportionately large as people, almost everywhere, fail to give proper respect to rare extreme events and build in the flood plain. I believe that is/was the case in Genova.
But, wili, that’s a headline. Headline writers, well, tsk. You know. I didn’t follow up that 2010 article to see if it got clarified; did you?
The article itself:
“… Dr. Rob Carver’s analysis of the statistical likelihood of the Moscow heatwave:
Now, let’s take a look at July 29, when it cracked 100 F at Moscow Shermetyevo. By our records, the reported high was 37.78 deg C and the normal high for that day is 20.0 deg C. According to GDAS, the maximum temperature for Moscow on July 29 was 35.85 deg C and the normal temperature according to CFSR is 20.6. Using the techniques of Hart and Grumm (2001), the climatological anomaly for maximum temperature is 3.9 deg C.
So, using the GDAS and CFSR data, the normalized anomaly of maximum temperature was +3.1. That’s near a recurrence interval of once per thousand years which matches the quotes I’ve heard from Russian met agencies. Now, if we assume the climatological anomaly derived from CFSR data is the same as that of the observations, the normalized anomaly jumps to +4.5, which translates into “less than once every 15,788”.
That, however, is a tricky assumption to make. We know that the climatic properties of CFSR and GDAS data have to have some correspondence with what actually happens in the atmosphere, otherwise weather models wouldn’t work. What becomes difficult to quantify (in the time constraints of writing for the public) is how the statistics of the climate properties line up between observations and reanalysis. And at these extremes, it doesn’t take much change in the average and standard deviation of a property to dramatically change how unusual an event is. Another possible source of error is the assumption that the climatology of CFSR is the climatology of the operational GDAS. Which is not a slam-dunk since NCEP shifted to a higher-resolution model on July 28. Now, I don’t have any information to say the post July 28 GDAS data has different climatological characteristics, but it’s a possibility. Another big assumption I make is that daily maximum temperatures follow a Gaussian (normal) distribution and that from 30 years of CFSR data, I can adequately characterize such a distribution….”
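The Gaussian part of Carver’s arithmetic in that quote can be checked in a few lines, under his stated assumption of one independent, normally distributed annual maximum per year:

```python
import math

def return_period_years(z):
    """Return period (years) of exceeding a normalized anomaly z,
    assuming one independent Gaussian draw per year."""
    p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))  # upper-tail probability
    return 1.0 / p_exceed

print(f"z = +3.1 -> roughly one event per {return_period_years(3.1):,.0f} years")
```

For z = +3.1 this reproduces the roughly once-per-thousand-years figure; at z = +4.5 the answer is extremely sensitive to the tail assumption, which is exactly Carver’s point about small changes in the mean and standard deviation.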
More reasonable, eh? Because you can get more out of the text than the headline writer can put into the headline.
Chris R., There are no solutions at present. None. Merely reducing consumption, as I said, is not a solution, but merely an exercise in buying time. It is by no means certain that we can even find solutions in time to avert severe consequences. The fact that the problem is difficult makes it all the more criminal that fossil fuel interests have successfully prevented society from even acknowledging the problem, let alone working toward solutions.
It is precisely factors such as building on flood plains and reporting issues that leave me with doubt about many such events.
There is other research for the UK (sorry but as my time is limited I tend to read locally on these issues). I read a paper from the Met Office (UK) last year on increasing contribution of heavy precipitation – which I can’t find at present. However there is another paper:
Evidence for trends in heavy rainfall events over the UK.
Osborn & Hulme. 2002. PDF
“The sign of the change is consistent with the simulated climate-change signal due to increasing greenhouse gas concentrations, though the magnitude of the observed change is greater than that expected from the model … We conclude, therefore, that there is some evidence that the recent winter changes have some component of a climate-change signal within them, but the evidence is not sufficiently strong on its own to suggest that the observed trends will continue with the same magnitude into the future.”
There may be no solutions at present, but we can still do the right thing: reducing our personal emissions, arguing in favour of what the science really shows. I can only think of one hopeful precedent: slavery was once a major de facto energy source; it was abolished in the British Empire in 1833, despite the cost implications to business. That was done because it became widely recognised that slavery was wrong.
Chris R @123 — Yes, finding statistical evidence for an increase in actual extreme rainfall events, as opposed to flood damage change, is quite difficult. However, the satellite evidence demonstrates increased (average) precipitation at mid and high latitudes; if the proportion which are extreme events does not change, one suspects an increase in the occurrence of rainfall events beyond design basis.
Chris R., There are interesting parallels to your arguments regarding abolition. Slavery was as wrong in the mid 18th century as it was in the 19th, and there was no shortage of decent people arguing against it either. One of the things that made it possible to abolish slavery was that technology advanced to the point where it was not economical. It was mainly in the American South, where wealth was concentrated in slaves, that abolition was seen as economic suicide.
I suspect that were we to develop a new infrastructure capable of perpetuating human civilization without fossil fuels that suddenly you’d find a lot more people who would identify minimizing climate change as a moral issue. It is an unfortunate characteristic of human beings that economics often subverts our moral sense.
Human contribution to more-intense precipitation extremes
Seung-Ki Min, Xuebin Zhang, Francis W. Zwiers & Gabriele C. Hegerl 2011
Extremes of weather and climate can have devastating effects on human society and the environment [1,2]. Understanding past changes in the characteristics of such events, including recent increases in the intensity of heavy precipitation events over a large part of the Northern Hemisphere land area [3–5], is critical for reliable projections of future changes. Given that atmospheric water-holding capacity is expected to increase roughly exponentially with temperature, and that atmospheric water content is increasing in accord with this theoretical expectation [6–11], it has been suggested that human-influenced global warming may be partly responsible for increases in heavy precipitation [3,5,7]. Because of the limited availability of daily observations, however, most previous studies have examined only the potential detectability of changes in extreme precipitation through model–model comparisons [12–15]. Here we show that human-induced increases in greenhouse gases have contributed to the observed intensification of heavy precipitation events found over approximately two-thirds of data-covered parts of Northern Hemisphere land areas. These results are based on a comparison of observed and multi-model simulated changes in extreme precipitation over the latter half of the twentieth century analysed with an optimal fingerprinting technique. Changes in extreme precipitation projected by models, and thus the impacts of future changes in extreme precipitation, may be underestimated because models seem to underestimate the observed increase in heavy precipitation with warming [16].
Hank, few can match your web search skills. I was just attempting to answer ChrisR at #97 “Can you cite research that shows the 2003 or 2010 European Heatwaves were outside of the envelope of natural variability?”
I’m not sure “envelope of natural variability” is really well defined enough to warrant an answer, but are you saying these events were well within the range of what would be expected without climate change?
wili @127 — I better understand (although not well) the Eastern Europe/Central Asia heat wave of 2010. That event, barring any climate change, appears to have a return time (period, recurrence interval) in excess of 1000 years; a purely statistical concept. So it might have happened anyway, just quite rare on statistical grounds. But with global warming? Well, Stefan has provided not one but two recent threads here on Real Climate to aid us in understanding that, yes, with climate change it is/was more likely (although still quite rare).
Comment by David B. Benson — 6 Nov 2011 @ 10:22 PM
Cross posted to WUWT tips, 12:36 am EST Nov 7, 2011.
I need some help here. In particular, I’m looking for people who have at least an undergraduate level of training in thermodynamics generally and energy flow specifically, or the self-taught equivalent. (I have studied energy transfer in a number of university courses, but that was many years ago now. My work often involves heat transfer calculations; that is, I get paid to get this stuff right, and sued if I get it wrong. I’m pretty confident in my ability to perform an energy balance, but I’m stumped by this one.)
Here, in a nutshell, is my conundrum:
Why is the ocean so cold?
Our friends at BEST, CRU, GISS, etc., going all the way back to Fourier, all agree that the “surface temperature” of the earth is more than 10 Celsius, or 283 kelvin. Further, it is hypothesized that the “surface temperature” of the earth has always been at least 10 Celsius and often more.
So how is it that the temperature of the ocean, which is in direct contact with this “surface” is at least 6 degrees colder than the “surface temperature”? Not only is the ocean in direct contact with this “surface”, but the earth itself is constantly shedding thermal energy into the ocean from the crust.
If someone can show me a complete energy balance that allows the ocean to be at a steady state temperature that is lower than the “surface temperature”, I would be grateful. If there isn’t such a balance, one of two things must be true. The “surface temperature” is colder than estimated or the laws of thermodynamics don’t apply to the oceans.
Just to be clear, it is an absolute certainty that the laws of thermodynamics apply to the oceans.
No hand waving allowed. I’ve seen a number of debating point style arguments. I would like to see some math on this. I’m working on my math on this. My first run approximation has the oceans boiling away a few billion years ago, so something is not right. If you are not sure how the oceans should have boiled away billions of years ago, add 0.1 W/m² of energy to a 4 km column of water for 1 billion years and determine what the temperature of the water should be. The 0.1 is lower than the approximation of the rate of energy transfer from the crust to the bottom of the ocean.
[Response: The answer can be found in any basic text book in oceanography and has been known since the days of Challenger expedition - the ocean is stratified by density. Colder water at a fixed salinity is more dense than warmer water. Therefore cooling of the surface (during winter, in high latitudes) is far more effective at producing denser water than warming the surface (at least at surface salinities experienced today). - gavin]
Your response amounts to hand waving. Can you provide a reference to one of these texts on oceanography that shows the energy balance with completed math?
[Response: The only way to have 'complete math' is to use a full ocean model with high-resolution inputs of atmospheric forcing fields and the time and space resolution to calculate the seasonality, ocean surface mixing, advection etc. Your best bet would be something like the ECCO project results. If you want something less than the 'completed math', you need to be more specific about the level of approximation required. Perhaps if you could specify what you find unsatisfying in the basic explanation, that would help. - gavin]
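For what it’s worth, the boiling-ocean back-of-envelope posed above is easy to reproduce, and doing so makes the point of the response: with no route for heat to leave the column, the answer is absurd, so it is the closed-column assumption that fails, not thermodynamics. A sketch with round-number seawater properties (assumed, not measured):

```python
# Back-of-envelope: 0.1 W/m^2 into a 4 km water column for 1 billion
# years, with no heat loss at all (the commenter's setup).
SECONDS_PER_YEAR = 3.156e7
flux = 0.1                          # W/m^2 geothermal (assumed)
t = 1e9 * SECONDS_PER_YEAR          # 1 billion years, in seconds
energy = flux * t                   # J per m^2 of sea floor

rho, depth, c_p = 1000.0, 4000.0, 4186.0  # kg/m^3, m, J/(kg K) (rounded)
mass = rho * depth                  # kg of water per m^2
dT = energy / (mass * c_p)
print(f"warming with zero heat loss: {dT:.0f} K")
# ~190,000 K: absurd, which shows the column cannot be treated as closed.
# Deep water is continuously renewed by cold, dense water sinking at high
# latitudes, so geothermal (and any other) heat is carried back to the
# surface and radiated away rather than accumulating.
```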
ChrisR, wili, re: “envelope of natural variability”
Until the end of the 20th century (20C), maximum seasonal temperatures across Europe mostly stayed within 2 to 3 SDs of their 1970–1999 climatology, with regional extreme summers clustering in a few decades of the last five centuries. During the 2001–2010 decade, 500-year-long records were likely broken over ~65% of Europe, including eastern Europe (2010), southwestern-central Europe (2003), the Balkans (2007), and Turkey (2001). These summers have considerably contributed to the upper tail of the European distribution of summer maxima (…). Thus, the percentage of European regions with seasonal maxima above 3 SDs (>99th percentile of the 1970–1999 distribution) has doubled within one decade.
(Barriopedro et al., 2011)
Excellent, thanks. The paper makes a good case that taken together the 2003/10 heatwaves in Europe were highly anomalous and outside the envelope of natural variability. Figure 3 is staggering.
I’ve given up on weather disaster stats as the best source seems to be EMDAT and that’s too subject to reporting bias to be highly persuasive, although it remains intriguing.
One minor issue – your link dead-ends anyone without a Nature subscription. The paper is Barriopedro 2011, “The Hot Summer of 2010: Redrawing the Temperature Record Map of Europe.” And a paywall free copy is available here.
The world is likely to build so many new fossil-fuelled power stations, energy-guzzling factories and inefficient buildings in the next five years that it will become impossible to hold global warming to safe levels, and the last chance of combating dangerous climate change will be “lost for ever”, according to the most thorough analysis yet of world energy infrastructure.
Anything built from now on that produces carbon will continue to do so for decades to come, and this “lock-in” effect will be the single factor most likely to produce irreversible climate change, the world’s foremost authority on energy economics has found. If this infrastructure is not rapidly changed within the next five years, the results are likely to be disastrous.
“The door is closing,” Fatih Birol, chief economist at the International Energy Agency, told the Guardian. “I am very worried – if we don’t change direction now on how we use energy, we will end up beyond what scientists tell us is the minimum [for safety]. The door will be closed forever.”
CM, Chris, thanks–that sounds like a very interesting paper.
On another topic, a milestone: my precis of Gwynne Dyer’s “Climate Wars” is just hitting its 1000th page view (thanks to a sudden surge of interest of unknown origin.) Thanks to RC readers for past support. . .
[Response: More nonsense. Mis-application of spectral calculations, post-hoc justifications for ridiculous physical mechanisms, huge overstatements of their importance: JASTP takes another step towards astrology. - gavin]
More nonsense. Mis-application of spectral calculations, post-hoc justifications for ridiculous physical mechanisms, huge overstatements of their importance: JASTP takes another step towards astrology.
I’m wondering if Roger actually reads the stuff that he cites these days, or just sees stuff that “enlarges the debate” (i.e. differs from the mainstream, no matter how crazy) and instantly posts it to his blog. His fawning over a crappy blog post by Bob Tisdale is further evidence of his descent into irrelevancy:
Think about the basic methods of heat transfer–conduction, convection and radiation. Convection is the most efficient, but requires actual physical mixing–and that ain’t happening. Conduction requires contact. Again, contrary to your assertion, there is no contact between the deep oceans and the “surface”. Rather, there is a temperature gradient and limited heat diffusion–just as across an insulator. And there is no radiation between the deep oceans and the surface.
Radiation is also one of the reasons why the surface is warmer–just as a thermometer in the sun registers warmer than one in the shade nearby. I wouldn’t give up on thermo just yet.
from “The King of Human Error | Business | Vanity Fair”
The human mind is so wedded to stereotypes and so distracted by vivid descriptions that it will seize upon them, even when they defy logic, rather than upon truly relevant facts. Kahneman and Tversky called this logical error the “conjunction fallacy.”
Their work intersected with economics in the early 1970s when Tversky handed Kahneman a paper on the psychological assumptions of economic theory. As Kahneman recalled:
> I can still recite its first sentence: “The agent of economic theory
> is rational, selfish, and his tastes do not change.”
> I was astonished. My economic colleagues worked in the building
> next door, but I had not appreciated the profound difference between
> our intellectual worlds. To a psychologist, it is self-evident that
> people are neither fully rational nor completely selfish, and that
> their tastes are anything but stable.
The paper that resulted five years later, the abovementioned “Prospect Theory,” not only proved that one of the central premises of economics, the so-called utility theory (“based on elementary rules (axioms) of rationality”), was seriously flawed, but also spawned a sub-field of economics known as behavioral economics.
Mis-application of spectral calculations……another step towards astrology. – gavin
I too am doubtful about Scafetta’s results. I have reconstructed natural oscillations superimposed on the 350 year long CET trend of 0.25C/century. http://www.vukcevic.talktalk.net/CET-NV.htm
There is no 60 year periodicity.
“If you are not sure how the oceans should have boiled away billions of years ago, add 0.1 W/m² of energy to a 4 km column of water for 1 billion years and determine what the temperature of the water should be.” John Eggert — 7 Nov 2011 @ 12:36 AM
You didn’t include the W/m^2 radiated away in the polar regions, and the global THC that moves heat from the tropics (where the energy balance between insolation and infrared radiation is positive) to the poles (where the outbound IR carries away more energy than the incoming solar energy).
“The human mind is so wedded to stereotypes and so distracted by vivid descriptions that it will seize upon them, even when they defy logic, rather than upon truly relevant facts. Kahneman and Tversky called this logical error the “conjunction fallacy.”
A flaw taken advantage of by the fields of advertising, propaganda, infotainment, and some religion; facilitated no doubt by repetition and availability heuristics. So the human mind, social and physical, is apparently prone to entropy and intractable messes, another reminder of how critical it is to communicate effectively, continuously, and energetically.
How much will it rain, a correction. From section 6.8 in Ray Pierrehumbert’s “Principles of Planetary Climate” we have the effects of the boundary layer upon latent heat (and so the associated water vapor). For temperature ranges of current interest the dependence seems to be a low-order polynomial rather than exponential. Even though the book says “slowly varying”, that is, IMO, only in comparison with an (approximately) exponential function; we can be sure the temperature dependence is greater than linear.
However, I had thought I understood the final figure of the SI for
Seung-Ki Min, Xuebin Zhang, Francis W. Zwiers & Gabriele C. Hegerl
Human contribution to more-intense precipitation extremes
NATURE VOL 470, 17 FEBRUARY 2011, 378–381 http://www.nature.com/uidfinder/10.1038/nature09763
but I understand now that I don’t understand.
Comment by David B. Benson — 11 Nov 2011 @ 9:50 PM
I have looked into this, analysing a number of indicators considered acceptable and widely available: the sunspot record, the Ap index, the Arctic’s magnetic field differential and McCracken’s data for the strength of the magnetosphere at the Earth’s orbit. No evidence was found for a consistent 60-year cycle.
@ Thomas Lee Elifritz — 12 Nov 2011 @ 11:09 AM re draining of Lake Agassiz into Lake Superior.
IMHO, it’s not a crisis in geophysics. The key quote from Colman et al is “The Thunder Bay area contains none of the features observed further north in Lake Superior, which have been interpreted as flood-related from earlier seismic surveys.”
Which means that Lake Agassiz did drain through the St. Lawrence to the North Atlantic, just not through Thunder Bay.
Plug the coordinates 49.04957, -88.34038 into maps.google.com, go to terrain view and zoom out – there are a lot of features that look erosional on a large scale. Also see “this.”
The Thunder Bay area contains none of the features observed further north in Lake Superior, which have been interpreted as flood-related from earlier seismic surveys.
Right, but they describe Nipigon floods as “post Marquette” as does everyone else. We know there has been lots of flooding there. The ‘Nipigon phase’ was much later than the Younger Dryas, when the ice sheet had clearly receded to that area. What is not clear is the extent of ice sheet decay at that time.
For water to have flowed from Lake Agassiz at the YD interface into Lake Superior, either there must have been a serious disruption of the ice sheet at that location, or there must have been a massive subglacial flow from the Lake Kelvin (Nipigon) basin as in a sub glacial lake discharge, literally cutting a channel under and through the ice. Somebody has to admit that the water must have flowed through the Nipigon feeder networks at 13 ka, and nobody seems to want to touch that claim. Did water flow through there at 13 ka, or not? I don’t see any way around this. It looks like a crisis to me.
A minor one, I’ll grant that. The data seems to disagree with the consensus.
I am just curious how one can look at the aggregate of the evidence and not be very concerned about the likely effects of climate change–particularly in a world that will have ~10 billion people and likely no source of cheap energy, fertilizer or organic chemical feedstock.
That one can accept that greenhouse gases are responsible for ~33 degrees C of warming and yet think you could double CO2 in the atmosphere without significant warming simply defies logic–as well as evidence.
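The ~33 K figure falls straight out of a zero-dimensional energy balance. A quick sketch, assuming the standard solar constant and a round 0.30 planetary albedo:

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W/m^2
albedo = 0.30      # planetary albedo (approximate)

# Effective (emission) temperature: sigma * Te^4 = S0 * (1 - albedo) / 4
Te = (S0 * (1 - albedo) / (4 * SIGMA)) ** 0.25

T_surface = 288.0  # observed global-mean surface temperature, K
print(f"Te = {Te:.0f} K, greenhouse effect = {T_surface - Te:.0f} K")
# Te ~ 255 K, so the surface is ~33 K warmer than the planet's
# effective emission temperature.
```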
Just a note here to say that there has been an increasing spam bombardment over the last week or so. This necessitates bulk comment deletion without wading through them to pick out the legitimate ones (as I try to do).
So, if you have any uncertainty about where your comment may have landed, probably best to send again. And of course, always execute a quick “copy” command before sending anything that you spent much time on. And we know that everyone spends *much* time on their comments.
Chris R @110, @116: “FWIW I think that if we crack AGW it will be because of new technology, in particular, success at implementation of fusion.” See: http://thinkprogress.org/romm/2011/11/08/363268/the-republican-brain-science-mooney
Chris Mooney’s new book: “The Republican Brain”
“Market forces won’t allow that” means “We want to keep the status quo.”
Any other excuse means “We want to keep the status quo.”
No matter what alternative you propose, the answer is “We want to keep the status quo.” But the answer is always in some sort of “code” [doubletalk] that makes the answer sound reasonable.
Thomas Lee Elifritz @162 — Sorry, I didn’t keep the link to the geology paper; the evidence looks fairly solid and there is exactly zero evidence of Proglacial Lake Agassiz draining eastward in the Moorhead phase. Further, there is a geology paper (which I linked towards the very end of the comments on the second-but-last Clovis Comet thread here on Real Climate) showing that the proglacial lakes along the Laurentide Ice Sheet well to the east catastrophically drained down the Mohawk (eventually to the Atlantic). And guess what? This was almost at the onset of the Younger Dryas. I take that as more important than the, IMO, overrated drainage of Proglacial Lake Agassiz.
The paper you cited by Rayburn et al. appears to indicate that Proglacial Lake Vermont also drained at about the same time, but the claim that Proglacial Lake Agassiz drained that way flies in the face of abundant evidence from modern Lake Superior east. Well, well. So another large lake drains into the Atlantic. Adding those to Baltic Ice Lake I, also draining into the Atlantic around the onset of the Younger Dryas, and quite the salinity crisis happened.
That makes four proglacial lake (systems) all draining simultaneously. Given that all were poised to drain, I suppose that is not entirely coincidental. But it does provide some modicum of weak support for the Clovis Comet hypothesis.
Comment by David B. Benson — 13 Nov 2011 @ 1:39 AM
#162, Thomas Lee Elifritz,
Your comments remind me of a long-held suspicion of mine.
It’s been years since I read on the subject of the Younger Dryas. When I last did I found myself persuaded by Wunsch – that changes in the THC were not responsible for DO events, and that events such as those and the YD could be explained by changes in wind fields. i.e. the consensus view has the tail wagging the dog, whereas it was changes in wind fields modified by the advance/retreat of ice sheets that impacted ocean currents, not the ocean currents impacting the atmosphere.
That’s pretty scary, I guess I can’t comment on that at all.
changes in the THC were not responsible for DO events, and that events such as those and the YD could be explained by changes in wind fields
That’s more reasonable, but certainly fresh water forcing and ice fields and atmospheric gas changes played a role on the general climatic response to the changes brought on by the slow changes in solar insolation and isostatic rebound. Many large proglacial lakes were draining in many different places at this time, and I’m not oversimplifying the problem by making demanding statements, I’m merely interested in the Lake Agassiz eastern flow and/or flooding part of the problem. You can see the mess of things it made up there, sediments made it all the way into Lake Michigan. The question remains, Nipigon at 13 ka, or not?
The history of the idea has been pretty well documented; it’s been one of those very attractive ideas that people find innately believable; some simply assert it must be so, others have tried to find evidence for it.
People get caught up on ideas and _believe_. You know other examples.
Many people take issue with the sparse carbon and beryllium isotope dating of heavily reworked and widely transported artifacts supplied by the Fisher and Lowell camp. Even the dating on the Athabasca and Mackenzie river channels has been questioned. I am not claiming there was no NW drainage of Glacial Lake Agassiz at 13 ka, although that too is certainly a reasonable debate to have. Even the Champlain Sea sequences have not been resolved enough for a definitive chronology yet, and the Nipigon problem is similar. These conflicts have simply reached the level of a minor scientific crisis.
This quite current main blog post seems to touch on all the main points, citing sources. A discussion worth having once in one place, rather than repetitions scattered over many different blogs simultaneously, I suggest.
I’ve more or less abandoned the YD impact hypothesis, although I’m still waiting on microlithology on samples and possibly a lake coring from the area. My interest is purely paleohydraulic at this point, including and especially ‘something considerably more subtle and cryptic’ i.e. – the Younger Dryas. Including the referenced geomorphism, the last remaining contentious issue with the eastern flows is, did they occur at 13 ka, and if they did, then where? I am left with evidence clearly pointing to anomalous flooding through Nipigon, where we now have an anomalous geomorphic feature.
The geomorphism itself can be discussed over at the blog posting you cite. Melosh’s work concerns itself mostly with head on classical impacts on airless worlds, where with the proposed YD impact would be a volatile rich oblique impact on a several kilometer thick ice sheet, creating a lateral subglacial blast wave. In that case the actual crater would be the much smaller 10 kilometer depression just to the west of Black Sturgeon Lake.
Since most if not all concerned claimed that never happened, I am thus left looking for alternative explanations of the paleohydraulic history of the Nipigon flood channels, particularly with regard to proposed YD 13 ka flows, for which there is mounting evidence that flooding actually did occur. It may be possible to refute that scenario as well, and I’m open to that too.
It’s just a minor crisis, I’m sure all those involved will get it sorted out.
jch – I just went to your graph page and changed the start date for the third series to 2001 – WOW, what a difference. I suppose ten years is considered a short time interval since we are now told that 30 years is the minimum for climatologists to see any significance.
So where is Ira Glickstein? Is he too busy fighting off the snarky comments from skeptics at WUWT?
Poor Ira posted a story on WUWT telling all the skeptics what a fun time he was having commenting here at RC & now he’s getting a right telling off, some more polite than others.
Interesting to read the likes of Willis Eschenbach saying “The appearance of any serious skeptics there (at RC) gives them credibility.” and “So let me go on record as saying I won’t be a useful idiot and participate in the RealClimate farce.” I suppose this is true – in my experience it is the exception to find an idiot useful.
For “John” (and Ira G. if he comes back)
— there’s a high-school-level explanation available at Grumbine’s blog.
You’re confused (“thirty years” or “ten years”) because you’re reading about different data sets. The numbers are different, so the calculations, done the same way to figure how many years/observations are needed, do come out with different answers.
It’s a pretty simple exercise. You can do the arithmetic to convince yourself that the numbers come out as people say (or that they don’t).
(and the links posted earlier may also help you decide who’s trying to fool you and who’s trying to help you do the arithmetic — hint, if people pick only a short time span to make a claim, be skeptical; if they use all the available information and show you their work — be skeptical, and check the arithmetic, or as R. Reagan put it, “trust, but verify.”)
And by the way, John (and Ira G.) — Robert Grumbine sums it up and offers you what you need to check this for yourself; links in several of those posts I gave you above. For example:
this will give you a decent start in reviewing arguments.
Trend analysis must be done over a long enough period
All relevant data should be considered.
If this results in a period shorter than normally considered long enough, a specific and strong explanation must be made
Trends cannot be computed by picking a single start year and a single end year and then ignoring all years in between
Average error is not sufficient. Squared errors should be used, or some other method which avoids the problems that ‘average error’ has.
Regarding that last point, I’ll suggest looking at the numbers yourself. I’ve made up a spreadsheet available in both Excel and OpenOffice formats for you ….
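The record-length point Hank and Grumbine are making can be sketched numerically. A minimal illustration (assuming uncorrelated year-to-year noise, which actually *understates* the record length that real, autocorrelated climate data requires; the trend and noise values are illustrative, not from any particular data set):

```python
import numpy as np

def trend_stderr(n_years, noise_sd=0.1):
    """Standard error (deg C/yr) of an OLS trend fitted to n_years of
    annual data whose year-to-year noise has the given std deviation."""
    t = np.arange(n_years)
    # OLS slope uncertainty for white noise: sigma / sqrt(sum((t - tbar)^2))
    return noise_sd / np.sqrt(np.sum((t - t.mean()) ** 2))

# A ~0.02 deg C/yr trend is marginal against 10 years of this noise,
# but stands out clearly against 30 years.
for n in (10, 30):
    print(n, trend_stderr(n))
```

With 10 years the slope uncertainty is comparable to the trend itself; with 30 years it is several times smaller, which is the whole point about needing longer records for noisier data.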
the long-term, net impact of aerosols on cloud height and thickness, and the resultant changes in precipitation frequency and intensity
Ten years will give you data on many, many, many cloud events so it’s not at all surprising that a shorter timespan is sufficient for them to come up with statistically significant results. They’re not just looking at weather rather than climate, they’re looking at impacts on individual clouds and drawing trends from that.
In contrast, detecting a climate trend takes a longer period of time because the trend is relatively small in magnitude compared to the natural variability in the system. If the trend were (say) a 20C rise per decade rather than 0.2C you probably wouldn’t need 30 years to tease out the trend from the noise … but it’s not :)
So, as Hank has said, different data sets (derived from different phenomena) require different amounts of data for useful statistical analysis.
John didn’t understand the basic info about statistics.
Once he’s read Grumbine he will understand that the calculation — how long a period is needed to detect a trend (how many observations, and how variable the data are) — comes out different if you start with different numbers taken from different data sets.
“On two occasions I have been asked, ‘Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?’ I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.” -Charles Babbage
Ray Ladbury @160
I’ve found Ira G. for you. It seems he’s still at WUWT saying he tried to post here at RC but it hasn’t emerged onto the thread. He assumes it hasn’t escaped from the clutches of Mr Moderation. He says he’ll try again but in the meantime he’s pasted the comment up at WUWT, all 780 words of it.
An interesting number that, as according to those 780 words Ira is not concerned by 780ppm of CO2 (ie 2x390ppm) because it would take until 2200 to reach that level at present rates (or 2100 at double the rate), and this doubling of CO2 would only increase global temperatures by 2 – 4.5 deg C (IPCC) or 0.5 – 1 deg C (other experts), of course plus or minus those natural cycles etc. In fact Ira is only worried about running out of good home-produced fossil fuels, which would put honest folk like himself at the mercy of foreign scoundrels.
Of course Ira used more words to say all that.
Folk may think it’s a bit of a cheek, me paraphrasing Ira like this. Then again, he did paste me onto WUWT without a by-your-leave.
Could the RealClimate moderators confirm whether or not Ira Glickstein had a comment blocked on this thread on 13/11/11?
I appreciate Ira Glickstein’s motives for commenting at RC, as evinced by his post at WUWT, were/are less than genuine (which perhaps part-justifies the post author editing Ira’s comment text). I appreciate that RealClimate is a commentary on climate science and not a commentary on climate scepticism. I appreciate the moderator’s job is pretty much a thankless one.
Yet RealClimate is being accused of blocking a “respectful” & “serious” comment that was invited by the RC readership.
So was Ira Glickstein blocked on this thread on 13/11/11? Or did his comment disappear into a months-old obsolete thread, as Pete Dunkelberg @183 noticed happened to Ira’s 11/11/11 comment?
[Response: Don't know but I told everyone a couple of days ago that I'm not fishing stuff out of the spam folder anymore, because there are too many messages in there to wade through. It's very likely that at least some legitimate messages have been deleted recently. On top of that though, he needs to learn that we're not here to answer the same old questions like he asks, that have been asked a million times before, and/or do peoples' basic homework for them. It's tiresome--Jim]
what do you guys make of this? this, from the abstract, is surprising:
“On the whole, the most scientifically literate and numerate subjects were slightly less likely, not more, to see climate change as a serious threat than the least scientifically literate and numerate ones.”
[Response: It would depend on what exactly people thought. If there is a group of less literate people who think that climate change means the end of world sometime next Thursday, while the more literate people are following the IPCC/mainstream thinking, you could conclude the same. This is part of it, but the other part is related to the polarization in the US on this issue - many educated and literate people on one particular side of the public debate are unfortunately taking their cues on this from their leadership, and spend time looking for reasons to suspect the mainstream conclusions and this is not balanced by literate people on the other side educating themselves on why the mainstream is very likely right. Note too that this is only a conclusion for the wider public; among the scientific community, the appreciation of the severity of the problem increases dramatically as you know more about the science. - gavin]
David Benson @ 185, this is the same study that was mentioned by John @ 174. These types of precipitation changes are predicted and observed, but the actual events exceed predictions; see for instance Allan and Soden 2008, Atmospheric Warming and the Amplification of Precipitation Extremes, although I think the same thing had been noted much earlier. So now we have the explanation.
Comment by Pete Dunkelberg — 15 Nov 2011 @ 7:46 AM
Walter Cain (#189) wrote: “this, from the abstract, is surprising …”
I would suggest that the very next sentence in the abstract makes the study less surprising:
“More importantly, greater scientific literacy and numeracy were associated with greater cultural polarization: Respondents predisposed by their values to dismiss climate change evidence became more dismissive, and those predisposed by their values to credit such evidence more concerned, as science literacy and numeracy increased.” [Emphasis added.]
Which raises the question, just what sort of “values” would “predispose” a “scientifically literate” individual to “dismiss” the overwhelming scientific evidence for ongoing, dangerous climate change, as represented by virtually all of the peer-reviewed literature in the field, the views of virtually all publishing scientists in the field, and the public statements of virtually every national and international scientific organization that has anything to do with climatology?
I would suggest that this is the direct result of a years-long propaganda campaign that has not only fed its audience a diet of pseudoscience (and pseudo-skepticism), but has also conflated that pseudoscience with relentless vilification of climate scientists as enemies of its audience’s ideological “values” (eg. liberty, capitalism, etc).
Comment by SecularAnimist — 15 Nov 2011 @ 11:51 AM
Yeah, the question is whether or not you’re “concerned” about climate change – not whether it’s a real phenomenon or problem. In the paper they use the terms “Hierarchical Individualists” (e.g., Republicans) and “Egalitarian Communitarians” (Democrats). I’ve read somewhere that the vast majority of scientists are Democrats.
In that study, the criteria for “scientific literacy” were very low. I would contend that their “scientifically literate” contingent simply had the sort of education one gets in a privileged Western democracy. It would not be sufficient to immunize them against the Dunning-Kruger effect.
I would also note that the test they gave concentrated on an understanding of scientific facts – and not on the ability to use them or to assess scientific validity.
The overwhelming preponderance of estimates–and of evidence–favors a sensitivity of ~3 degrees per doubling–and if this is wrong–there is far more probability above than below this value. People need to look at ALL the evidence, not merely grasp a few studies as a drowning man grasps at straws.
“Daniel Yergin was recently interviewed on NPR’s always informative Planet Money podcast. Yergin—most famous for his 1992 Pulitzer-winning opus on 20th century petroleum development, The Prize—has penned a sequel …, The Quest is a look at those who might have to clean up the whole mess. ‘The heroes are the engineers and scientists of the energy world — the geeks, in other words.’”
Anyone able to comment on these extraordinary suggestions?:
“We highlight the existence of an intriguing and to date unreported relationship between the surface area of the South Atlantic Anomaly (SAA) of the geomagnetic field and the current trend in global sea level rise. These two geophysical variables have been growing coherently during the last three centuries, thus strongly suggesting a causal relationship supported by some statistical tests. The monotonic increase of the SAA surface area since 1600 may have been associated with an increased inflow of radiation energy through the inner Van Allen belt with a consequent warming of the Earth’s atmosphere and finally global sea level rise. An alternative suggestive and original explanation is also offered, in which pressure changes at the core-mantle boundary cause surface deformations and relative sea level variations. Although we cannot establish a clear connection between SAA dynamics and global warming, the strong correlation between the former and global sea level supports the idea that global warming may be at least partly controlled by deep Earth processes triggering geomagnetic phenomena, such as the South Atlantic Anomaly, on a century time scale.”
“… the current warming is happening at least 10 times faster than anything we can find evidence of in Earth’s fossil record…. the current disturbance of the radiative balance is unique at least over the last 20,000 years ….”
John @174 — The length of time series required depends upon the variance to be explained. Records with considerable internal variability require longer records to resolve the underlying trend.
Comment by David B. Benson — 15 Nov 2011 @ 7:26 PM
(#175). So where is Ira Glickstein? …
Comment by MARodger — 14 Nov 2011 @ 11:08 AM
Here I am and here is my answer to:
(# 160). Hi Ira,
I am just curious how one can look at the aggregate of the evidence and not be very concerned about the likely effects of climate change–particularly in a world that will have ~10 billion people and likely no source of cheap energy, fertilizer or organic chemical feedstock.
That one can accept that greenhouse gasses are responsible for ~33 degrees C of warming and think you could double CO2 in the atmosphere without significant warming simply defies logic–as well as evidence.
Comment by Ray Ladbury — 12 Nov 2011 @ 8:02 PM
OK Ray (and MARodger), I accept the basic science of the Atmospheric “greenhouse effect” and that doubling of CO2 levels will, all else being equal, raise mean temperatures by 2-4.5ºC (IPCC) or 0.5-1ºC (some skeptics).
According to the Mauna Loa data, CO2 levels have recently gone up by ~2 ppm/yr, so, to double from the current ~390 ppm would take until ~2200. Even if the rate of increase doubled to ~4 ppm/yr, which is highly unlikely, we would not see doubling until the year 2100.
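The linear extrapolation in this comment is easy to make explicit (this is a sketch of the commenter’s own arithmetic only, not an endorsement of the linear-growth assumption, which later replies in this thread dispute):

```python
def years_to_double(current_ppm=390.0, rate_ppm_per_yr=2.0):
    """Years for CO2 to double from current_ppm, assuming the
    concentration keeps rising at a constant linear rate."""
    return current_ppm / rate_ppm_per_yr

print(years_to_double())                     # 195.0 years, i.e. roughly 2200
print(years_to_double(rate_ppm_per_yr=4.0))  # 97.5 years, i.e. roughly 2100
```

The whole argument hinges on the constant-rate assumption; if emissions grow exponentially, as Ray Ladbury argues below, the doubling arrives much sooner.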
From the Ice Core record, we know that natural cycles and variations have caused mean temperatures to be much warmer and colder over the past few hundreds of thousands of years. That was prior to the industrial age. It was a period when hominids and early homo-sapiens survived and flourished (along with polar bears and other animal and plant life). Thus, we know that natural cycles and variations not under our control may augment or diminish human-caused warming.
As you point out, if trends continue, populations will grow and sources of cheap (fossil) energy will deplete. Depletion of fossil fuels will self-limit further atmospheric CO2 increases. We have 100 to 200 years to adapt to carbon-neutral or carbon-free energy, and/or the 0.5º – 4.5ºC rise in mean temperatures we may see in the next century or two.
Am I concerned about fossil fuels and do I want us to do something about it? YES! Since 2003 my wife and I have shared a single automobile (a hybrid Prius) and an electric golf cart, I do 40-50 miles on my bicycle each week, we have an energy-efficient home with extra insulation and a programmable thermostat, we recycle to the max, etc. But frankly, when it comes to fossil fuels, I am more concerned about our dependence on unstable foreign sources and the blood and treasure we have to spend to assure access than about the long-term dangers of a bit of warming.
As for government mandates and subsidies, I used to support action on biomass, solar, and other alternative energy. However, I now realize the government is totally incapable of choosing the winners in a responsible way. Ethanol, as we now know, was a give-away of our money to powerful agricultural interests and corn-growing states, and has hardly saved any net CO2 while dropping my Prius MPG substantially. More recently, the half a billion dollar loan guarantee dropped down the Solyndra hole has political overtones.
That is why I (along with James Hansen :^) have long favored a revenue-neutral carbon tax, charged at the mine, well or port of entry, where it will be efficient to collect and hard to cheat on, with the proceeds returned, on an equal basis, to every legal citizen. That will boost the net cost of energy for those of us who use more than the average, and put money in the pockets of those who use less. High-carbon industries will have to raise prices and thus lose business and profits, which will motivate them to make the most rational choices and use carbon-neutral/free sources when that makes economic sense. Greedy self-interest will pump money into alternative energy research and development more efficiently than political payoffs.
I think that makes more sense than having politically-connected special interests wreck the economy on the questionable basis of fears of future warming that may or may not occur a century or two from now.
I hope this answers your valid and thoughtful questions, Ray, and I look forward to further constructive and respectful cross-discussion with you, MARodger, and other members of the RC community.
Ira, First, the preponderance of estimates–and evidence–favor a sensitivity of 3 degrees per doubling. If this estimate is incorrect, there is far more probability that it is higher than that it is lower. Second, there is the fact that CO2 will not increase linearly, but rather exponentially, increasing at least as quickly as energy consumption, doubling every 30-40 years. What is more, with China now growing rapidly, and India and Brazil not far behind, growth in energy may itself grow.
I commend you on your life decisions and on your decision to accept the science and propose solutions rather than denying the facts as your buddies at WTFUWT do. I do, however, think that the risks are far higher than you evidently do. I also see little or no progress toward either developing a sustainable energy infrastructure, nor toward energy independence. What I see is a species in denial and a future generation whose future I don’t envy.
For the record, Ira Glickstein @202, I’m with Ray Ladbury on this issue.
Who are these “some skeptics” whose work you rate as highly as the IPCC? In your original version, these were “other experts” – do these experts also ignore pre-2011 emissions when discussing the doubling of CO2? And you mention that during ice-ages mankind “survived and flourished.” I’m not sure of the relevance. Is this your vision of mankind’s future, to survive and flourish in a returned stone age?
And if you are minded to reply to this, Ira Glickstein, do remember RealClimate is about climate science which requires a tad more discipline than “reasonable” scepticism.
Ira Glickstein wrote: “More recently, the half a billion dollar loan guarantee dropped down the Solyndra hole has political overtones … political payoffs.”
Solyndra has “political overtones” only because it has been dishonestly politicized by those who wish to undermine the rapid growth of the solar industry — which is, in fact, the fastest growing industry in America and the fastest growing source of new electricity generation on Earth. There is no evidence of any “political payoffs” in the Solyndra loan guarantee.
Ira Glickstein wrote: “We have 100 to 200 years to adapt to carbon-neutral or carbon-free energy”
Not according to the International Energy Agency, whose just released 2011 World Energy Outlook calls for urgent action to reduce GHG emissions within the next FIVE YEARS, which finds that:
“Without further action, by 2017 all CO2 emissions permitted in the 450 Scenario [ie. limiting peak CO2 levels to 450 ppm, thereby limiting temperature rise to 2C] will be ‘locked-in’ by existing power plants, factories, buildings, etc … If stringent new action is not forthcoming by 2017, the energy-related infrastructure then in place will generate all the CO2 emissions allowed in the 450 Scenario up to 2035, leaving no room for additional power plants, factories and other infrastructure unless they are zero-carbon, which would be extremely costly. Delaying action is a false economy: for every $1 of investment avoided in the power sector before 2020 an additional $4.3 would need to be spent after 2020 to compensate for the increased emissions.”
The IEA adds:
“We cannot afford to delay further action to tackle climate change if the long-term target of limiting the global average temperature increase to 2°C, as analysed in the 450 Scenario, is to be achieved at reasonable cost. In the New Policies Scenario, the world is on a trajectory that results in a level of emissions consistent with a long-term average temperature increase of more than 3.5°C. Without these new policies, we are on an even more dangerous track, for a temperature increase of 6°C or more … rising fossil energy use will lead to irreversible and potentially catastrophic climate change.”
Two additional points:
First, as others have already pointed out, the claims of extremely low climate sensitivity by “some skeptics” are not credible and are inconsistent with empirical observation.
Second, in 2010 anthropogenic GHG emissions rose at the highest rate ever recorded, to the highest levels ever recorded, exceeding even the worst-case emissions scenarios contemplated by the IPCC. Without urgent action to reduce emissions, any hope of avoiding the most catastrophic consequences of AGW is rapidly fading.
The individual energy conservation measures that you describe are all well and good, but they are not sufficient — government and corporate policies to reduce GHG emissions (and indeed to make it easier and more affordable for individuals to take the actions that you describe) are essential, as the IEA makes clear.
Comment by SecularAnimist — 16 Nov 2011 @ 11:32 AM
Comment by Susan Anderson — 16 Nov 2011 @ 12:25 PM
THANKS Susan Anderson (#206), SecularAnimist (#205), MARodger (#204), and Ray Ladbury (#203) for your serious comments and questions. (Sorry for the length of this comment, but I am answering four postings at once.)
Susan, THANKS! I watched the amusing SolyndraGateApocalypse video and agree that there should be absolutely NO subsidies that amount to government picking winners. As the video points out, the lion’s share always goes to the most established interests with the most lobbyists, namely Big Oil, Big Power, Big Labor, and energy-rich states.
Solyndra is a nit (if half a billion can be so described) but appears more egregious because it went down the hole so soon after we taxpayers put money into it. These bad investments of public money (yours and mine) usually have the courtesy to wait till the next administration before going bust.
So, let us agree to let greedy, self-interested market forces, spurred on by an umbrella over carbon free/neutral energy provided by the Hansen/Krauthammer/WSJ Revenue-Neutral Carbon Tax, rather than letting the government pick winners. And, as I (and Hansen) describe it, it is a redistributive tax, taking from big energy users (usually the rich) and giving to small (usually the poor).
SecularAnimist, if you don’t see “political overtones” in Solyndra, you must be tone deaf. Just go to CNN for the timeline.
As for the FIVE YEAR deadline in your IEA links, that is for CO2 to reach 450 ppm, which would require the current +2 ppm/yr CO2 rate to increase by an average over the next few years of +12 ppm/yr, which is physically impossible. Yes, I know they are counting the CO2 in the “pipeline” to get to their doomsday estimates, but I do not put much credence in that.
My 200 year estimate is based on doubling from 390 to 780, and my 100 year allows for the CO2 rate to double from the current 2 ppm/yr to 4 ppm/yr.
MARodger: No, I do not want to go back to stone age conditions. My point is that hominids and early homo-sapiens (and polar bears and other animals and plant life) survived and flourished over multiple ~100,000-year cycles of much higher- and lower-than current temperatures and CO2 levels. Given our science and technology, almost totally lacking in ancient times, we will adapt. As the history of the Roman and Medieval warm times, and the Little Ice Age sunspot minima teaches us, environmental changes are adapted to. Some populations gain and others lose, with a net benefit to human civilization.
Ray Ladbury: In my comment, I allowed a CO2 sensitivity range of 0.5ºC to 4.5ºC (average = 2.5ºC, pretty close to your 3ºC estimate).
The fact is, that despite continued rapidly rising CO2 levels, natural variability seems (taking the midline of the error bounds) to have reduced the rate of warming over the past decade and a half to very near zero. Of course, it is possible for that to happen with CO2 sensitivity at 3ºC or even 4.5ºC, but quite a bit more likely if the actual sensitivity is on the lower end of the reasonable range.
The way the IPCC gets above 1ºC sensitivity is to assume that the net feedback effect of clouds (i.e., increased water vapor due to the warming) on temperatures is positive. If it is neutral, the calculations favor 1ºC and if net negative, lower sensitivities. Either side could be right. Time will tell.
I allowed that CO2 rise might double from the current 2 ppm/yr to 4 ppm/yr, which is how my ~200 year window for doubling got reduced to ~100 years.
Your most recent comment says that CO2 will grow exponentially “doubling every 30-40 years”. But in a previous comment, you wrote “…particularly in a world that will have ~10 billion people and likely no source of cheap energy, fertilizer or organic chemical feedstock.” [Emphasis added].
I assume by “cheap energy” you mean fossil carbon. So won’t carbon energy be depleted by the time the world population passes ~10 Billion (several decades at current rates)? Won’t lack of carbon energy sources retard and eventually reverse the growth of CO2 levels in half a century or less? So how can they double more than once, which I allowed for in my 100 year low bound on CO2 doubling? Where did I go wrong on this rather simple math?
Oh, Ray, I am a Guest Contributor at WUWT – you seem to have made a typographical error on that acronym :^)
On his WUWT post Ira Glickstein says: “The point of this posting is that, whatever the difficulties, it is possible for skeptics to post over at RC, so long as we are not too blatant about it, and if we are not too sensitive about our words being edited.”
Well, after pointing out factual errors in claims made by Anthony at WUWT, none of my comments have been posted on that blog for about a year.
Ira Glickstein wrote: “As for the FIVE YEAR deadline in your IEA links, that is for CO2 to reach 450 ppm, which would require the current +2 ppm/yr CO2 rate to increase by an average over the next few years of +12 ppm/yr, which is physically impossible. Yes, I know they are counting the CO2 in the ‘pipeline’ to get to their doomsday estimates, but I do not put much credence in that.”
Either you did not read the IEA report, or you did not understand it, because your account of it has nothing to do with what the report actually spells out.
And it is evident from all of your comments that you “put credence in” whatever claims support your predetermined conclusions, no matter how nonsensical or contrary to observed facts those claims may be.
Ira Glickstein wrote: “if you don’t see ‘political overtones’ in Solyndra, you must be tone deaf. Just go to CNN for the timeline.”
Yes, I am “deaf” to dishonest politicization of the Solyndra matter. The “timeline”, of course, begins with the Bush administration’s selection of Solyndra to receive a DOE loan guarantee, and the Bush administration’s efforts to push that loan guarantee through before Obama was even sworn in. Which, presumably, the Bush administration did to reward an Obama campaign contributor.
Sea-level rise due to climate change could cripple the city in Irene-like storm scenarios, new climate report claims.
The report, commissioned by the New York State Energy Research and Development Authority, said the effects of sea level rise and changing weather patterns would be felt as early as the next decade.
(# 208). On his WUWT post Ira Glickstein says: “The point of this posting is that, whatever the difficulties, it is possible for skeptics to post over at RC, …”
Well, after pointing out factual errors in claims made by Anthony at WUWT, none of my comments have been posted on that blog for about a year.
Comment by Andrew W — 16 Nov 2011 @ 2:35 PM
Andrew W, I am not a Moderator at WUWT, so I do not know if you were on some sort of no fly list over there. However, just today they posted your comment on my current WUWT topic where you claim WUWT “…simply bans people who can point out factual errors in claims made here.” The WUWT Moderator replies, “[Not sure about that, but I am under no such instruction. I will however snip personal abuse...]”
OK, your most recent comment was passed and the WUWT Moderator says your future comments will pass if they do not contain “personal abuse” which seems like fair dinkum to me.
My (unsolicited) advice is to try to post over there again, on topic with a “just the facts ‘mam” demeanor. While I have complained a bit about RC in my postings to WUWT, over here at RC I have scrupulously avoided any reference to those issues, and have been (almost excessively) courteous. If you post in that manner at WUWT, I suspect you will get through. If you think they have blocked any of your postings, please send a copy to me, Ira@techie.com, and I promise to look into it.
[Response: Discussing comment moderation issues at any blog are extremely tiresome. No more on that please. - gavin]
As it happened, by pure luck, Irene followed a path that spared New York City a massively destructive inundation (Vermont got clobbered instead).
The streets and buildings of New York City don’t have to be submerged for the place to become uninhabitable. All it takes is for the sewer system to be flooded. Millions of people in one of the most densely populated cities in the world — and all the toilets backing up. Not a pretty picture.
For Ira Glickstein — you said, or suggested, that you’re an engineer (back when you thought ten years was sufficient to determine a trend for annual climate data, see above). You got that straightened out, I think.
You might want to read something from the engineering side on CO2 amounts over time. This may help. http://www.nap.edu/catalog.php?record_id=10798
Authors: National Academy of Engineering, National Research Council
“… a simple quantitative estimate of when the greenhouse problem will become dangerous. It won’t be next year—but when? If we assume the greenhouse problem will become serious when the carbon dioxide (CO2) concentration in the atmosphere reaches twice the preindustrial concentration, it will happen sometime in the second half of this century, if current trends continue. Is doubling … entering the danger zone? Some have proposed a lower figure…”
As others pointed out above, your understanding of the numbers seems a bit off in this regard, as it was on detecting trends in noisy data. There are likely clearer expositions available, possibly even in the FAQ section; someone may have a suggestion.
(Folks, remember what we know works — don’t quote wrong information — point to correct information sources and encourage the readers to work through the examples, eh?)
Ira, First, you speak as if all values between 0.5 and 4.5 are equally probable. They are not. The probability that sensitivity is greater than 3 is significantly greater than the probability that it is less. The probability that sensitivity is less than 2 degrees per doubling is less than 5%, and the probability that it could be as low as 1 degree is nil. It is virtually impossible to get the climate system to look Earthlike if sensitivity is less than 2.
And no, cloud feedback in the models is not positive because it is assumed to be, but rather because the evidence suggests that it is positive.
As to your mathematical error, consider this: if the increase in energy consumption is x% per year, and current consumption is A, then dA/dt = (x/100)A, so A increases as exp(xt/100). The data show global energy consumption increasing at about 4.6-5% per year, and so doubling every 14-15 years.
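Ray’s doubling-time arithmetic follows directly from solving exp(xt/100) = 2 for t; a minimal numerical check (using the 4.6-5% growth rates quoted in the comment):

```python
import math

def doubling_time(pct_per_year):
    """Doubling time (years) for exponential growth at pct_per_year
    percent per year: solve exp(x * t / 100) = 2 for t."""
    return 100.0 * math.log(2.0) / pct_per_year

# At the 4.6-5%/yr growth quoted for global energy consumption:
for x in (4.6, 5.0):
    print(x, doubling_time(x))  # ~15.1 and ~13.9 years
```

The closed form 100·ln 2 / x ≈ 69.3/x is the familiar “rule of 70” for compound growth, and it reproduces the 14-15 year doubling stated above.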
I’ve done the math, and there is enough coal, oil, natural gas, tar sands, etc. to quadruple the CO2 content of the atmosphere to over 1000 ppmv. This would bring us, with high probability, to ~6 degrees above preindustrial levels – quite possibly this century. This is well into the danger region.
Chris R., There are no solutions at present. None. Merely reducing consumption, as I said, is not a solution, but merely an exercise in buying time. It is by no means certain that we can even find solutions in time to avert severe consequences. The fact that the problem is difficult makes it all the more criminal that fossil fuel interests have successfully prevented society from even acknowledging the problem, let alone working toward solutions.
Comment by Ray Ladbury — 5 Nov 2011 @ 6:52 PM
Bah-humbug, Ray. Of course there are solutions. And why is reducing consumption not one of them? You could use some edumacation in regenerative design.
Hansen says that by reducing extraction and use of FFs and moving agroforestry and farming to regenerative practices, we can get below 350 within decades. (Page 15, section 31. NOTE: I thought I saw a number for forestry in one of his publications not long ago.): http://www.columbia.edu/~jeh1/2010/201010_BluePlanet.pdf
“The way the IPCC gets above 1ºC sensitivity is to assume that the net feedback effect of clouds (i.e., increased water vapor due to the warming) on temperatures is positive. If it is neutral, the calculations favor 1ºC and if net negative, lower sensitivities.”
You’ve got some basic stuff confused here. Cloud is not water vapor, and cloud feedback is not identical with water-vapor feedback. The latter is beyond a doubt positive, as water vapor is a powerful greenhouse gas. Increased water vapor also comes with a negative feedback called the lapse-rate feedback. Scientists have a pretty good handle on their combined effect. What remains uncertain is the magnitude of the cloud feedback.
You don’t need positive cloud feedback to get sensitivity above 1 °C. Given neutral cloud, the water-vapor (less lapse-rate) and surface albedo feedbacks should get you to 2 °C (AR4, WG1, section 8.6.2.3). And neutral cloud does not appear to be what we’re given.
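(For readers following along, the standard feedback bookkeeping behind numbers like these can be sketched as follows. The gain values in this sketch are illustrative round numbers chosen to land near the ~1.9 °C neutral-cloud figure quoted above; they are not AR4's own feedback parameters:)

```python
def equilibrium_sensitivity(s0, feedback_factors):
    """S = S0 / (1 - sum(f)): amplification of the no-feedback response.

    s0 is the no-feedback sensitivity per CO2 doubling; each entry of
    feedback_factors is a dimensionless gain. The sum must stay below 1
    (otherwise the linear feedback analysis predicts a runaway).
    """
    f = sum(feedback_factors)
    assert f < 1, "net gain >= 1 would be a runaway regime"
    return s0 / (1 - f)

S0 = 1.2                       # illustrative no-feedback sensitivity, deg C per doubling
wv_lapse, albedo = 0.32, 0.05  # illustrative combined water-vapor/lapse-rate and albedo gains
print(round(equilibrium_sensitivity(S0, [wv_lapse, albedo]), 1))        # ~1.9 with neutral cloud
print(round(equilibrium_sensitivity(S0, [wv_lapse, albedo, 0.15]), 1))  # a cloud gain of 0.15 pushes it to ~2.5
```

The point of the algebra is that even modest positive gains stacked on a ~1 °C no-feedback response easily clear 2 °C, which is why neutral cloud does not rescue low sensitivity.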
“From the Ice Core record, we know that natural cycles and variations have caused mean temperatures to be much warmer and colder over the past few hundreds of thousands of years. That was prior to the industrial age.”
We also know that the periodicity of these natural cycles and variations is in the range 20,000 – 100,000 years.
“It was a period when hominids and early homo-sapiens survived and flourished (along with polar bears and other animal and plant life).”
Actually it was a period during which most of the Earth’s megafauna disappeared, although that is in my view probably due mostly to human activity rather than climatic change. But the implication is that climate changes of lesser magnitude are nothing to worry about. This is breathtakingly complacent: global ecosystems are already severely stressed by the presence of seven billion people, likely to grow to at least 9.5 billion, changes in dietary habits toward greatly increased global meat and dairy consumption, and the depletion of crucial resources such as fresh water, soil, fish, available phosphorus and key metals. Moreover, while Homo sapiens as a species flourished, we simply have no way of knowing how much human suffering resulted from climate change; what is clear is that in the modern world, quite small climate changes would threaten the livelihoods of hundreds of millions of people.
As a ballpark figure, the annual rise in CO2 levels has roughly doubled in the last 40 years (see a graph of recent CO2 rises here), and that is 40 years of big rises in fuel price. Such a doubling every 40 years would result in 800 ppm or so by 2100.
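(That ballpark reproduces with a simple year-by-year sum. The 390 ppm start and ~2 ppm/yr annual rise are the round numbers used elsewhere in this thread; the 40-year doubling of the rise is the comment's own assumption:)

```python
def co2_projection(start_year=2011, end_year=2100, start_ppm=390.0,
                   rise_ppm_per_yr=2.0, rise_doubling_years=40.0):
    """Accumulate an annual CO2 increment that itself doubles every rise_doubling_years."""
    ppm = start_ppm
    for t in range(end_year - start_year):
        ppm += rise_ppm_per_yr * 2 ** (t / rise_doubling_years)
    return ppm

print(round(co2_projection()))  # ~810: right in the "800 ppm or so" ballpark
```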
What to make of the responses of Ira Glickstein @207 to my own questioning @204?
It seems that climate sensitivity may turn out to be in the range 2.0-4.5 (UN IPCC) or 1.0-0.5 (some unnamed skeptics) but it doesn’t matter. Even if there is a tripling of atmospheric CO2, it doesn’t matter a jot because whatever the climate throws at us mankind will adapt, mankind will survive and flourish. The proof of this is that a tiny ignorant population of cave men survived and flourished in the face of past humongous ice ages despite lacking any modern technology. If that wasn’t enough proof, history teaches us that mankind adapts to climatic changes. Medieval warm times, Little Ice Ages – easy peasy. Okay there were some losers but such changes brought a net benefit to mankind’s adaptable civilisation.
Wow! Perhaps, if this is true, we should be adding even more to the climate change so as to reap more of those wonderful ‘net benefits’.
In summation, Ira Glickstein avoids two seemingly straightforward questions while explaining that manifest destiny will ensure it all works out just fine for us. He forgets that RealClimate is about climate science not his usual brand of “reasonable” skepticism.
“radar data has indicated the base of the sheet has been severely disrupted by water that has been freshly frozen, layer upon layer, on to the bottom of the ice column. ANTARCTIC GAMBURTSEV PROJECT (AGAP)”
“UNEXPECTED BASAL CONDITIONS UNDER ANTARCTIC ICE STREAM C DISCOVERED WITH A NEW BOREHOLE VIDEO PROBE
Barclay Kamb and Hermann Engelhardt (Caltech),
Frank Carsey, Lonne Lane, and Alberto Behar (JPL)
In a recent study of Antarctica’s Ice Stream C, a joint NSF-NASA research team has discovered a surprising gap between the base of the ice stream and the rock below. The gap is up to 1.4 m (60 inches) wide, and is filled with water at a pressure nearly high enough to lift the ice stream off its bed. The discovery is significant in relation to the mechanism for rapid movement of the ice streams, which are huge, fast-flowing ice currents within the slow-moving ice sheet that covers most of Antarctica. The mechanism for their rapid movement is being intensively studied because of the possibility that rapid ice-stream flow may cause the ice sheet to disintegrate, resulting in a disastrous rise in world-wide sea level….the width of the gap … = 1.5 m (59 inches)…. The horizontal dimensions of the gap were beyond the illumination from the probe’s sideward-shining floodlamp, which is thought to be about 0.5 m under the conditions of the water clarity that prevailed in borehole no.3.
The reason why I am more pessimistic than you is that I define the problem more broadly than simply climate change. In my mind it is a combination of rising population, rising expectations/development and resource depletion, along with the challenges posed by solutions to these issues. It is pretty much inevitable that population will rise to at least 9 billion, and probably closer to 10 billion by around mid century.
The key to holding it to these levels is increased development, and that will entail increased consumption by the majority of humanity. If their standard of living does not rise dramatically, fertility rates will not fall. High fertility and high poverty will itself pose serious environmental threats–complete deforestation, desertification, depletion and consequent loss of aquifers and other groundwater, growth of ocean dead zones and so on. To avoid these threats, consumption will actually have to rise, but given the lack of renewable energy alternatives, this will worsen climate change, further deplete critical resources and probably decrease overall soil fertility and agricultural productivity.
Now let us assume that everything goes according to plan. We hold population to 10 billion and it slowly starts to decrease. Then we face all the challenges of an aging and falling population (viz. Japan) but on a global scale. The only way to deal with these issues is increased productivity (per person). This, too, will require increased energy consumption (unless we can figure out how to harness Rosenfeld’s law the way we did for Moore’s law).
Now, of course, all this would be different if we’d started to tackle these problems in the mid ’70s as we should have, rather than another 40 years of “Bidness as usual”, but we didn’t. It is distinctly possible that now we are too late to avoid all but the absolute worst of outcomes.
Kevin McKinney (#222), MARodger (#221, #219), Nick Gotts (#220, #218), CM (#217), Ray Ladbury (#215), and Hank Roberts (#214) – THANKS for your comments and questions.
Hank Roberts: Statistical significance is a well-established concept that I used during my multi-decade career as a System Engineer and in my System Science PhD work. It was first brought into the RC Times Atlas thread when I paraphrased Phil Jones, former head of the UK CRU. You are correct that, in the case of mean surface temperature, a decade of data is insufficient to determine that the conclusion has less than a 5% chance of being wrong. However, that does not prevent us from looking at the midline of the temperature record error bounds and noticing that the rise seems to have flattened out over the past decade and a half. That could be a mistaken conclusion, but it is still more probable than that the rise remained the same as the previous decade.
Kevin McKinney: There are no guarantees, but the past is usually something of a guide to the future. We (humanity) will either cut fossil fuel use drastically -or- we will learn to adapt to some rise in mean temperatures -or- we will have another Little Ice Age, or something between those extremes. Given political and economic realities, which do you think is more likely?
MARodger: I did not say everything will work out fine for all of humanity “easy peasy”. Lots of people starved during the Little Ice Age and lots were killed by the Roman Legions during the Roman Warm period. Do you think humanity will drastically cut fossil fuel use any time soon? If not, we will have to adapt to climate change as we have in the past.
You are correct that the rate of rise in CO2 has gone from about 1 ppm/yr to about 2 ppm/yr over the past 40 years. If that rate of rise continues (unlikely due to depletion of cheap fossil fuels), it will be 4 ppm/yr in 2050 and 8 ppm/yr in 2090. You conclude that that will result in a CO2 level of up to 800 ppm in 2100. In my original comment, I said CO2 could double from 390 to 780 by 2100, so we are on the same page.
Nick Gotts: Warming causes relatively more water vapor in the atmosphere, causing relatively more clouds. Yes, I know when water vapor condenses into clouds it reflects incoming sunlight, so daytime clouds have net negative feedback, nighttime clouds have positive feedback, and both differ from uncondensed water vapor. There is net negative feedback from precipitation and storm events, and so on. The key issue is that additional water vapor, due to additional warming from rising CO2 levels, has an effect on CO2 sensitivity. Absent the water vapor/clouds/precipitation effects, CO2 sensitivity would be about 1ºC. The IPCC assumes net positive feedback and gets 2ºC-4.5ºC. Some skeptics assume net negative feedback and get less than 1ºC.
You say “…Actually it was a period during which most of the Earth’s megafauna disappeared, although that is in my view probably due mostly to human activity rather than climatic change…” Are you claiming that early homo sapiens, over 100,000 years ago, killed off most of the large animals, and that it was not due to the cold climate?
CM: Estimates of CO2 sensitivity are all over the place, with even the IPCC giving a range of more than two to one. Time will tell.
Ray Ladbury: You earlier said, given current trends, cheap energy (which I assume means fossil carbon) would not be available when the population reaches ~10 Billion. You now say there is enough to raise CO2 to over 1000 ppm, raising temperatures by ~6ºC over pre-industrial levels this century (before 2100). I think that is a stretch, but time will tell. Sadly you and I will not be around to see it.
Hank Roberts: In my earlier comment, I did say CO2 levels could double as early as 2100, and you quote a study that says sometime in the second half of this century, which means between 2051 and 2100. Possible, but not likely due to depletion of cheap fossil fuels and consequent development of effective carbon-neutral/free energy in response to market forces. But I could be wrong.
I am writing a little article about civilization in 2150. At this time, I predict a worldwide population of about 500 million, as nothing of substance gets done about CO2 levels until 2050. Comments appreciated — email@example.com
“If stringent new action is not forthcoming by 2017, the energy-related infrastructure then in place will generate all the CO2 emissions allowed in the 450 Scenario up to 2035, leaving no room for additional power plants, factories and other infrastructure unless they are zero-carbon, which would be extremely costly.”
So, change *BY 2017* is needed to avoid 450 ppm *BY 2035.* (Cross-checking, 2 ppm x 24 years gives an increase of 48 ppm over today’s 392, or 440 ppm. Not much acceleration needed, then, to hit 450 by 2035–and 24 years for carbon cycle feedbacks to kick in.) Yeah, you could say that’s “CO2 in the pipeline,” but there is clearly a limit to how fast you can replace massive amounts of expensive (and essential) infrastructure.
This is the problem in a nutshell–we need to act well BEFORE the consequences are obvious in order to avoid the worst damage.
(IMO, we’re already too late to avoid significant damage; for instance, I think there’s essentially no chance of preserving September Arctic sea ice at this point. Feedbacks from that eventuality seem not to be well-characterized, as far as I can tell, especially in terms of atmospheric circulation.
(But, hey, it could be worse than just losing September sea ice. Much worse, actually–6 or even 8 C warming over the planet as a whole, instead of (as at present) just over the mid-Arctic. Doing nothing would be a good way to get there.
(By the way, a “back of the envelope” question–what, I casually wonder, would be the CO2 feedback effect of, say, warming nearly the entire surface of the Arctic Ocean by (arbitrarily) 5 C for a month, assuming it were nearly all open water?)
Re: john burgeson says:
17 Nov 2011 at 10:52 AM
Here is something just up your street: The other day I extrapolated the temperature data for the Central England region up to 2150. It is based on the currently available CET data, with a 0.25C/century rise applied to the oscillation (the average over the last 350 years).
I was surprised by the result: http://www.vukcevic.talktalk.net/CET-NVa.htm
the extrapolation is based on this more detailed graph: http://www.vukcevic.talktalk.net/CET-NV.htm
The natural variability law is derived directly from the CET data and may be applicable to the North Atlantic only.
The projection is also confirmed by my solar formula, devised some 8 years ago, http://www.vukcevic.talktalk.net/NFC7a.htm
when NASA was predicting high solar activity for the most recent cycle. To my astonishment, this turned out OK. It is based on the little-known polar magnetic field data, as shown here: http://www.vukcevic.talktalk.net/LFC2.htm
I have no opinion on any of this; I just apply data and put results on line.
Fiction? Maybe; so is your article, I assume.
Ira (#228, in response to my #217 and Nick’s #220),
Sorry, you don’t get to go “estimates… are all over the place… even the IPCC… time will tell.” That’s not the point. The point is that you goofed by conflating cloud feedback with water vapor feedback. These are best kept distinct, for various reasons. In particular, the uncertainty of the combined water vapor/lapse rate feedback is not nearly as wide as the cloud uncertainty, though your statement would suggest to the unwary reader that it is.
Moreover, you goofed when you claimed that only positive cloud feedback accounted for IPCC’s climate sensitivity estimate being over 1 °C. I showed you chapter and verse of the IPCC report where it clearly says otherwise. The IPCC no-feedback point estimate of one degree was good enough for you when you were misrepresenting it, so when your misrepresentation is pointed out, it really will not do to reply that the IPCC gives a range anyway. Besides, the range it gives is clearly stated in the reference I gave you: “…in the absence of cloud feedbacks, current GCMs would predict a climate sensitivity (±1 standard deviation) of roughly 1.9°C ± 0.15°C (ignoring spread from radiative forcing differences).” That range does not support your claim at all.
C’mon, stop swerving and say it, “I goofed.” You’ll feel better, and we won’t be tempted to impugn your learning ability or, heaven forbid, your honesty.
Ira Glickstein @228. Your position is changing by the day. @202 we had ice age men entering the discussion for reasons that were beyond me. They reappeared @207 along with some Romans and looked to be presented as evidence of man’s ability to adapt to climatic changes. “As the history …… teaches us, environmental changes are adapted to. Some populations gain and others lose, with a net benefit to human civilization.”
Now @228 you are strongly questioning man’s ability to avoid climate change, that humanity will thus be forced to adapt and emphasising that it will not be fine for everyone – no sign of ice age men & no mention of ‘net benefit’ although the walk-on straw man was shot down.
Perhaps it would be best to concentrate on straightforward things that cause less confusion. What of the “some skeptics” (that you initially termed as “other experts”), those who calculate climate sensitivity at 1.0 to 0.5 deg C – they remain unnamed, although I note you cite these “other experts” again here @228. And are these “other experts” responsible for your ignoring pre-2011 emissions when considering doubled CO2? Answers still awaited on these from @204. These are answers that should be easy peasy.
“We (humanity) will either cut fossil fuel use drastically -or- we will learn to adapt to some rise in mean temperatures -or- we will have another Little Ice Age, or something between those extremes. Given political and economic realities, which do you think is more likely?”
Well, #3 is unlikely any time soon, I think, so I’ll say that a combination of #1 and #2 is most likely.
I’m advocating as forcefully as I know how that we do as much of #1 and as little of #2 as possible, since available evidence indicates that this will be most likely to bring about the least deleterious results.
How about you, since we’re getting all friendly now?
john burgeson @229 — Too large by a factor of at least 50.
Comment by David B. Benson — 17 Nov 2011 @ 5:41 PM
According to this, at the time it was written, the anthropogenic component of atmospheric CO2 was doubling every 31 years. It’s currently ~110 ppm, so it would reach ~220 ppm by 2042 and ~440 ppm by 2073, for atmospheric levels of ~500 ppm in 2042 and ~720 ppm in 2073.
A lot of people around the net are claiming this is simply impossible. Ray Ladbury indicates it is possible.
Is there enough FF to do it? I do not think FF will have to be cheap in order to do it.
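(The doubling-every-31-years projection above is straightforward to reproduce; the sketch below assumes, as the quoted arithmetic implies, a ~280 ppm preindustrial baseline plus a ~110 ppm anthropogenic excess in 2011:)

```python
def projected_co2(year, base_year=2011, preindustrial=280.0,
                  anthro_ppm=110.0, doubling_years=31.0):
    """Total atmospheric CO2 if the anthropogenic excess doubles every doubling_years."""
    return preindustrial + anthro_ppm * 2 ** ((year - base_year) / doubling_years)

for y in (2042, 2073):
    print(y, round(projected_co2(y)))  # ~500 ppm in 2042, ~720 ppm in 2073
```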
“Hyperwarming climate could turn Earth’s poles green”
“…In particular, the release of methane from melting Arctic permafrost has not yet been factored in. Methane is a potent greenhouse gas, but remains in the atmosphere for only 10 years on average before it reacts with hydroxyl radicals in the air to form CO2. However, a large release of methane from melting permafrost could swamp the hydroxyl supply, allowing the methane to linger in the atmosphere for 15 years or more, further amplifying the warming (Global Biogeochemical Cycles, DOI: 10.1029/2010GB003845).
Some feedbacks never before considered might also come into play. Pearson says that in the future oceans may store less carbon. Normally some atmospheric carbon is lost at sea, buried in the carcasses of tiny marine animals. But sediment from the Eocene contains little carbon, suggesting that this process failed during the last hothouse (Paleoceanography, DOI: 10.1029/2005PA001230).
To work out why, Pearson looked at fossils of foraminifera, microscopic shelled marine animals. The tiny shells contain a chemical record of the position the animals occupied in the water column when they were alive. He found that Eocene foraminifera lived closer to the ocean surface than they do today, suggesting there was little food to sustain deeper-dwelling species.
Pearson thinks the warmer temperatures allowed bacteria at the ocean surface to metabolise faster, recycling carbon before it could sink and feed foraminifera living at depth. “If we warm the planet now, we switch on our bacteria,” he said last month at a Royal Society discussion meeting in London.
A warming climate will also see trees and other large plants spreading north into the Arctic, says Bette Otto-Bliesner of the US National Center for Atmospheric Research in Boulder, Colorado, who also attended last month’s Royal Society event. Plants are darker than snow, so they absorb more of the sun’s radiation. When Otto-Bliesner plugged the effect into a climate model of the Arctic, it got 3 °C warmer.
Then there’s hyperwarming. Ed Landing of the New York State Museum in Albany coined the term to describe the spiralling temperatures seen during the Cambrian period as a result of rising sea levels.
Vast areas of the continents were covered with shallow seas during the Cambrian, which began 542 million years ago, because sea levels were sometimes tens of metres higher than today. Sea water absorbs more of the sun’s heat than land, so swamping the continents caused the planet to warm up even more. Sea temperatures reached 40 °C and oxygen levels in the water crashed …”
wili, we don’t know how far this warming will go because we don’t know how soon The Revolt Of The Humans will put an end to Big Carbon’s diabolical desires. Take action to increase the reality oriented community.
Comment by Pete Dunkelberg — 17 Nov 2011 @ 11:12 PM
> ocean surface, bacteria
Not news, except in the sense of very bad news already warned of.
Cassandra file item.
Shu-zhong Shen, James L. Crowley, Yue Wang, Samuel A. Bowring, Douglas H. Erwin, Peter M. Sadler, Chang-qun Cao, Daniel H. Rothman, Charles M. Henderson, Jahandar Ramezani, Hua Zhang, Yanan Shen, Xiang-dong Wang, Wei Wang, Lin Mu, Wen-zhong Li, Yue-gang Tang, Xiao-lei Liu, Lu-jun Liu, Yong Zeng, Yao-fa Jiang, Yu-gan Jin
The end-Permian mass extinction was the most severe biodiversity crisis in earth history. To better constrain the timing, and ultimately the causes of this event, we collected a suite of geochronologic, isotopic, and biostratigraphic data on several well-preserved sedimentary sections in South China. High-precision U-Pb dating reveals that the extinction peak occurred just before 252.28 ± 0.08 Ma, following a decline of 2‰ in δ13C over 90,000 years, and coincided with a δ13C excursion of -5‰ that is estimated to have lasted ≤20,000 years. The extinction interval was less than 200,000 years, and synchronous in marine and terrestrial realms; associated charcoal-rich and soot-bearing layers indicate widespread wildfires on land. A massive release of thermogenic carbon dioxide and/or methane may have caused the catastrophic extinction.
john burgeson @229: I claim no particular expertise in that area, but I would not quibble with your estimate of future population size. A concern I have with some of the higher estimates I’ve seen is that they appear to focus on one element (e.g. demographic transition, irrigation water) and ignore other, interacting elements (e.g. cost of energy, plastics and N fertilizer, food fish supply, climate change). I would be happy to be put right if anyone knows of a decent attempt to consider all factors when looking at likely sustainable population size.
How do you envision the population size reducing? Is it by birth control measures, mass regional starvation, global epidemics, civil war or invasion and genocide, or a mixture of all of these? These would drastically alter the attitudes and skill sets of survivors, as well as impacting on the infrastructure.
Comment by Richard Simons — 18 Nov 2011 @ 11:30 AM
(#242) Ira, I said there was sufficient carbon to raise CO2 levels above 1000 ppmv. I didn’t say that would be cheap carbon. I don’t expect people to stop burning fossil fuels merely because they become more expensive. In parts of India, they burned wood ’til there were no trees left.
Comment by Ray Ladbury — 18 Nov 2011 @ 3:17 PM
As any commodity becomes scarcer (in this case cheap fossil fuels), prices go up, and that provides an incentive and market for alternatives (in this case more efficient use, recycling, and carbon-free/neutral fuels). So, while I have no doubt, Ray, there is enough carbon in the ground to raise CO2 levels above 1000 ppmv, it is unlikely to ever happen due to the cost. Once the cost of carbon rises above the cost of non-carbon alternatives, market forces will cause the most intelligent consumers and producers to switch. The problem with having government pick the winners is that they always favor established special interests.
James Hansen and I favor a revenue-neutral carbon tax with all proceeds returned to legal citizens on an equal per-capita basis to speed the effect of these market forces by boosting the cost of carbon fuels and thereby encouraging consumers and industry to improve energy efficiency and expand the market for solar, biomass, water, nuclear, and/or whatever should be the real winners. Does anybody know what the eventual market winners will be? It may be a source you and I never heard of.
(#203, #204, #233, #234) Ray, MARodger and others ask where I got my lower limit for CO2 sensitivity (0.5ºC to 1ºC) and the related issue of the treatment of cloud feedback in mainstream climate models.
Here is a .pdf of a 2009 paper by Lindzen and Choi that appeared in GEOPHYSICAL RESEARCH LETTERS, VOL. 36, L16705, doi:10.1029/2009GL039628, 2009.
Richard Lindzen is an American atmospheric physicist and Alfred P. Sloan Professor of Meteorology at the Massachusetts Institute of Technology. Based on Earth Radiation Budget Experiment (ERBE) data, he and Choi conclude that “…ERBE data appear to demonstrate a climate sensitivity of about 0.5ºC which is easily distinguished from sensitivities given by models.”
An easier-to-read source for the possibility that CO2 sensitivity has been over-estimated due to how the mainstream models treat cloud feedback is Dr. Roy Spencer’s blog.
You all may read the paper and the blog and agree or not. I do not have sufficient expertise to judge.
[Response: Your expertise would surely allow you to note that Lindzen and Choi had many errors - even acknowledged by Lindzen - and expounded on by Trenberth et al. Indeed, even Roy Spencer took issue with their claims. - gavin]
Ira Glickstein @243. You present a single paper by “some skeptics” (Lindzen & Choi 2009) to give authority to your assertion that climate sensitivity has been assessed as being in the range 1.0 to 0.5 deg C (as opposed to the range assessed by the IPCC of 4.5 to 2.0 deg C).
This initially appears to be progress. Sure, Lindzen & Choi 2009 carries very little authority (in comparison with the IPCC), being only preliminary research (which subsequent analysis has found wanting) that addresses only part of global sensitivity. But it would at least carry some authority.
Yet when I examine the paper there is no mention of your 1.0 to 0.5 deg C sensitivity. You claim Lindzen & Choi 2009 gives authority to your assertion. As the paragraphs are numbered within the paper, could you point to the particular paragraph that is the source of your 1.0 to 0.5 deg C assertion? (Be warned that there is a limit to my “happy reading” of your references.)
I acknowledge your other reference to Roy Spencer. This is not so happy. Simply pointing at the home page of Spencer’s website is rather discourteous. If you wish to refer us to some of Spencer’s work, you will have to be more specific.
To demonstrate how easy peasy this providing of references should be, here is UN IPCC FAR providing reference for the 2.0 to 4.5 deg C. You will note the reference also defines climate sensitivity to be “the equilibrium global average warming expected if CO2 concentrations were to be sustained at double their pre-industrial values (about 550 ppm).” I still have no answer as to why you @202 introduce the figure 780ppm (ie 2 x 390ppm) to replace the standard 550ppm, a very significant difference.
Ooh! Unfortunate choice. They didn’t even use the right definition for sensitivity! I would note that in the past decade or so, every paper that has gotten an estimate for sensitivity less than ~2 degrees per doubling either relied on a time interval that was too short or was subsequently shown to be wrong–or both. I’d count that as evidence against a small value for sensitivity.
Also, Ira, you ignore the fact that despite rising costs and imperial edict, vast regions of the subcontinent were totally deforested. Demand becomes rather inelastic when survival is at stake.
Gavin, regarding the CRU cyber-attack two years ago, you said
[Response: They [hackers] used something to directly access the backend mySQL database (to export the password/user details to file prior to erasing them in the database) and to monitor logins to the ssh account. Neither of these things are standard WordPress functions. I conclude therefore they must have hacked both, though the actual entry point is obscure. – gavin]
Is there any chance I can get a copy of the malicious code that was doing these things, or at least any further information on the code?
The discussion of methane here reminds me of an RC post about some work of Susan Solomon (2010), on fluctuations in stratospheric water content and possible links with multiyear or decadal variations at the climate-weather verge,
but I cannot figure out anything unequivocal from that.
Is there any convincing explanation available this time for where these fluctuations originate, i.e. the ~10% loss of H2O from 2000 to 2010?
And is the picture any clearer now on how important this kind of climate feedback is?
Ira Glickstein @243. You present a single paper by “some skeptics” (Lindzen & Choi 2009) … As the paragraphs are numbered within the paper, could you point to the particular paragraph that is the source of your 1.0 to 0.5 deg C assertion? …
Yes, the last sentence of paragraph 17 includes that estimate, based on analysis of the ERBE data.
… Simply pointing at the home page of Spencer’s website is rather discourteous. If you wish to refer us to some of Spencer’s work, you will have to be more specific. …
You are right. Here is a specific page, Global Warming 101, where Spencer makes the point about CO2 sensitivity and positive/negative feedback issues. About halfway down this relatively short write-up, he contends that a doubling of CO2, by itself, would directly yield less than 1ºC of warming, which he says is not a controversial statement.
He then goes on to add “… clouds, water vapor, and precipitation systems can all be expected to respond to the warming tendency in some way, which could either amplify or reduce the manmade warming. These other changes are called “feedbacks,” and the sum of all the feedbacks in the climate system determines what is called ‘climate sensitivity’. Negative feedbacks (low climate sensitivity) would mean that manmade global warming might not even be measurable, lost in the noise of natural climate variability. But if feedbacks are sufficiently positive (high climate sensitivity), then manmade global warming could be catastrophic. …”
… I still have no answer as to why you @202 introduce the figure 780ppm (ie 2 x 390ppm) to replace the standard 550ppm, a very significant difference.
Comment by MARodger — 19 Nov 2011 @ 7:06 AM
Once you have an estimate of CO2 sensitivity, which is a logarithmic effect, it should apply to any doubling (within some reasonable range). Thus doubling the current CO2 level from 390 to 780 ppm would boost mean temperatures by about the same amount as going from 275 to 550 ppm. I chose 390 because that is where we are now, and I used it to calculate that the next doubling (at which point temperatures will be up over current levels by 2-4.5ºC, using IPCC estimates) will be reached around the year 2200 if levels keep rising at about 2 ppm/yr, or about 2100 if the rate of rise doubles to about 4 ppm/yr.
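(That arithmetic is easy to make explicit. The sketch below uses the round numbers from the comment itself: a 390 ppm start, a constant linear rise of 2 or 4 ppm/yr, and sensitivity defined per doubling:)

```python
import math

def warming_for_change(c_final, c_initial, sensitivity_per_doubling):
    """Equilibrium warming for a CO2 change, with sensitivity a logarithmic (per-doubling) effect."""
    return sensitivity_per_doubling * math.log2(c_final / c_initial)

def year_of_doubling(start_ppm=390.0, start_year=2011, rate_ppm_per_yr=2.0):
    """Year the concentration reaches 2x start_ppm at a constant linear rate of rise."""
    return start_year + start_ppm / rate_ppm_per_yr

print(year_of_doubling())                      # 2206.0, i.e. "around the year 2200" at 2 ppm/yr
print(year_of_doubling(rate_ppm_per_yr=4.0))   # 2108.5, i.e. roughly 2100 at 4 ppm/yr
print(warming_for_change(780, 390, 3.0))       # 3.0: one full doubling at mid-range IPCC sensitivity
```

Note the logarithmic form is also why 390→780 and 275→550 give the same warming: both are one doubling.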
Note that estimates of CO2 sensitivity are uncertain because analysts have quite a bit of what I call “wriggle room” due to the complexity of the climate system, for example a range of 2.5ºC between the IPCC low and high estimates.
My favorite example of this type of analytic uncertainty is a 2007 email from Dr. Makiko Sato to Dr. James Hansen where she details several analyses of the mean US temperatures for the very hot years of 1934 vs 1998. (Click here, then scroll down about 1/5 of the way to “Page 1 of 22”. An easier-to-access copy of that email, with data in graphic form, appears here.)
Her refreshingly candid account shows the 1999 analysis with 1934 0.5ºC warmer than 1998. Subsequent re-analysis, between 2001 and 2007, shows the 1934-1998 difference as: 0.12ºC, 0.036ºC, -0.015ºC, and 0.12ºC. This is clear evidence that analysis of the difference between the means of the thousands of thermometer readings taken in various years has “wriggle room” of up to 0.5ºC. And, please note this is analysis by US scientists of US data, which should be more reliable than worldwide data from countries with lower density of thermometer readings. The latest analysis, done after the date of the Sato email, has 1998 0.12ºC warmer than 1934.
Ira Glickstein @248. I note that my lesson in ‘easy peasy referencing’ @244 fell on deaf ears.
You asserted @202 that “some skeptics” say doubling CO2 would raise global temperatures 1.0 to 0.5 deg C (which you gave apparently equal weight with the UN IPCC’s 4.5 to 2.0 deg C).
To give authority to your statement, @248 you provide two references.
Firstly you pluck the value of 0.5 deg C from paragraph 17 of Lindzen & Choi 2009, a value that is plainly not a global value and plainly speculative within a speculative piece of research. Further, Lindzen & Choi 2009 is now superseded by Lindzen & Choi 2011, which makes no mention of any 0.5 deg C value. (And it would be wrong not to point out that both these papers have been shown to be fundamentally flawed.) I see in all this no support for your assertion.
Secondly you provide quotes from a Roy Spencer reference where again I see no mention of the values you assert. From Spencer you quote a value for direct CO2 sensitivity of “… less than 1 deg C …”. Spencer qualifies this value as “… about 1 deg F …”, or 0.55 deg C (Spencer calls this value “not controversial”, although I would strongly disagree).
Frankly, I am not sure what you are trying to say in this. The origins of your assertion @202 remain a mystery. As this is a confusion that you appear incapable of resolving, I can but assume you dreamt your numbers up.
Your response @248 that double CO2 can be taken as double today’s value (that is nigh on CO2 tripling!) is pure nonsense if it is included in discussion of the timing of that doubling/tripling, which is exactly what you do. Note that even your Spencer link discusses potentially “catastrophic” warming and the warming of the last 100 years that (along with any future warming associated with a pre-2011 45% rise in CO2 levels) you Ira Glickstein appear to feel can be simply & safely ignored. As was stated back @160 “It simply defies logic.”
“… vertical-axis turbines allow more efficient energy to be produced, in fewer square feet.”
By Marcus Y. Woo,
Republished from Engineering & Science
Volume LXXIV, Number 2, Spring/Summer 2011
“… a vertical-axis turbine is less efficient than its monolithic cousin. But taken as a group, they can be positioned to squeeze as much power as possible from a given plot of land…. an array of half a dozen turbines has proven that … “… we can increase the power output by an order of magnitude,” … “it’s not just a theoretical prediction.”
“Glickstein proceeds to fail to distinguish between local and global temperatures, which he then weaves into a conspiracy theory involving NASA GISS: “The most revealing email from GISS…was from Makiko Sato…”
Bunk, rebunked. Since he’s just copypasting his own stuff into the thread here, he’s inviting recreational retyping of the debunking.
While I am not qualified to judge the latest Lindzen/Choi 2011, I did read parts of it and note that they acknowledge and correct errors pointed out by qualified reviewers and they state:
“… An earlier study (Lindzen and Choi, 2009) was subject to significant criticisms The present paper is an expansion of the earlier paper where the various criticisms are taken into account. …
[In the 2011 paper] … we show that simple regression methods used by several existing papers generally exaggerate positive feedbacks and even show positive feedbacks when actual feedbacks are negative. We argue that feedbacks are largely concentrated in the tropics, and the tropical feedbacks can be adjusted to account for their impact on the globe as a whole. Indeed, we show that including all CERES data (not just from the tropics) leads to results similar to what are obtained for the tropics alone – though with more noise. We again find that the outgoing radiation resulting from SST fluctuations exceeds the zero-feedback response thus implying negative feedback. In contrast to this, the calculated TOA outgoing radiation fluxes from 11 atmospheric models forced by the observed SST are less than the zero-feedback response, consistent with the positive feedbacks that characterize these models. The results imply that the models are exaggerating climate sensitivity. [Emphasis added]
Do you (or any other RC reader) have any comment on my link to the email from Dr. Makiko Sato to Dr. James Hansen where she details several analyses of the mean US temperatures for the very hot years of 1934 vs 1998?
Hank Roberts (#253) says I “… fail to distinguish between local and global temperatures…”. I have clearly stated that the Sato email refers only to US mean temperatures. If analysis of US temperature data has that much “wiggle room” it would seem global data from places with lower density of thermometers would have even more uncertainty. Hank also says it has been “…Bunk, rebunked. … Eschew. …” but he gives no details.
I agree we should eschew obfuscation :^) but I really want to know how that email can be interpreted as other than repeated re-analysis of the same old data to get the right answer.
[Response: Absolute, and outrageous, nonsense. The fact of the matter is that Sato was simply reviewing what had happened as a function of corrections in USHCN and updates to the GISTEMP analysis method. The 2001 paper clearly demonstrates why the bulk of the differences occurred - the TOBS corrections - and none of that is related to some imagined desire to get a specific answer. Please leave your conspiratorial fantasies and implied slanders behind when you comment here - they are not welcome. - gavin]
Ira, why is it that you assume the uncertainties will all come down on the right side? This has not been my experience in life. The majority of CO2 sensitivity estimates come down around 3 degrees per doubling. Those that do not are more likely to be above this level than below. It is very hard to get a model to look Earth-like with a sensitivity as low as 2 degrees per doubling.
I note that you also neglect the warming “in the pipeline”. Good lord, are you as sanguine about your personal investments or your engineering decisions as well?
Ira Glickstein @254. For myself, I couldn’t be less interested in your link to e-mails. It has taken days and far too many iterations to expose the comments you previously presented on this thread as being riven with unsubstantiated contention, nonsense and ill-defined polemics. And strangely, you don’t exhibit the slightest concern by such characterisation.
You say we should “eschew obfuscation”. Given your record on this thread, perhaps then silence would be your best policy. The process of science is obviously beyond your abilities. Indeed, you now have a website Response accusing you of “conspiratorial fantasies” and I really can say little more on that subject because if I did the shape-shifting lizards would get very very angry.
It is becoming all too common for responses to material to be based on context rather than content.
The scientific method requires that people do their own thinking. Content is what it is, not what some opinionator claims it is.
This creates great difficulties in the conversation, as where people stand has come to matter more than what they have to say.
I’m not sure where I’m going with this, but it stems from the difference between a few recent links I’ve posted in various discussions and the gloss put on that content by those who appear to be speaking to some hidden audience in order to create an impression that contradicts reality. Reality is crowding in on us and we need to pay attention.
Trying to hide truth with tactics may be successful in the short term, but it is destructive and dishonest.
Ahead of critical talks and despite pledge for new treaty by 2012, biggest economies privately admit likelihood of long delay
The UK, European Union, Japan, US and other rich nations are all now united in opting to put off an agreement and the United Nations also appears to accept this.
Developing countries are furious, and the delay will be fiercely debated at the next round of international climate talks beginning a week on Monday in Durban, South Africa.
The Alliance of Small Island States, which represents some of the countries most at risk from global warming, called moves to delay a new treaty “reckless and irresponsible”.
I nominate further posts from Ira Glickstein for the Bore Hole.
As the exegesis of Mr. Glickstein’s work at Skeptical Science makes clear, he is a pseudo-skeptic who has gone so far as to publish a PowerPoint presentation on the best “skeptic strategy for talking about global warming” based on “reasoned views” — which are in fact nothing more than politely and “reasonably” presented regurgitations of the usual, repeatedly debunked and utterly tiresome denier myths, falsehoods, conspiracy theories, pseudoscience and sophistry.
It seems clear that his visit to RC is purely for the purpose of generating material for his “act” at WUWT.
“Trolls” are commenters who want to deliberately waste your time, for their own purposes. They are not all rude and boorish. Indeed, those who bait their trolled hooks with “reasonableness” are often more successful.
Thanks for the link to SKS on Ira Glickstein, I couldn’t be bothered to read that in detail (same old boring mole-whacking) but scanning over doesn’t look good. The poor reasoning w.r.t. the 800 year lag issue is very bad – a systems engineer should be able to grasp the notion of something being a forcing in one situation and a feedback in another. It’s not hard.
Of course the “lag” is a silly notion. How could CO2 lead orbitally forced warming? Orbital forcing leads by definition. But then feedback sets in and CO2 becomes important.
Comment by Pete Dunkelberg — 21 Nov 2011 @ 3:43 PM
@ Pete Dunkelberg 262
Don’t worry. We’ll lose half of the worldwide crop, or “something”. I’m sure… that’ll… ummm… make them get their skates on…? But I’m sure the denialati will be saying 97% of climate scientists predicted global cooling at the start of the 21st Century.
It seems clear that his visit to RC is purely for the purpose of generating material for his “act” at WUWT.
And very obviously these reiterated injections of boring “low sensitivity” and so on arguments hinders people on dealing with the science related questions.
Somebody spoke about “chasing ones own tail”
In my coauthored book-in-progress on rising sea levels, I’m saying that the developed world now lacks and will continue to lack the political and therefore the financial will to adapt to sea level rise. In my opinion, it will take one or more sea level rise disasters to generate this will.
I don’t see the harm in engaging Ira. Yes, he is inconsistent in his presentations here and at WTFUWT, but 1)we gain nothing if we do not learn to counter the arguments of the denialists, 2)He was at least being civil and 3)He might learn something.
Pretty certain they haven’t been hacked again. It looks like more from the same batch, and again some of them sound damaging without context (and that’s how the deniers will publicise them… they are already).
There’s some desperate stuff in there too… For example Mann’s quoted as using the phrase ‘the cause’ several times, which of course naturally means he’s completely politically motivated and fraudulent and it has nothing to do with a noble science-based effort to stop the human race messing up our planet…
However, it’s once more a bit of a PR coup for the deceitful, ignorant, loony toons and their paymasters… Makes you want to weep…
Ray Ladbury wrote regarding Ira Glickstein: “He might learn something.”
That’s true. He might learn to refine his denialist propaganda to make it less blatantly bogus and more “reasonable” sounding, and thereby more effective at deceiving and misleading ill-informed people.
But this guy is not here to actually learn anything about climate science. He’s here to repeat his phony “lukewarmer” talking points, to reject and ignore any science that contradicts them, and to puff up his cred with the denier cultists at WUWT by telling them exactly what they want to hear about how “unreasonable” the “warmists” at RC are.
Comment by SecularAnimist — 22 Nov 2011 @ 12:05 PM
The good news is, there was a strong and immediate response from the University of East Anglia. And the timing is just too transparent — perhaps some lessons were learned by the media after all. One may dream…
Sorry this post is so long, but I am replying to a few RC readers.
From Ray Ladbury #226
Ira is here to be educated–as am I…as are you. Time will tell whether the learning curve of any of us has a positive slope. So far, he at least seems willing to listen.
Comment by Ray Ladbury — 17 Nov 2011 @ 10:22 AM
THANKS Ray, I appreciate the efforts by you and MARodger and the Moderators to promote serious cross-discussion here at RC of the issues where science-oriented individuals may differ.
From SecularAnimist #260
I nominate further posts from Ira Glickstein for the Bore Hole. …
Comment by SecularAnimist #260 — 21 Nov 2011 @ 1:02 PM
Sorry you feel that way, but I’ll continue to try to participate so long as RC Moderators allow.
From MARodger #257
Ira Glickstein @254. … comments you previously presented on this thread [are] riven with unsubstantiated contention, nonsense and ill-defined polemics. And strangely, you don’t exhibit the slightest concern by such characterisation. [Emphasis added]
William Cowper (~1763): “A moral, sensible, and well-bred man, Will not affront me, and no other can.” And, I appreciate RC allowing me to post here and others for responding to my skeptic comments.
… Indeed, you now have a website Response [by Gavin] accusing you of “conspiratorial fantasies” …
Comment by MARodger — 20 Nov 2011 @ 8:29 PM
No conspiracy, no fantasy, just a climate system and temperature data collection system so complex that honest, competent scientists and data analysts can only approximate the truth to a given level of certainty.
1) CO2 sensitivity may be anywhere from 2ºC to 4.5ºC, a range of more than a factor of two (according to the IPCC, not my skeptic sources).
2) The US temperature record is uncertain due to corrections, mainly TOBS (Gavin’s Response). OK, I understand that Time of OBServation, site changes, and so on are issues.
3) GISS has had the 1998 US temperature data in hand since 1999, and has adjusted the anomaly from the initial 0.918ºC up by over 0.3ºC in several steps from 1999 to 2007 (Sato email). Since that date 1998 has gone up further and is now 0.4ºC over the value published by GISS in 1999.
[Response: You know why, and yet you keep insinuating it is some kind of mystery. Please stop. (For further information, all updates to the GISTEMP analysis - via new data inputs, adjustments to the algorithm, etc. are documented here. All the code is available here. An independent replication is available here). - gavin]
4) I can understand that 1934 data would be uncertain, but 1998 is only 13 years old and from stations that are mostly still operational, yet it seems to have analytic “wriggle room” of 0.4ºC.
[Response: The absolute anomaly is irrelevant since it is a function of the baseline (1951-1980) so any adjustments in past affect the more recent periods in the analysis. The '0.0' point is not meaningful. - gavin]
5) What was it about TOBS or other issues with the 1998 US data that became known between 1999 and 2001 (+0.281ºC)? I read the 2001 paper that Gavin linked and that partially explains TOBS and other areas where adjustments were made, but it is not clear to me why these were still issues after the work done by GISS up to and including the 1999 paper. But, OK, the 2001 paper seems to have nailed down all the necessary adjustments solidly.
6) It seems to me that after a +0.281ºC adjustment of the 1998 data (re-analysis between 1999 and 2001) the data would be pretty solid. Yet, between 2001 and now a further adjustment was made of +0.121ºC.
7) If estimated warming since the late 1800s is about 0.8ºC, and the TOBS and other corrections of 1998 data between 1999 and now are 0.281ºC + 0.121ºC, which totals over 0.4ºC, there seems to be an uncertainty factor of two. Please note that my posts here at RC are not blaming incompetence or bad motives for these adjustments, merely observing that the climate and temperature data systems seem too complex to get closer than a fairly wide level of uncertainty.
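The baseline point in the inline response above can be made concrete with a toy calculation (made-up numbers and a one-year baseline, not the actual GISTEMP data or algorithm): correcting baseline-era data shifts every anomaly, including years whose own measurements never changed, while differences between years are unaffected.

```python
def anomalies(temps, baseline_years, baseline_correction=0.0):
    """Anomaly = temperature minus the baseline-period mean. A correction
    applied to baseline-era data shifts EVERY anomaly by the same amount."""
    base = sum(temps[y] + baseline_correction for y in baseline_years) / len(baseline_years)
    return {y: (t + (baseline_correction if y in baseline_years else 0.0)) - base
            for y, t in temps.items()}

temps = {1934: 11.9, 1965: 11.2, 1998: 12.0}   # hypothetical US means, deg C
before = anomalies(temps, [1965])
after = anomalies(temps, [1965], baseline_correction=-0.3)  # TOBS-style revision

# The absolute 1998 anomaly moves (0.8 -> 1.1) although the 1998 data itself
# was never touched; the 1934-1998 difference is unchanged.
print(round(before[1998], 2), round(after[1998], 2))
print(round(after[1998] - after[1934], 2) == round(before[1998] - before[1934], 2))
```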
Ira, I’m rather surprised that someone who was a system engineer could have such an unsophisticated view of how science is done! First, what matters is the trend, not the temperature in any single year–be that year 1934 or 1998.
Second, you are treating all climate sensitivity values between 2 and 4.5 as equally likely. They are not. 3 degrees per doubling is the most probable value, and the probability that the value is above 3 is significantly greater than that of a lower value.
You seem to think that somehow all errors will conspire to fall on the same side and the problem of warming will magically vanish–this despite the fact that we have multiple temperature products constructed from independent measurements that all show consistent trends in warming.
I would think that as a system engineer, you would want to take a conservative approach when it comes to risk. Instead, you seem to be engaged in a contest with yourself to paint the rosiest picture you can. Rose may be a beautiful color, but perhaps you would understand the issue if you viewed it through lenses with less tint.
This is all stale stuff and a bit removed from the actual science, so I’ll draw your attention to an item from the Santa Fe conference
Was this taken seriously? http://www.vukcevic.talktalk.net/SantaFe2011.htm
I hope not!
Pal Brekke, H. Abdussamatov, Lockwood: any comments?
For one thing, it is a simple, low-cost technological innovation that can generate ten times the energy per square meter as conventional wind turbine farms, and is safer for birds, and as the article notes, “because they’re quieter and smaller, they can be distributed more widely and can be built closer to population centers.”
For another, it’s a very interesting story about fortuitous scientific insight: the principles that were applied to develop the vertical turbine arrays were derived from observations of regularly-spaced vortices in the wakes left by schools of fish as they swim.
Huhne says influential Global Warming Policy Foundation is ‘misinformed’, ‘wrong’ and ‘perverse’ following GWPF report
The GWPF has repeatedly called for more openness from scientists on research into climate change. But the Guardian has also discovered that Peiser has refused several freedom of information requests himself, leading to accusations against the foundation of double standards and secrecy about the thinktank’s mystery funders.
Huhne’s letter is a response to a report sent to him by former chancellor Lord Nigel Lawson, who chairs the GWPF, and Lord Andrew Turnbull, a former head of the civil service and a GWPF trustee. The GWPF report “questions blind faith in climate alarmism” and claims there is “huge controversy about the relative contribution of man-made CO2 versus natural forces”.
But Huhne replies: “Let me say straight away that [I] believe that you have been misinformed and that your conclusions are poorly supported by the underlying science evidence.” He goes on to say: “It would be perverse to ignore this well attested and thoroughly reviewed body of evidence.”
Huhne tells Lawson and Turnbull: “It is not true to say that UK climate change policy relies on a single source of evidence,” and that “you wrongly assert that the UK is taking unilateral action” in tackling climate change.
In conclusion, Huhne writes: “The scientific case for action is robust. We would be failing in our duties to pretend otherwise and we must with other countries take the actions necessary to protect our planet from significant climate change.”
Ira Glickstein @270 2nd paragraph. My part in this was never “to promote serious cross-discussion here at RC.” I was only concerned that ‘what happened to you’ was not seen as some underhand act, just as I was concerned by comments passing at WUWT behind the backs of those engaged here in comment at RealClimate. Whether this exchange continued here or not was never my concern.
@270 4th paragraph. If you’re going religious on this one by quoting Cowper, think Matthew 5:39. I see no sign of a turned cheek. Or an offered coat. When challenged to source your 0.5 to 1.0 deg C quote and your replies were dismissed as unsubstantiated contention, you ended it by going silent on the matter. I certainly would consider it discourteous to cut off such an exchange just because it gets difficult. You apparently think the opposite.
And what do I then espy @270 1)? Off you go again talking of “…my skeptical sources.” Ira Glickstein – either name your sources or define your remarks differently.
I would also chip in @270 7). At WUWT Ira Glickstein PhD is on record as asserting that 0.3 deg C of the 0.8 deg C is due to poor siting of stations & UHI etc. There was not any suggestion in these WUWT statements of uncertainty, other than a gross exaggeration of global temperature rise.
“Earlier today I asked whether American news outlets would do their due diligence in evaluating the content of the newly-released batch of ‘Climategate’ emails hacked from the University of East Anglia two years ago. It didn’t take long for our esteemed print outlets to disappoint…”
1. Axing the NCS didn’t save any money.
2. The NCS would increase government efficiency.
3. House Republicans based their opposition to the NCS on nothing more than climate denial.
4. Current services are already falling behind demand.
5. Not having an NCS keeps benefits from accruing to large industries, small businesses, and farmers.
6. The NCS would stimulate investment and economic growth.
7. The NCS would help businesses and communities become more prepared and resilient.
8. The NCS would help firefighters predict extreme fire and drought seasons.
9. The NCS would strengthen our national security.
10. A few House GOP extremists derailed a proposal with widespread support.
I’m an ordinary Joe who travels a lot building computer systems for a wide number of uses. I have an MSc, so I am not stupid, nor am I a world-class intellect, but I had an interesting conversation while working in China a few months ago. What it boiled down to was a couple of generals and some businessmen toasting western climate scientists for making them very rich; they love CO2 taxes as they move more business to China and less tax revenue to the west. Now I don’t have kids and after 40 odd years here I don’t really like people that much, so I don’t give a rats arse if everyone boils the day after I’m dead (sorry).
My math is pretty good and I can tell you a fact (based on UN figures): global warming doesn’t matter as far as your kids are concerned (if you don’t stop China from taking the piss); they won’t have a job, house or kids of their own. They will live in a third-world hell hole and hate this generation for the neglect of their fiscal future (unless they moved to China). So don’t use the next generation as a reason until you stop those countries that don’t care about a couple of million dead citizens profiting from your quick-fix solutions to a long-term problem; cutting your nose off to spite your face still doesn’t work, chaps.
“Now I don’t have kids and after 40 odd years here I don’t really like people that much, so I don’t give a rats arse if everyone boils the day after I’m dead (sorry).”
So by your logic we can safely ignore everything you say, then. Right?
Because if you actually don’t care then you wouldn’t bother coming here to make such an inane statement. But you did. Therefore, you DO care. Since you DO care, then, it must be an underlying ideology that drives you to foment yet more delay on actions being implemented to curb CO2 emissions.
“Please don’t take action on curbing CO2 emissions because that will cost me money; I don’t give a rats arse about anyone other than myself.”
Data debate: is transparency bad for science?
A debate to launch the new issue of Index on Censorship’s magazine, ‘Dark Matter: what’s science got to hide?’
06 Dec 2011 Lecture Theatre 220, Mechanical Engineering Building, Imperial College
Scientific data is more freely available than ever. But does the push for openness help or hinder science? A debate to launch the new issue of Index on Censorship magazine, ‘Dark Matter: what’s science got to hide?’
• Sir Mark Walport, Director of the Wellcome Trust
• George Monbiot, author and Guardian journalist
• Baroness Onora O’Neill, House of Lords
• Chair: Jo Glanville, Editor Index on Censorship
Are you claiming that early homo sapiens, over 100,000 years ago, killed off most of the large animals, and it was not due the cold climate? – Ira Glickstein
No. You specified “the last few hundred thousand years”, which includes the last 50,000. The extinction of megafauna on an unusual scale is AFAIK confined to the last 50,000 years, during which – until recently – nothing very different happened in climatic terms from what had happened in the preceding few hundred thousand*. Moreover these megafaunal extinctions have happened at different times on different continents, and those times have been shortly after the arrival of anatomically modern humans – first Australia and New Guinea, then northern Eurasia, then the Americas.
* Of course there were many local extinctions due to climatic change, but by and large the megafaunal species were able to track the climates that suited them, able to do so because the change was slow, relative to what we are now imposing.
Foreign investors aren’t just after land in Africa. Access to water is essential – which can bring them into direct competition with the needs of local communities
It is no coincidence, observers say, that the most aggressive foreign investors are also those facing water shortages at home. This year, risk analysis firm Maplecroft said the results from its water stress index showed why India, South Korea and China, along with the oil rich Gulf states, are racing to buy land in developing countries and grow crops abroad. The chairman and former CEO of Nestlé, Peter Brabeck-Letmathe, has gone so far as to say the global rush for farmland is actually a “great water grab”. He writes in Foreign Policy: “With the land comes the right to withdraw the water linked to it, in most countries essentially a freebie that increasingly could be the most valuable part of the deal.”
Masai living in Kojiado district, near the Tanzanian border, are finding traditional cattle herding harder because the weather is getting hotter and the rain more unpredictable. Pasture is becoming harder to find, so many are diversifying to grow crops as well or turning to farming full-time. They are also working with traditional farmers to ensure that both lifestyles can share the resources available without encroaching on each other’s survival. Sending children to school is also becoming a priority so that they are better equipped for their changing world.
Ira, I’m rather surprised that someone who was a system engineer could have such an unsophisticated view of how science is done! First, what matters is the trend, not the temperature in any single year–be that year 1934 or 1998.
Correct, Ray, it is the long-term trend that tells the true story. I picked 1934 and 1998 only because they were the topic of the Sato email, and because they illustrate the level of uncertainty in analysis of the thermometer record, even for a set of data only a bit over a decade old.
The TOBS and other adjustments mentioned by Gavin are documented in the 2001 paper he linked. See page 18 for graphs of USHCN Adjustments. Note that years prior to 1960 are reduced in temperature by up to 0.1ºC while those after 1970 are increased by up to 0.2ºC which adjusts the 1900 to 2000 trend by up to 0.3ºC. Further trend adjustments of around 0.1ºC in the same direction have been made to the same old data between 2001 and now.
I do not claim the adjustments were unjustified, only that competent analysts can get different trends when they re-analyze the same data, and that the level of uncertainty, up to 0.4ºC, is within a factor of two of the total estimated warming.
Second, you are treating all climate sensitivity values between 2 and 4.5 as equally likely. They are not. 3 degrees per doubling is the most probable value, and the probability that the value is above 3 is significantly greater than that of a lower value.
Where I come from, a range of values usually indicates the +/- 1σ (Greek letter sigma, for standard deviation) for that variable. So, 2ºC and 4.5ºC would be equally likely, with, as you say 3ºC the most probable. But, I do not agree that values above 3ºC are more probable than those below. On what basis do you reach that conclusion?
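One possible basis: published sensitivity estimates are right-skewed, with a fat tail toward high values, so a range like 2-4.5ºC need not be a symmetric +/- 1σ. A sketch using a lognormal shape (my assumption, purely for illustration, not any IPCC product), with parameters chosen so the most probable value is 3ºC:

```python
import math
import random

sigma = 0.3
mu = math.log(3.0) + sigma**2   # lognormal mode = exp(mu - sigma^2) = 3 degC

random.seed(0)
samples = [random.lognormvariate(mu, sigma) for _ in range(200_000)]
p_above = sum(s > 3.0 for s in samples) / len(samples)

# For a symmetric distribution this would be ~0.5; the right skew pushes
# more total probability above the mode than below it.
print(round(p_above, 2))
```

Analytically, the probability above the mode here is Phi(sigma), about 0.62, so roughly 3:2 odds that the true value lies above 3ºC under this assumed shape.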
You seem to think that somehow all errors will conspire to fall on the same side and the problem of warming will magically vanish–this despite the fact that we have multiple temperature products constructed from independent measurements that all show consistent trends in warming.
I would think that as a system engineer, you would want to take a conservative approach when it comes to risk. Instead, you seem to be engaged in a contest with yourself to paint the rosiest picture you can. Rose may be a beautiful color, but perhaps you would understand the issue if you viewed it through lenses with less tint.
Comment by Ray Ladbury — 22 Nov 2011 @ 2:25 PM
When I wrote proposals to get funding for R&D projects I always took the most optimistic, rosy tinted view of the probability of success, without lying. On the other hand, when I reviewed proposals from marketeers trying to sell us products, I took a more pessimistic view.
Even in engineering, where we were dealing with actual products, such as Doppler and Inertial navigation systems and digital computers and software, it was common to have up to a 2:1 range of uncertainty. I personally made estimates of data rates, storage capacity, and schedules that I thought were conservative that turned out to be optimistic when implemented.
I suspect analysis and predictions for the climate system and the thermometer record of the past century are even more uncertain. We are being asked to take drastic, government-enforced action that may damage our economy. Just today, there were news stories about how the Ethanol mandate (which I favored at the time) has doubled grain prices, making a wide variety of food products less affordable to ordinary people. In engineering, we try to balance cost and benefit in a conservative way. We know that advanced technology will be our bread and butter in the coming decade, but cannot be sure which specific advances will win out.
Ira Glickstein @270 2nd paragraph. My part in this was never “to promote serious cross-discussion here at RC.” I was only concerned that ‘what happened to you’ was not seen as some underhand act, just as I was concerned by comments passing at WUWT behind the backs of those engaged here in comment at RealClimate. Whether this exchange continued here or not was never my concern.
Had you not been monitoring my topic at WUWT and intervened here at RC, I do not think my re-posting would have been passed, but I could be wrong. In any case, I appreciate your participation in this ongoing serious cross-discussion here at RC.
@270 4th paragraph. … When challenged to source your 0.5 to 1.0 deg C quote and your replies were dismissed as unsubstantiated contention, you end it by going silent on the matter. I certainly would consider it discourteous to cut off such an exchange just because it gets difficult. …
I cited Lindzen/Choi (2009 and 2011) and Spencer as sources for CO2 sensitivity of 1ºC or less. I noted that Lindzen/Choi 2011 admitted errors in their earlier paper, said they had corrected them in their 2011 paper, and that their conclusions remained about the same with regard to overestimation of sensitivity. Beyond that, I have nothing to add, except that in all my comments regarding sensitivity, I have consistently noted that the IPCC estimates range from 2ºC to 4.5ºC.
At WUWT, Ira Glickstein PhD is on record as asserting that 0.3 deg C of the 0.8 deg C is due to poor siting of stations & UHI etc. There was no suggestion of uncertainty in these WUWT statements, other than a gross exaggeration of the global temperature rise.
Comment by MARodger — 22 Nov 2011 @ 4:47 PM
Yes, at WUWT I wrote a few postings that allocated the 0.8ºC estimated warming to what I called Natural, Data Bias, and Human-Caused. My original allocation was 0.4ºC, 0.3ºC, and 0.1ºC, respectively. WUWT commenters provided their estimates, and I averaged them, and the estimates were revised to 0.33ºC, 0.28ºC, and 0.18ºC.
I have some thoughts on accelerating sea level rise and the recently published paper by Stefan Rahmstorf, Mahé Perrette & Martin Vermeer: “Testing the robustness of semi-empirical sea level projections”, Clim Dyn (online first).
(A paper that deserves its own post on Real Climate, I think.)
As many of you probably know, Hansen & Sato (2011) proposed that the mass change of the large ice sheets could actually be following an exponential development, with a best fit of a doubling time of 5-6 years for the annual mass loss of ice sheets from Greenland and Antarctica (see figure 8 in Hansen & Sato and their discussion in section “6. Sea level”).
Now, Rahmstorf, Perrette & Vermeer (2011) discuss the level of acceleration in their paper, and say “a non-linear response of ice sheets could make semi-empirical projections overestimate or underestimate the true future sea level rise.” (Section 10). They then refer to the mass-balance assessments by Rignot et al (2011), which “found that the rate of sea level rise [from the ice sheets in Greenland and Antarctica] has been increasing approximately linearly from 1992 to 2010 while global temperature has also been increasing linearly.” (Section 10). Rahmstorf et al then draw the conclusion that “at least until 2010 the observed ice sheet mass loss is fully consistent with the semi-empirical projections and shows no sign of an important non-linearity in its response to warming.” (Section 10)
The problem I have with the statement that it “shows no sign of an important non-linearity” is that the empirical data may ALSO be consistent with an “important non-linearity”. At least Rignot et al. (2011) do not test the hypothesis of a non-linear acceleration of sea level rise. They only fit the empirical data against a linear fit. Hansen & Sato (2011), on the other hand, make an exponential fit to the data of ice sheet mass balance by Velicogna (2009), which looks at least as plausible to me.
Now, it may be very hard or even impossible to say whether the acceleration is linear or non-linear based on such a short interval (<20 years), but this means that we cannot conclude that the empirical data show it is NOT a non-linear acceleration.
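As a toy illustration of that point (hypothetical numbers only, not actual ice-sheet data): a noiseless exponential with an assumed ~8-year doubling time, sampled over an ~18-year window, is still fit almost perfectly by a straight line, so a good linear fit cannot by itself rule out exponential growth.

```python
import math

# Toy illustration (hypothetical numbers, not actual ice-sheet data):
# over a short record, an exponential with a multi-year doubling time
# is fit almost perfectly by a straight line.

def linear_fit(xs, ys):
    """Ordinary least-squares intercept and slope."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return ybar - slope * xbar, slope

years = list(range(18))              # an ~18-year record, as in Rignot et al.
doubling_time = 8.0                  # assumed doubling time in years
rate = [math.exp(math.log(2) * t / doubling_time) for t in years]

intercept, slope = linear_fit(years, rate)
pred = [intercept + slope * t for t in years]
mean_rate = sum(rate) / len(rate)
ss_res = sum((y - p) ** 2 for y, p in zip(rate, pred))
ss_tot = sum((y - mean_rate) ** 2 for y in rate)
r2 = 1 - ss_res / ss_tot
print(f"R^2 of a linear fit to a noiseless exponential: {r2:.3f}")  # close to 1
```

With noise added, the two hypotheses become even harder to separate over so short a window.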
Neither Rignot et al (2011) nor Rahmstorf et al (2011) evaluates the possibility of non-linear acceleration of sea level rise. Given the extremely high impacts of a faster sea level rise, I think it needs to be seriously considered.
Finally, I think the fact that the modelling community was unable to predict the recent drastic decrease in Arctic sea ice area and volume should make us humble and warrants precaution in future projections. There is also the question of whether there could be a connection between the two. For example, where will the energy that has been melting all the Arctic sea ice go when all the sea ice is melted? Could it go to the ice sheets? (Wade Smith made this point over at Skeptical Science and, after some back-of-the-envelope calculations, arrives at a similar figure for exponential ice sheet mass loss as Hansen & Sato. See the comments at: http://www.skepticalscience.com/sea-level-rise-predictions.htm)
Ok, this is my first post here, and I would love to hear your thoughts on this!
Hansen, JE, Sato M (2011) "Paleoclimate implications for human-made climate change." In Climate Change at the Eve of the Second Decade of the Century: Inferences from Paleoclimate and Regional Aspects: Proceedings of the Milutin Milankovitch 130th Anniversary Symposium. A. Berger, F. Mesinger, and D. Šijači, Eds. Springer, in press. http://pubs.giss.nasa.gov/abs/ha05510d.html
To those who feel the presence of Ira Glickstein is inappropriate, I should perhaps apologise for my part in his being here (as alluded to @284, although his invite was not mine).
Ira Glickstein @284
I would not use the adjective “serious” to describe these exchanges. You already have my objections @249 to the two (now three) sources you give for “some skeptics” deriving sensitivity as “0.5 to 1.0 deg C.” Re-presenting them afresh @284 changes nothing. And I am not sure what is gained by you ‘consistently noting’ the IPCC have findings (that contradict your “some skeptics”) when you then dismiss the IPCC work regardless.
As for your “allocations” of the GISS global temperature record, it appears you hold the bizarre notion that science can be democratic. Pure nonsense! 2+2=4. There is no room for fallacious opinion. To believe that 2+2=4.022 is just as wrong as believing 2+2=22. To pillage a common saying, in science it is much more than “two wrongs don’t make a right.” In science “right” can only exist because, as much as is feasible, all the “wrong,” every last bit of it, has been identified and junked.
The level of evidence that allows the junk to be junked could be the point at issue here. However, you have provided no sources for your sensitivity assertion. If you had pointed to a source (hypothetically, say Black & White 2012) that said “…and we find global sensitivity to be in the range 0.5 to 1.0 deg C” then judgement could be made. Without such evidence, or something amounting to such evidence, your statement is unjudgeable. It is unsubstantiated and thus it is so much junk. This exchange is thus simple time-wasting. And that is the reason some folk object to your presence here.
Professor Colin Prentice from Macquarie University says he is not surprised by the results. (snip)
“What it means is we can be a bit more sure about the sort of range of temperature changes that will result from the given change in the amount of fossil fuel and CO2 and other greenhouse gases,” he said.
“The key point is that there has been ongoing buzz about the possibility that the climate sensitivity may be way, way higher than in mainstream climate models.
cat, bag, out?
wondered why this paper got everyone so excited, when it was within IPCC CS range…
Ira Glickstein wrote: “We are being asked to take drastic, government-enforced action that may damage our economy.”
In short, you don’t like the solutions that you believe some people have proposed, therefore you deny the existence of the problem.
To which end, you distort and misrepresent the science, gullibly embrace the pseudo-science of the pseudo-skeptics, ignore anything that contradicts your predetermined conclusions, repeat long-debunked talking points over and over, and of course, cast aspersions on the integrity, honesty and competence of the world’s finest and most dedicated scientists — including the moderators of this site, whose forbearance in permitting your repetitively disingenuous comments you have rewarded by complaining about your imagined mistreatment here in your postings at WUWT.
That’s the motivation and behavior of a stereotypical, run-of-the-mill denier. There’s nothing the least bit interesting or challenging about any of it.
Those who are responding to you with substantive comments, as though you had either the interest or the capacity to learn something about the science, have the generous spirits and the patience of saints.
After all the poor science journalism and opinion on Schmittner’s paper, he has stated the following.
“While our statistical analysis calculates that high climate sensitivities have very low probabilities, you can see from the caveats in our paper, and my remarks in this interview, that we have not actually claimed to have disproven high climate sensitivities….Our study comes with a number of important caveats, which highlight simplifying assumptions and possible inconsistencies. These have to be tested further.”
I noticed that one of your specialities is the ocean-atmosphere coupling. I have looked into the same subject but concentrated on the North Atlantic and came to a surprising result: http://www.vukcevic.talktalk.net/theAMO.htm
I would be interested to hear your opinion, either openly (website) or via email.
Thank you SteveF #287. James Annan made a couple of interesting points and it will be interesting to read what others say, especially about the differences between climate sensitivity on land vs ocean, and findings of temps in the last glacial maximum.
Ira, Have you not heard of a lognormal distribution? When a distribution has a positive skew, there will be more probability to the right of the mode than to the left. A negative skew (e.g. a Weibull with shape parameter >3.6) yields more probability to the left of the mode. The probability distribution for climate sensitivity is strongly skewed right. Frankly, that is one thing that is very interesting about the latest estimate from Science–although that conclusion seems to derive from their ocean sensitivity estimates.
As to actions required to avoid severe consequences due to climate change, I think that you overestimate the cost–particularly since most of those actions (e.g. a sustainable energy infrastructure) are needed in any case to avoid shocks of ever higher fossil fuel prices.
And on ethanol, that was never a climate or energy program, but rather a disguised agricultural subsidy. Modern agriculture is merely a sophisticated way of turning petroleum into something we can eat (e.g. corn and soy). Now we’re going to turn that back into a fuel and substitute it for petroleum? Show me a system that does that, and we’ll share a Nobel for overturning the 2nd law of thermo.
Comment by Edward Greisch — 25 Nov 2011 @ 10:02 PM
Mr. Per Wikman-Svahn writes on the 25th of Nov 2011 at 9:25 AM:
Re:Supralinear or exponential acceleration of sea level rise
Sea level rise comes from thermal expansion (arguably linear), export from land-based aquifers to the ocean (almost zero), and ice melt. Of these, the third seems the most plausible candidate for nonlinear increase. It is hard to see how ocean heat could penetrate the GIS other than through giant rainstorms, but the WAIS shows us how fast breakup can go on a retrograde seabed. In this regard the Pollard and DeConto results are both illuminating and scary.
Lead author Andreas Schmittner from Oregon State University, US, explained that by looking at surface temperatures during the most recent ice age – 21,000 years ago – when humans were having no impact on global temperatures, he, and his colleagues show that this period was not as cold as previous estimates suggest.
“This implies that the effect of CO2 on climate is less than previously thought,” he explained.
Climatologist Andrey Ganopolski, from Potsdam Institute for Climate Impact Research, Germany, went further and said that he would not make such a strong conclusion based on this data.
“The results of this paper are the result of the analysis of [a] cold climate during the glacial maximum (the most recent ice age),” he told BBC News.
“There is evidence the relationship between CO2 and surface temperatures is likely to be different [during] very cold periods than warmer.”
Scientists, he said, would therefore prefer to analyse periods of the Earth’s history that are much warmer than now when making their projections about future temperatures. http://www.bbc.co.uk/news/science-environment-15858603
Duh… using the last cold snap to draw conclusions about CS is more than a stretch! Although current CS estimates (3C) may even be an underestimate!
And great timing for this “study” to appear, and a bit lame of the BBC not to make it clearer that this study is rather weak!
November 25, 2011, 12:29 p.m.
Renewable energy is surpassing fossil fuels for the first time in new power-plant investments, shaking off setbacks from the financial crisis and an impasse at the United Nations global warming talks.
Electricity from the wind, sun, waves and biomass drew $187 billion last year compared with $157 billion for natural gas, oil and coal, according to calculations by Bloomberg New Energy Finance using the latest data. Accelerating installations of solar- and wind-power plants led to lower equipment prices, making clean energy more competitive with coal.
“The progress of renewables has been nothing short of remarkable,” United Nations Environment Program Executive Secretary Achim Steiner said in an interview. “You have record investment in the midst of an economic and financial crisis.”
The findings indicate the world is shifting toward consuming more renewable energy even without a global agreement on limiting greenhouse gases. …”
Go to americanselect.org/
Americans Elect is a place to nominate 3rd party candidates and campaign for questions to ask candidates. Please go there and campaign for GW questions and candidates. Let’s nominate Mike Mann for president and Gavin Schmidt and Ray Pierrehumbert for the US senate.
303 prokaryotes: I hope you replied that to dotearth.
400: My evidence for the claim that “Most of the people who ‘believe’ in AGW don’t know what it is that they believe in” is purely anecdotal, but it’s the only empirical evidence I have the authority to offer. It’s based on 20+ years of regularly reading journalists and columnists failing to explain ‘the’ science clearly, and creating space for madness; 20+ years of listening to highly educated non-scientists of many shapes and sizes at 1 leading UK independent school, 4 different universities in 3 different countries, and various other theatres along the way, in four different continents, opining, or shrugging their shoulders and going with the flow; but above all it’s based on being raised in an academic family, surrounded by climatologists and environmentalists who taught me to be cautious, and by working in a major university where one can talk to four different scientists at lunch, none of whom might be deniers but /all/ of whom have some very different views on what it is that one should be most cautious about, and what it is that it is safe to say. I would not expect most people to be able to explain how their TV set works. Is it all that outlandish to suggest that the foundations of faith among the non-scientific (you know: /most/ people) may not actually be all that flattering for scientists? Don’t you think this might — just a wee bit — help explain why we are where we are?
401 (Frank): You say “if you keep using the ‘um, I’m not a scientist, this stuff’s hard to figure out’ excuse, then it starts to look like, you know, an excuse”. You don’t know what I think about the science. I didn’t tell you. What am I excusing? Your example has nothing whatsoever to do with science: common sense suffices for cutting through the storm of bullshit that surrounds us. But that’s precisely the problem. The battles increasingly have nothing to do with the science. It simply is not good enough to sneer and tell people to get an education. I can get a pretty good one in my neck of the woods — second to none. And it really is a bit confusing, unless you read the stuff trying to simplify things to make stuff happen in policy-making. What proportion of adult people who have heard of ‘Global warming’ do you /really/ think have approached the matter with a completely open mind, scientifically, from first principles, and thus by the proper approach agreed that the IPCC offers the best possible account of where we are? Did you? Really? In the same way that you read every single word of every party manifesto before you vote?
BTW, sorry for the repeated dose of the previous meandering. Accidentally sent prematurely, and then reprised in vexation…
Ira, Have you not heard of a lognormal distribution? When a distribution has a positive skew, there will be more probability to the right of the mode than to the left. A negative skew (e.g. a Weibull with shape parameter >3.6) yields more probability to the left of the mode. The probability distribution for climate sensitivity is strongly skewed right. …
You, I, and the IPCC agree that, given limits of 2.0ºC to 4.5ºC, 3.0ºC is the most probable value. If we take the exact midpoint between 2.0 and 4.5, we get 3.25ºC. Therefore the peak of the distribution, at 3.0ºC, skews to the left of 3.25ºC (i.e., to the left of the midpoint between the limits).
That result is easy to calculate. LOG10(2.0) = 0.301030 and LOG10(4.5) = 0.653212. Average them to get 0.477121. The inverse log of 0.477121 = 3.0.
Thus, it appears that values closer to 2.0ºC are slightly more likely than those closer to 4.5ºC, yet you say the probability distribution “is strongly skewed right”, which seems incorrect because the skew is neither to the right nor strong. Please explain.
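For what it is worth, the arithmetic above is easy to check: averaging the base-10 logs and taking the antilog is exactly the geometric mean. And if one assumes, purely for illustration, that the 2.0-4.5 range marks a lognormal's +/-1 sigma interval in log space, one can see what "skewed right" means: the mode sits left of the mean even though the thicker tail is on the high side.

```python
import math

# Check of the calculation above: averaging base-10 logs and taking the
# antilog is exactly the geometric mean, sqrt(2.0 * 4.5) = 3.0.
lo, hi = 2.0, 4.5
arith_mid = (lo + hi) / 2                                  # 3.25
geom_mid = 10 ** ((math.log10(lo) + math.log10(hi)) / 2)   # 3.0

# Illustration only: suppose the 2.0-4.5 range marks +/-1 sigma of a
# lognormal in log space. Then the mode sits LEFT of the mean, which is
# what "skewed right" means: the thicker tail is on the high side,
# even though the peak is below the arithmetic midpoint.
mu = (math.log(lo) + math.log(hi)) / 2      # log-space center
sigma = (math.log(hi) - math.log(lo)) / 2   # log-space half-width (assumed)
mode = math.exp(mu - sigma ** 2)            # about 2.55
mean = math.exp(mu + sigma ** 2 / 2)        # about 3.26
print(f"mode {mode:.2f} < median {geom_mid:.2f} < mean {mean:.2f}")
```

So "the peak is left of the midpoint" and "the distribution is skewed right" are not in conflict; for a right-skewed distribution they are the same statement.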
Continuing Ray #300:
Frankly, that is one thing that is very interesting about the latest estimate from Science–although that conclusion seems to derive from their ocean sensitivity estimates. …
You appear to be referring to a paper in the prestigious journal Science. Hank Roberts (#289) linked to an interview with a co-author that includes the graph I think you are referring to. (It is the third figure down in the link.)
The Green (Land) curve peaks around 3.5ºC and extends from about 1ºC to 5ºC. The Blue (Ocean) is multi-modal (multiple peaks) with the highest around 2.2ºC and extending from 1ºC to a bit over 3ºC. The Black (Land & Ocean) has two large peaks around 2.2ºC and 2.5ºC, with minor peaks at around 2.9ºC, 1.4ºC, and 1.7ºC. All three curves skew to the right, which supports your contention that higher values are more likely than lower.
The co-author states CO2 sensitivity “…is “likely” (66% probability) to lie between 1.7 and 2.6 degrees, and “very likely” (90% probability) to lie between 1.4 and 2.8 degrees, with a best estimate of around 2.2 or 2.3 °C.”
As I said earlier in this thread (#284) “a range of values usually indicates the +/- 1σ (Greek letter sigma, for standard deviation) for that variable.”
The +/- 1σ range of the normal curve contains about 68% of the area of the curve, which includes the “likely” limits of the range, and the +/- 2σ range contains about 95% of the area, which includes the “very likely” limits. (The co-author of the Science paper is quoted as saying 66% and 90%, close but not identical to +/- 1σ and +/- 2σ, but he may have been misquoted.)
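The 68% and 95% figures quoted above for the +/-1σ and +/-2σ ranges of a normal distribution follow directly from the error function; a quick check:

```python
import math

# Coverage of a normal distribution within +/- k standard deviations,
# via the error function: P(|Z| <= k) = erf(k / sqrt(2)).
def normal_coverage(k):
    return math.erf(k / math.sqrt(2))

print(f"+/-1 sigma: {normal_coverage(1):.4f}")   # ~0.6827, i.e. ~68%
print(f"+/-2 sigma: {normal_coverage(2):.4f}")   # ~0.9545, i.e. ~95%
# So the quoted 66% / 90% intervals are close to, but not exactly,
# the +/-1 and +/-2 sigma ranges of a normal curve (and of course the
# sensitivity distribution need not be normal at all).
```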
This result, if correct, tells us that CO2 sensitivity is significantly different over land and ocean and there also seem to be different peaks associated with NH and SH (more land in NH than SH) and perhaps tropics, temperate, and polar regions.
Continuing Ray #300:
… And on ethanol, that was never a climate or energy program, but rather a disguised agricultural subsidy. …Comment by Ray Ladbury — 25 Nov 2011 @ 9:35 PM
On this you are absolutely correct! When government gets involved, the strongest special interests will usually prevail, in this case Big Agriculture.
Ira Glickstein, from what you wrote just above, it seems you looked at the picture on the page I cited, but didn’t understand the accompanying text.
Look at the text again — it’s not intuitively obvious what “the ‘fat right tail’ of the climate sensitivity distribution” means, and they assume the reader has some idea what they’re talking about. You don’t yet.
It’s not intuitively obvious why those various curves appear in “the IPCC’s climate sensitivity range” discussions, or what they mean. Again the authors there assume the reader has a lot of background that’s still ahead of you. You can get there, but lecturing about how it’s done where you came from isn’t helping.
James Annan’s several posts and papers offer a critique he and others hope will improve the _next_ IPCC Report’s explanation.
You won’t understand if you insist words have your personal meanings. Climate sensitivity isn’t a normal distribution. 2xCO2 isn’t twice today’s value. Likely other assumptions you’re making are also inconsistent with the subject you’re trying to discuss.
Find out what the words mean in the subject you’re trying to learn. It’s the same chore for all of us, the FAQ section linked at top of page helps.
“… you will often see estimates of the uncertainty in T2x expressed as a range between a low value and a high value. For example, the most recent assessment of the International Panel on Climate Change states that the “likely” range is from 2 to 4.5 degrees Celsius. But here’s where things get more complicated and more worrisome: That range is itself not well-defined. More extreme results (outside this range) cannot be ruled out and therefore just focusing on those two numbers can be misleading.
A more comprehensive way of expressing the uncertainty range in T2x is in terms of its so-called probability function …. The figure below provides a rough illustration of what that plot typically looks like — the larger the value of the plotted probability density for a given value of T2x, the more likely that T2x is….” http://nicholas.duke.edu/thegreengrok/graphics/feedbacks-graph
Many of us regular nonscientist readers here try to hang on to help new readers find the basics, and to help silent readers watching learn from the questions and assertions others put into the text — many of them repeated over and over.
Comment by David B. Benson — 26 Nov 2011 @ 8:46 PM
Hank Roberts #309: Thanks much for the links, particularly the second one about the possibility of a “fat tail” to the right. That helps me understand what Ray was getting at a couple rounds ago.
Please straighten me out on the effect of a different starting point for T2x. I was under the impression that it applied to any reasonable starting point. In other words, the expected equilibrium temperature rise going from the pre-industrial 270 to 540 ppmv is about the same rise as going from the current 390 to 780 ppmv, all else (natural variations, etc.) being equal. Is that correct, and, if not, why not?
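On the starting-point question: if one uses the common simplified expression for CO2 forcing from Myhre et al. (1998), dF = 5.35 ln(C/C0) W/m^2, the forcing depends only on the concentration ratio, so any doubling, whether from 270 or from 390 ppmv, produces the same forcing. A sketch (the approximation itself has limits, but it illustrates why T2x is quoted per doubling rather than per ppmv):

```python
import math

# Simplified CO2 forcing expression (Myhre et al. 1998):
#   dF = 5.35 * ln(C / C0)   [W/m^2]
# Forcing depends only on the concentration RATIO, so any doubling
# yields the same forcing regardless of the starting point.
def co2_forcing(c_new, c_old, alpha=5.35):
    return alpha * math.log(c_new / c_old)

f_preindustrial = co2_forcing(540.0, 270.0)   # 270 -> 540 ppmv
f_modern = co2_forcing(780.0, 390.0)          # 390 -> 780 ppmv
print(f"{f_preindustrial:.3f} vs {f_modern:.3f} W/m^2")   # both ~3.71
```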
Also, when I see a multi-modal curve, such as the third figure in the interview with the co-author of the Science paper you linked to, I assume it means that more than one “population” is involved. For example, if someone did a random survey of heights of people, and ended up with a peak at around 3 feet and another one at around 5.5 feet, I would assume the survey was done on a mixed population, such as grade school students and the parents who were picking them up.
Does that mean that the Science Land & Ocean curve, which has five fairly well-defined peaks, is telling us that T2x varies significantly between Land and Ocean, and within Ocean between the tropics, temperate, and poles? If so, how does that change our predictions as Mauna Loa CO2 levels continue to increase 2 ppmv or more each year?
I appreciate that each domain of application of statistical analysis has its own lingo, and that my experiences in Avionics systems may lead me to make assumptions that are unjustified in the domain of climate science.
I would agree with you that most people take AGW on trust, although to say they “…don’t know what they believe in” suggests their trust should be better grounded. Your comment (and @397 on previous thread) seemed to apply this to the general public as (@397) you go on to liken this ‘not knowing’ to the disinterested and the deniers. The general public’s ignorance of AGW is not unique. The public are also very good at ignorance in most sciences and many other areas of quite basic learning. So in terms of the public, I fear your comment is not saying much and your criticism of scientists in this regard is misplaced.
More narrowly, your comment is perhaps less true but more pertinent. With folk who make AGW their business – activists, educators, the media – there is often more ‘enthusiasm’ than knowledge on offer. I wouldn’t say this applies to “most” of these people. It probably applies to “too many” of them.
But in their defence these folk have been dealt an awkward card. AGW is about climatology which deals with an exceedingly high level of complexity and an uncertain future. And that is not all. AGW also deals with the present human activities causing AGW, the technology available (or not available) to combat AGW and their impact on our future society/economy, plus the political processes that can make it all happen.
So if I encounter an activist who totally misinterprets, say, the economic impact of the UK’s Renewable Obligation Certificates I do not lay into them for being out of their depth (although I likely would if they were of the “skeptical” school of thinking). AGW is such a large pond we are all out of our depth somewhere or other so some level of trust, of not actually knowing what we “believe in”, this is required by all of us.
Of course there are many activists etc who entirely ignore being out of their depth and, worse, are entirely ignorant that the people they are believing/trusting in are also out of their depth – that is far less forgivable and that is why I will lay into such folk.
Ira, I take it from your post that you don’t have much experience with probability distributions other than Normal. That is not unusual. I am in the process of preparing a lecture on various distributions encountered in radiation and parts engineering.
Do you remember from probability class the concept of a distribution’s moments? Integral x P(x) dx is the first moment, aka the mean, xbar. The variance measures width and is related to the second moment centered on the mean:
Integral P(x) (x - xbar)^2 dx.
The skew is related to the third moment centered on the mean and normalized to the variance/standard deviation. It measures which of the distribution’s tails are thicker. Why do we care about thick tails? In a thick tailed distribution, the probability of x goes to zero much more slowly than would a normal distribution. Now if the right tail is thicker than the left, that means the mode of the distribution must be to the left of the mean–and much more of the probability may be on the right side of the distribution than the left. This corresponds to positive skew. The situation is reversed for a negatively skewed distribution–you get the mode to the right of the mean and more of the probability is to the left of the mode. CO2 sensitivity has a pretty strong rightward skew–I think David Benson and I fit a histogram of the estimates one time and got a lognormal standard deviation of about 0.35–sound about right, David?
The fourth moment is related to the kurtosis, which measures the relative amounts of the distribution in the peak and in the tails – positive means heavy tails (relative to the normal) and negative implies the tails go to zero more rapidly than a normal.
There is a whole branch of probability that looks at the behavior of the tails of a distribution–Extreme Value Theory. It proves that the extremes either go to zero at some finite value, approach zero exponentially or approach zero as a power law.
Nerd humor alert. The Cauchy distribution approaches zero approximately as x^-2, so none of the moments exist (e.g. they all diverge). What is the Cauchy distribution’s least favorite question? “Got a moment?” ba-dum-dum
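Ray's point about positive skew can also be seen numerically. A sketch: sample a lognormal using the log-space standard deviation of ~0.35 he mentions (the median of 3.0 is arbitrary, chosen only for illustration) and compute the first three sample moments; the skew comes out clearly positive, with the mean pulled to the right of the median.

```python
import math
import random

random.seed(0)

def sample_moments(xs):
    """Mean, variance, and (biased) sample skewness."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    skew = sum((x - mean) ** 3 for x in xs) / n / var ** 1.5
    return mean, var, skew

# Lognormal with log-space sigma of ~0.35 (the figure Ray mentions);
# the median of 3.0 is arbitrary, for illustration only.
sample = [3.0 * math.exp(random.gauss(0.0, 0.35)) for _ in range(100_000)]
mean, var, skew = sample_moments(sample)
print(f"mean {mean:.2f}, sd {math.sqrt(var):.2f}, skew {skew:.2f}")
# Positive skew: the right tail is thicker, and the mean is pulled
# to the right of the median (3.0).
```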
Hey Guys, let’s cut Jon some slack. I think he’s trying to play good cop. I don’t think he meant to imply that the average RC denizen doesn’t know why they believe the science–but RC’s clientele is hardly average for the population. A lot of folks may not fully understand the science, but assume that the people who have made it their life’s work to understand climate might. They realize they aren’t experts, so they are at least not subjects for a Dunning-Kruger study.
On the denialist side, we have mainly Dunning-Kruger superstars and ideologues who reject the science because their ideology doesn’t have an answer for it.
On multimodality: The causes of multimodality are complicated. Sometimes, you do have two distinct populations. Sometimes, however, the response of a single population is determined by competing effects. Subtle changes can affect which effect dominates and result in very different (e.g. multimodal) response even though the characteristics of the population are changing continuously.
I saw this recently in some test data for radiation degradation in some transistors–as long as the collector-to-emitter current was high, you had a single population. However, when the C-E current dipped down into the hundred-microamp range the population split into two (at least) modes.
The Schmittner study looks interesting. It is clear that the lack of a thick, high-end tail depends on the ocean reconstruction, and that is very complex. The ocean estimate is also where I’d expect the largest errors, since the response of the oceans is much slower than that of land. Since 3/4 of the planet is oceans, it is not surprising that it dominates. However, since the predominant negative feedback in the climate system is thermal IR radiation, it seems odd to me that you’d get drastically different sensitivities.
That’s the hypothetical case — where a megalomaniac climate scientist experiments by instantaneously doubling the amount of CO2 in the atmosphere, holding all else the same, and then waits out the resulting centuries of temperature change to see where the resulting temperature rise levels off.
So you’re beginning to question your assumptions rather than state them as facts, and to realize you’ve been asking FAQs here while lecturing elsewhere based on mistaken assumptions.
This isn’t unique to you. Most of the folks who answer the basic FAQs here over and over aren’t working climate scientists, we’re readers who got here a few years before you did. For myself a lot of what I knew was wrong, and a lot of the science is beyond me, and so I try to find simple clear words to paraphrase what I read, and point to sources, and keep up with the research abstracts at least, and try to remember that new research doesn’t overthrow, it extends (maybe) what we know.
That’s what the IPCC is doing too, on a somewhat larger and more professional scale, eh?
You’ve picked one of the big questions. The more detailed the discussion the more you need a page of your own (by custom this November ‘Unforced Variations’ will roll over to a new one for December after a while).
If you start reading and checking your assumptions it’ll take you a while; it took me several years, and I don’t think I was particularly slow learning (and unlearning what I thought I knew).
You won’t find me trying to explain probability and climate sensitivity; at most I can point to writers I’ve learned to trust by verifying what I could follow and trusting others’ trust where I couldn’t.
Read James Annan’s blog on the subject and his papers and discussion criticizing that same chapter and think about how it ought to be written for the AR5.
Read topics and FAQs here — use the “Start Here” button and the search box.
Threads like this are good for answering wrong assumptions people come in stating. Many of us have typed much of this repeatedly. Once you know to check your assumptions, and how to find the published work, you can go beyond these open threads, I’d suggest, and ask better-informed questions.
You are getting answers from real working scientists here; pay attention to their answers. Me, I’m just some guy pointing to the FAQs and sources.
MARodger. The arguments about activists and educators not being fully up to speed on climate science apply equally to all general advocates and educators on items like public health, for example. We don’t expect Ms Jones from the Local Health Group who goes around doing presentations on heart health or allergies or skin cancer or managing diabetes to be fully trained as an endocrinologist or allergist or dermatologist.
So we shouldn’t expect that an environmentalist or a solar power advocate or a sustainable gardens presenter needs advanced qualifications in the various sciences underlying the material presented.
It all comes down to trusting expertise. Unfortunately, to trust expertise you first have to acknowledge it. This is where people like the anti-vaccination crowd and climate/ acid rain/ ozone depletion deniers come in. They balk at the very first hurdle.
They come up with all sorts of rationalisations and excuses, but that’s what it boils down to. They can’t face acknowledging that someone, many someones, may be qualified by virtue of training, experience and proven performance to have their statements taken as an accurate description of facts. When you realise that some people refuse to accept that deaths from whooping cough occur (and increase) in areas where vaccination rates fall, you have to face it that some people are willing to suspend all claims to rational thought when facts contradict their ideological preconceptions.
And a health advisor doesn’t need to be an epidemiologist or a statistician to present the bald proposition that decreasing vaccination rates will certainly result in illness, disability and death for some children. They only need to cite the figures prepared by others.
Same thing goes for people presenting information on climate issues. They may have a particular point to make about effects on crops or animals, or on the virtues of various renewable power technologies, or even on political options for taxes or fee and dividend or whatever. They don’t need to be climatologists or economists or ecologists or agriculture experts themselves to present valuable information to others. They just need to collect the best information they can from the real experts.
PS for Ira, when there’s no FAQ, I try to ask a question in a likely place, in a way interesting enough that someone knowledgeable may be tempted to take time.
For example that double peak in Fig. 3 you ask about above. I’d thought I understood it, but your posting above made me wonder again. So I asked (the old Usenet way, post what you think and await correction).
PPS for Ira, re where to ask more questions:
Nathan Urban wrote at the above-linked interview page:
“… I don’t want to give a tutorial on climate sensitivity here; for that, see my earlier interview on the Azimuth Project and Planet 3.0 blogs. In particular, I recommend reading the discussion of “feedback effects”, which can combine with greenhouse or other warming to cause more (or less) warming than the greenhouse effect alone.”
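The “feedback effects” Urban mentions are often summarized with the textbook linear-gain relation; here is a minimal sketch (my own illustration, nothing from the interview itself; the ~1.2 K no-feedback response and the example gain of 0.6 are standard round numbers):

```python
# Minimal linear-feedback sketch (textbook relation, not from the interview).
# The no-feedback (Planck-only) response to doubled CO2 is roughly 1.2 K;
# feedbacks with net gain f amplify (or damp) it.

def equilibrium_warming(delta_t0=1.2, f=0.0):
    """Warming per CO2 doubling (K) with net feedback gain f (f < 1 for stability)."""
    if f >= 1.0:
        raise ValueError("f >= 1 implies a runaway response")
    return delta_t0 / (1.0 - f)

print(equilibrium_warming())                  # no feedbacks: 1.2 K
print(round(equilibrium_warming(f=0.6), 1))   # net positive feedback: 3.0 K
```

A negative net gain (f < 0) yields less than 1.2 K, which is how “more (or less) warming than the greenhouse effect alone” arises.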
“It all comes down to trusting expertise. Unfortunately, to trust expertise you first have to acknowledge it. This is where people like the anti-vaccination crowd and climate/ acid rain/ ozone depletion deniers come in. They balk at the very first hurdle.”
Which is where a little due diligence comes in. It would be more problematic if there were rational grounds for controversy. But in these cases, some simple fact checking, and asking some hard questions about the logic of the rhetoric, should fairly quickly lead one at least into the ballpark of where the more reliable professionals are.
Unfortunately too many people have a knee jerk reaction to “Authority”. So ironically they defer to authorities who puff and blow about being mavericky outsiders, down to earth real people who challenge the innate meanness of authority driven institutional bureaucracy (which in pop culture is always evil, so you instantly know who to root for and who the black hats are). It’s the ultimate short cut and therefore very attractive.
Ray Ladbury said, “However, since the predominant negative feedback in the climate system is thermal IR radiation, it seems odd to me that you’d get drastically different sensitivities.”
Why would it be odd? CO2 can only return a portion of the energy it absorbs, and only in its spectrum. The energy model that Schmittner appears to be using accounts for the available energy, which provides the energy for CO2 return. The energy available in the ocean for CO2 return varies much less than that over land. The Northern Hemisphere, with its greater potential change in albedo and Gulf Stream heat transport, would have a greater temperature response than the Antarctic, which is much more stable. CO2 can only return energy, not manufacture it.
with the big caveat that I haven’t actually read the paper yet (I have it on request; for some reason my university access won’t give it to me)…
I don’t think the land-ocean diagram (figure 3 of the Nathan Urban interview) means different climate sensitivities of land vs. ocean; rather it means different global climate sensitivities implied when you use land or ocean data (and thus, the two curves really should overlap much more to be believable). The whole point of a lot of these perturbed physics ensembles is to generate a simulation with a lot of (usually subjectively chosen) parameter variation (that affects things like clouds, etc) and thus the ensemble is meant to contain many members which probe a large range of uncertainty. Then, observations (of the LGM or industrial climate) are used to constrain which subset of those models are allowable.
In some studies (e.g., Stainforth et al., 2005) you get a distribution that is very wide (accepting sensitivities as high as ~11 C) indicating that they probably admit members a bit too easily.
This type of analysis is different from some of the studies that try to use just observational data (e.g., some of James Hansen’s papers, where you try to get an estimate of dT and the forcing, then take the ratio). The problem with these is that you need to assume a constant climate sensitivity between base states when trying to extrapolate the results to a 2xCO2 world. The ensemble method gets around this but then assumes that the change in feedback between two states is well simulated by the model.
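A hedged sketch of that ratio method (an illustration of the idea only, not Hansen’s actual calculation; the LGM numbers below are placeholders):

```python
import math

# Toy version of the ratio method: scale the observed dT per unit forcing up
# to the forcing of one CO2 doubling, F_2x = 5.35*ln(2) ~ 3.7 W/m^2.
# Assumes a constant sensitivity between base states, as noted above.

F_2X = 5.35 * math.log(2.0)  # W/m^2, simplified CO2 doubling forcing

def sensitivity_from_ratio(d_temp, d_forcing):
    """Implied equilibrium warming (K) per CO2 doubling."""
    return (d_temp / d_forcing) * F_2X

# Placeholder numbers: ~5 K of LGM cooling from ~6.5 W/m^2 of forcing.
print(round(sensitivity_from_ratio(5.0, 6.5), 1))  # ~2.9 K per doubling
```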
I think my reservation stems from the seeming inconsistency of saying that land and ocean sensitivities are sufficiently different to merit different pdfs but then saying you can recombine them to get a meaningful pdf for overall planetary sensitivity. If in fact the oceans (75% of the planet) warm significantly less than the land (25% of the planet), then won’t the majority of the increased IR radiation (and warming) be coming from the land?
Another issue I have with it is that we know that one sure way to get a low sensitivity is to assume a small time to re-equilibration (a la Schneider). We know that the time for the oceans to equilibrate is far longer than for the atmosphere/land, and exactly what that time is is less certain than for land. Could we be seeing an underestimate of the time to equilibrium?
The thing is that either effect would tend to force land areas to warm much more than oceans–and effectively, if you look at their land-only pdf, this is what we see. There’s a whole helluva lot more probability above 3 degrees per doubling than below. For land, where most of us live, the high thick tail is still there.
I think that it is a mistake to sell this study as diminishing concern. It either says that land will warm much more than oceans permanently, or that it will warm much more and then the land temperature will decrease as the oceans catch up (assuming they don’t belch out a bunch more CO2 in the process).
In our study, “climate sensitivity” is a global average quantity, and we don’t consider separate land or ocean sensitivities.
The green curve in the figure shows the global climate sensitivity if we analyze only land temperature data. This is different from “climate sensitivity on land”.
However, the land and ocean do warm by different amounts in response to doubled CO2, so you could define separate land and ocean sensitivities.
Ok, I don’t know if this is actually clearer, but the point is that we’re estimating a global sensitivity using different subsets of data, rather than estimating local sensitivities that apply to different parts of the Earth.
What I expected, certainly naively, was something like:
T2x(270) is likely between 2-4.5ºC, so T2x(390) is likely to be in the same range +/- 0.5ºC or so.
I’ve looked through the links you and others provided, and read some through, and now I feel like the guy who asked a clockmaker “What time is it?” and received a detailed explanation of “How to build a clock” – and I still don’t know what time it is!
I know the climate system is extraordinarily complex, way beyond what I ever dealt with as an Avionics System Engineer, but I suspect that global warming, while it needs to be informed by exacting climate science, is, to the larger human society, more of a system engineering problem. (Yeah, give a kid a hammer and everything starts to look like a nail, so my engineering background makes everything look like a system. :^) But hear me out, please.
A system engineering team has to come up with alternative system concepts and design implementations, estimate performance, cost, and schedule, and, to win the contract, propose a comprehensive solution the customer set judges to offer the most favorable cost/benefit.
Accepting the IPCC predictions, we (humanity – as represented by national governments and international groups like the IPCC, World Bank, etc.) have a limited range of alternatives for dealing with past, present, and coming warming and the consequences that flow therefrom:
1. Do nothing and muddle through as best we can, adapting to the floods, droughts, storms, crop failures, and other resultant disruptions.
2. Hope that #1 will be ameliorated as cheap sources of carbon fuels deplete, causing prices to rise and self-interest to lead humanity to avoid wasting energy, being more efficient, and adopting alternatives that are carbon-neutral (biomass) and/or carbon-free (water, nuclear, wind, solar, …)
3. Shame industries and people into voluntarily adopting green lifestyles.
4. Speed adoption of #2 via a Hansen/Krauthammer (and Ira :^) carbon tax at the mine, well, and port with 100% of the revenues distributed on an equal per-capita basis to citizens and legal residents. This will make carbon-neutral/free energy relatively less expensive, so smart industries and people will adopt them for their own selfish interests.
5. Mandate drastic, international cap & trade on carbon energy and provide taxpayer-funded R&D, subsidies and loan guarantees to favored alternative energy industries, until carbon usage declines.
#1 will inevitably cause millions of people to die (along with other animals). But those of us with money and guns will adapt. #2 may ease the pain somewhat. As for #3, yeah, right.
#4 is politically difficult because many of us think the money will be kept by governments and that powerfully connected industries will be granted exceptions. And, what are the chances that BRICs (Brazil, Russia, India, China) and other rapidly modernizing countries will agree?
#5, even if it were politically possible on an international scale, which it certainly is not, may not be technically feasible without, at least in the short term, destroying modern, fuel-dependent industry and society as we know it.
So, what will it be?
[Response: This really isn't the right blog to be discussing policy options. That's not what we're experts in. My two cents here is that this is a politically vexing issue but that there are plenty of people with suggestions about how to do something along the lines of 4, 5, but without doing anything 'drastic'. And indeed you are forgetting 6 which is to invest in innovation and 7 which is to stop drastically subsidizing the fossil fuel industry. Suggesting as you do that our only options are in effect to 'do nothing' or to 'end society as we know it' are, well, alarmist. Vexing problems are vexing. That's no reason to either deny them or ignore them. --eric]
Ira, to a first approximation, a doubling from 270 to 540 ppmv will raise temperatures by ~3 degrees, assuming we started at equilibrium at 270 ppmv. Likewise for a doubling starting from 390 ppmv.
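Ray’s point, that the starting concentration barely matters, follows from the (approximately) logarithmic CO2 forcing. A minimal sketch, assuming the simplified forcing formula F = 5.35·ln(C/C0) and an illustrative 3 K per doubling:

```python
import math

# With logarithmic forcing, equilibrium warming depends only on the ratio
# C_new/C_old, so a doubling from 270 ppmv and a doubling from 390 ppmv give
# the same answer. 3.0 K per doubling is an illustrative mid-range sensitivity.

def warming(c_new, c_old, s_per_doubling=3.0):
    """Equilibrium warming (K) for a change from c_old to c_new ppmv."""
    return s_per_doubling * math.log(c_new / c_old, 2)

print(warming(540.0, 270.0))  # 3.0
print(warming(780.0, 390.0))  # 3.0
```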
As to the system-engineering approach–first you have to get people to accept that there is a problem that needs fixing.
I am not as sanguine as you that guns and money will carry the fortunate through what is to come if we do nothing.
As to the BRIC response–they stand to lose as much as or more than we do if we wreck the climate as human population reaches its crest. What is more, these economies are strongly export driven–tariffs or the threat thereof could be strongly persuasive.
If we had started to address the unsustainability of global human civilization back in the ’70s when it became glaringly obvious, we might have had a chance with your option 3) + Gavin’s options 6) and 7). As it stands now, it will have to be a balls-out race to develop and deploy new technology before the response of the planet renders all our actions moot. If we rise to the challenge, we could be in for some very exciting times, and a reasonable probability–though not certainty–of heading off the worst outcomes.
Unfortunately, I still see more people interested in debating the validity of science that has been accepted since before Darwin, Maxwell and Boltzmann. That is not a hopeful sign.
My original interpretation was correct; the authors are not looking at two different sensitivities (land or ocean), which made little sense when I saw some people talking about that. I haven’t read any of the blog commentary apart from the Nathan Urban interview, so I apologize if any of this is repeating itself:
The type of analysis done by this paper has already been done before (e.g., Schneider von Deimling et al., 2006, “Climate sensitivity estimated from ensemble simulations of glacial climate,” Climate Dynamics). It appears to me that the main motivation for publishing this one was the use of a different “observational” dataset, one that people seem initially skeptical of because it allows seemingly bizarrely low amounts of temperature change between the modern and LGM (closer to 2-3 K of cooling than the 5-6 K that most people believe). In fact, the “best estimate” of the Schmittner paper still underestimates cooling in many regions, notably in Antarctica and in various mid-latitude regions.
The climate sensitivity results are also not robust to the data one believes the most. The problem is in essence similar to the deep-time greenhouse climate issue of the very small pole-to-equator temperature gradient. One of the big issues here is that if you force a model with a lot of CO2, you warm the tropics and the poles, but you don’t warm the poles enough to match observations. If you put more CO2 in, then the poles get warm enough, but then the tropics are too warm (some improvements in deep-time proxies have made the tropics warmer, but this is still an open issue).
Somewhat similarly, the idea of this study is to see which sensitivity (out of a wide ensemble of possible sensitivities, based on spanning a large parameter space) best generates agreement with observations. Thus, for example, if you “allow” a model with a very high sensitivity, then the agreement is very poor, since the changed boundary conditions then result in something similar to a snowball Earth (which is clearly not in agreement with LGM observations). If you use a sensitivity too low, then the LGM is not cold enough to match observations.
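As a toy illustration of that accept/weight logic (emphatically not the Schmittner et al. code; the linear sensitivity-to-cooling rule and the 5.5 ± 1.0 K “observation” are made up for the sketch):

```python
import math
import random

# Toy ensemble constraint: draw "members" with different sensitivities S,
# map each to a fake simulated LGM cooling with a crude linear rule, then
# weight members by agreement with a made-up observation of 5.5 +/- 1.0 K.

random.seed(0)

def simulated_lgm_cooling(s):
    # Toy rule: LGM forcing ~ -8 W/m^2 vs. F_2x ~ 3.7 W/m^2 per doubling.
    return s * 8.0 / 3.7

def weight(s, obs=5.5, sigma=1.0):
    resid = simulated_lgm_cooling(s) - obs
    return math.exp(-0.5 * (resid / sigma) ** 2)

members = [random.uniform(0.5, 10.0) for _ in range(10000)]
weights = [weight(s) for s in members]
posterior_mean = sum(s * w for s, w in zip(members, weights)) / sum(weights)
print(round(posterior_mean, 2))  # near 5.5 * 3.7 / 8 ~ 2.5 K per doubling
```

Very high sensitivities get near-zero weight (their toy LGM is far too cold, the snowball-Earth failure mode), and very low ones get near-zero weight for being too warm.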
The problem in the Schmittner paper is that there are really no “good” agreements with observations- even if you match observations globally with your model, you don’t match a lot of the spatial variation (this is a problem of a simple model, but the observations are incomplete and almost certainly problematic).
Of course, how the boundary conditions during the LGM (relative to modern) were different is also not well known. We can estimate GHG concentration changes well, but other things (like aerosol and vegetation change) are not well known. Even things like the correct height of the ice sheet have impacts on local temperature, because you need to correct for the fact that an extra mile-high ice sheet might give you 10 C of local cooling because of adiabatic cooling of air being advected over the region. (Note that the presence of ice sheets is not typically considered an ice-albedo feedback, but rather an imposed boundary condition, and is thus the primary forcing- along with GHGs- driving the LGM-to-modern temperature difference. Hansen’s preferred method of using observations does the same thing.)
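That elevation correction is just a lapse-rate calculation; a quick sketch, assuming a standard ~6.5 K/km mid-tropospheric lapse rate (the exact value any given study uses will differ):

```python
# Lapse-rate sketch for the ice-sheet elevation correction described above.

LAPSE_RATE = 6.5  # K per km, a standard mid-tropospheric value

def elevation_cooling(height_km):
    """Local cooling (K) from air advected up over an ice sheet of this extra height."""
    return LAPSE_RATE * height_km

# An extra "mile-high" ice sheet (~1.6 km):
print(round(elevation_cooling(1.6), 1))  # ~10.4 K, consistent with the ~10 C above
```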
In my above post, I also forgot to touch on using these studies as a template for the future. It’s also a good idea to keep in mind how they define what is a forcing and what is a feedback. The Schmittner paper, for example, defines vegetation changes as an internal feedback, and thus they don’t need to worry about the uncertainty in surface (vegetation) albedo forcing that other papers do (like Kohler et al 2010). It’s not obvious whether the feedbacks in one climate state are easily translatable to a future CO2 world, though.
The problem is in essence similar to the deep-time greenhouse climate issue of the very small pole-to-equator temperature gradient. One of the big issues here is that if you force a model with a lot of CO2, you warm the tropics and the poles, but you don’t warm the poles enough to match observations. If you put more CO2 in, then the poles get warm enough, but then the tropics are too warm (some improvements in deep-time proxies have made the tropics warmer, but this is still an open issue).
[Response: This whole 'can't capture the gradient' stuff is somewhat dead as a research topic now, because the latest evidence says that the tropics actually did warm a lot, and getting the gradients to match the observations is just not that out of the ordinary for mid-range sensitivity GCMs. See the paper here (open access): The early Eocene equable climate problem revisited by Huber and Caballero (2011) in Climate of the Past.--eric]
Comment by Pete Dunkelberg — 27 Nov 2011 @ 11:49 PM
Dr. Steig- thanks for the citation! I hadn’t seen that paper before, but it looks like the sort of thing that will become a standard reference. I will need to read it this week. I wish I had more time to follow up on all the latest developments in that field, it’s very neat.
Thanks thingsbreak– I did a blog post on the Kump and Pollard paper a few years ago, which talked about aerosol responses that could help “the problem” (if there is one anymore). That paper, at least, relates to the Cretaceous, though there’s no reason it couldn’t apply to the Eocene too, and I’ve seen it cited for more general application. There are, of course, a lot of interesting papers that discuss these types of theoretical and “plausible-but-unconstrained” feedback effects that might be “missing from models.” Some of this might affect the relative pole-to-equator differential. Kerry Emanuel has some work on how hurricanes might act as a sort of “tropical thermostat” in a deep-time application, but a lot of these things are no better than good hypotheses. I find them very interesting to think about, though, even if they prove to be impossible to test.
The people who develop proxies are bright, though, and some of the stuff they think of dazzles me, so I wouldn’t discount much as being untestable. I still think there are a lot of interesting questions related to the pole-to-equator temperature gradient. Of course, just because you can get a model to work doesn’t mean you understand the physics behind why it works, and that is where I still think there might be cool research to be done.
Dallas @321 might have his little joke (@324 & @329), but I don’t think future generations will be laughing when AGW begins to bite. Of course, skeptics are also having a good chuckle at present in the aftermath of the second batch of leaked e-mails, now two years old.
Strangely, there is no sign from skeptics about the freshly leaked UEA e-mails. Apparently they aren’t as serious (not by a long chalk), but some type of gag notice is preventing their mention in the media. The link below gives the info that can be published. I think this account goes a bit too far, but I’m told that does strengthen their legal position.
Book: “FOOL ME TWICE; Fighting the Assault on Science in America” by Shawn Lawrence Otto; Rodale Books
“When speaking about science to scientists, there is one thing that can be said that will almost always raise their indignation, and that is that science is inherently political and that the practice of science is a political act. Science, they will respond, has nothing to do with politics. But is that true?”
“But beyond that, science constantly disrupts hierarchical power structures and vested interests in a long drive to give knowledge, and thus power, to the individual, and that process is also political.”
The latest war-cry propaganda masterpiece from skeptics:
“no warming since 2002”…
Really? Are you sure? Peek at what talking about cooling looks like: http://eh2r.blogspot.com/.
I fear the warming is coming faster than the time we have to build support for action against it.
High Arctic small glaciers are melting at an unimaginable pace, at a place with no industries, under a quiescent, relatively inactive sun, where it is sunless, in deep darkness for three months, with no proximate heat from any warm ocean.