Sashka, in regards to your second question, Seager’s work does not imply that the Gulf Stream does not contribute to Europe’s climate. It does imply that the Gulf Stream does not significantly contribute to the asymmetry in temperature between the East Coast of the US and Western Europe.
Can someone explain to me how oxygen isotope ratios are affected by salinity? I can pretty well understand how temperature affects the ratio but can’t get a grasp on how salinity and hence density enters the picture.
[Response: The relationship is somewhat indirect, and results from the fact that there is an isotopic fractionation involved in the evaporation of water from the ocean surface, whereby the lighter isotopes of oxygen are preferentially evaporated, increasing the proportion of the heavier isotopes left behind. Salinity is also influenced by the rate of evaporation (more precisely, by the difference between the evaporation and precipitation rates and, in some instances, the added effects of coastal runoff), and hence there is a relationship between salinity and the oxygen isotopic content of ocean-dwelling organisms that produce calcareous shells or skeletons. There is a very nice discussion of this available at this educational website. – mike]
Thanks Mike. I can see now how it works in the Gulf of Mexico, where there would be no unexpected flush of fresh water. The method wouldn’t be as applicable in areas further north, where there are large flushes of fresh water from very rapid ice melting or ice-dam bursts.
Well, same observation here as in the historical climatology of Greenland discussion (unanswered question). How do we explain that the North Atlantic region is sensitive to a moderate fall in solar forcing (the LIA), amplified by the NAO or GS, but would not be very sensitive to a large increase in GHG forcing (AGW)?
[Response: The physical mechanisms in the two cases are not the same. In Delworth and Dixon, the increase in NADW is due to the increased wind forcing, potentially due to lower stratospheric/upper tropospheric forcing. For the long-term declines in NADW under increased GHG forcing, the changes are driven by warming and freshening of the upper ocean. Different physics, different response. -gavin]
A cooler 17th century in Australia identified here – the LIA? Anything to do with the Gulf Stream?
Five centuries of climate change in Australia: the view from underground. Pollack, H. N., Huang, S. and Smerdon, J. E. 2006.
J. Quaternary Sci., Vol. 21, pp. 701-706.
ABSTRACT: Fifty-seven borehole temperature profiles from across Australia are analysed to reconstruct a ground surface temperature history for the past five centuries. The five-hundred-year reconstruction is characterised by a temperature increase of approximately 0.5 K, with most of the warming occurring in the 19th and 20th centuries. The 17th century was the coolest interval of the five-century reconstruction. Comparison of the geothermal reconstruction to the Australian annual surface air temperature time series in their period of overlap shows excellent agreement. The full geothermal reconstruction also agrees well with the low-frequency component of dendroclimatic reconstructions from Tasmania and New Zealand. The warming of Australia over the past five centuries is only about half that experienced by the continents of the Northern Hemisphere in the same time interval.
#5 – Different forcings = different sensitivities. I don’t see that it requires a particular explanation, although understanding it would be interesting.
Furthermore, the “lack” of sensitivity of the North Atlantic to AGW would be due to a weakening of the MOC caused by extra freshwater (itself caused by AGW). This weakening of the MOC would weaken the Atlantic heat transport, imposing a cooling signal on top of the warming signal from AGW.
It doesn’t imply that the North Atlantic is particularly “stable” in response to AGW.
It’s also worth reiterating the point that the Greenland discussion was centred on a relatively small area of South Greenland, so comparing the stability of a small area with a larger area is not comparing like with like.
I would find it very hard to believe that a statistical tendency for lighter isotopes to be preferentially evaporated over heavier ones could be in any way meaningful unless a huge amount of data had been collected. For one thing, the forces that distinguish molecules purely by atomic mass are very weak indeed; and secondly, the hydrogen bonding shared by all molecules of water is many orders of magnitude stronger, so it would make any isotopic effects negligible.
[Response: Yes, I suppose the textbooks must all have got this wrong, and that we really have no explanation for the pronounced glacial/interglacial cycle in ocean sediment oxygen isotopes. – mike]
[Response: If, as suggested by several of the studies cited above, the changes in AO/NAO were largely forced (by a combination of solar and volcanic forcing), then several of the signatures of the ‘LIA’ (e.g. winter cooling in Europe) would be interpreted as a forced response. This is in contrast to the DO cycles, etc. which are generally (though not universally!) believed to represent an internal instability in the climate (analogous to e.g. ENSO), rather than a forced response of the climate system. The ‘LIA’ and DO events, moreover, took place under very different boundary conditions. There was a much greater amount of ice around during the last glacial period when the DO cycles were prominent, and it is unclear that similar such modes of internal variability exist in an interglacial period such as the Holocene. – mike]
Re 8 (by Russ Hayley):
A short comment on the physical chemistry behind isotope fractionation: Isotopic effects in chemical reactions are usually related to differences in vibrational energy levels, not directly to “the forces that hold molecules together”. The potential energy hypersurface of any molecular system (from which these “forces” arise) is indeed more or less independent of the atomic isotopes (though couplings between electronic and atomic degrees of freedom may have small effects). However, it should be remembered that (because of quantum, as Terry Pratchett would say) real molecules always have some degree of vibrational energy. Furthermore, the vibrational energy levels, including the lowest possible energy level (called the zero-point energy) are strongly dependent on the atomic masses. (Look up the formula for the energy levels of a quantum mechanical harmonic oscillator to see why.) For most common chemical reactions, vibrational contributions constitute at least some percent (often much more) of the free energies of reaction or the free energies of activation. Now, the thermodynamics (equilibrium constants) or kinetics (rate constants) of reactions are roughly dependent on the exponentials of these free energies. So the relatively small but non-negligible (maybe 0.1%-1%, much more for some reactions) differences between the reaction or activation free energies of two different isotopomers will have a significant effect on the equilibrium or rate constants which determine how much of which isotopomer is formed (or depleted, or whatever). This is pretty standard stuff, look up “isotope effects” in any decent kinetics textbook and you’ll find examples. For example for reactions involving proton transfer (a prime example of your “hydrogen forces”) exchanging normal hydrogen atoms for deuterium has a huge effect on all reaction parameters. 
(OK, that also has to do with quantum mechanical tunneling which is again a different issue – and incidentally even more strongly isotope dependent.)
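To put rough numbers on Theo’s zero-point-energy argument, here is a minimal Python sketch using a diatomic O-H oscillator as a stand-in for the relevant stretch mode. The force constant is an assumed, typical textbook value, and a single harmonic mode is of course a caricature of the full water molecule:

```python
import math

# Zero-point energy of a harmonic oscillator: E0 = (hbar/2) * sqrt(k/mu).
# Diatomic O-H model of the stretch mode; the force constant (~780 N/m)
# is an assumed, typical O-H value, purely for illustration.
hbar = 1.0545718e-34   # J s
k_B = 1.380649e-23     # J/K
amu = 1.66053907e-27   # kg
k_force = 780.0        # N/m (assumed)

def zpe(m_oxygen, m_hydrogen=1.008):
    """Zero-point energy (J) for an O-H oscillator with the given masses (amu)."""
    mu = (m_oxygen * m_hydrogen) / (m_oxygen + m_hydrogen) * amu  # reduced mass
    return 0.5 * hbar * math.sqrt(k_force / mu)

# 16O-H vibrates faster, so it sits higher in the well and escapes more easily.
delta_E = zpe(15.995) - zpe(17.999)          # positive: 16O-H has more ZPE
alpha = math.exp(delta_E / (k_B * 298.15))   # crude equilibrium fractionation factor
```

The resulting factor of a few percent overshoots the observed liquid-vapour 18O fractionation (around 1%), as one would expect from so crude a model, but it shows the key point: a sub-percent mass difference becomes a measurable equilibrium effect once it sits inside an exponential.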
NB. For physical (as opposed to chemical) reactions such as evaporation the main reason for isotope selectivity is probably more directly related to the mass of the molecules via kinetic energy terms, diffusion etc. (Trick question for Russ: if isotopic effects are negligible, why do diffusion constants depend on the molecular mass…?)
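On the trick question, the mass dependence of diffusion drops straight out of kinetic theory: at a given temperature, mean molecular speed scales as 1/sqrt(M). A two-line illustration with standard molar masses:

```python
import math

# Kinetic theory: mean molecular speed v ~ sqrt(3RT/M), so at the same
# temperature the lighter isotopologue moves (and diffuses) faster by
# roughly sqrt(M_heavy / M_light).
M_light = 18.011   # g/mol, H2(16O)
M_heavy = 20.015   # g/mol, H2(18O)

speed_ratio = math.sqrt(M_heavy / M_light)   # ~1.054: light water is ~5% faster
```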
“However, the dynamical response associated with the NAO/AO pattern, which sits on top of the radiative response, leads to greater cooling in some regions at the expense of less cooling (or even warming) in other regions.”
“Shindell et al (2001) alluded, however, to the possibility that a sustained negative phase of the NAO would lead to a minor weakening of the North Atlantic meridional overturning circulation,”
Since we are not yet sure how all of this works, how can the current generation of AOGCM multi-decadal temperature projections have much skill in regional climate prediction? The fluid dynamics governing these ocean circulation changes is a complicated matter. And that is not the whole problem, as we must then model how the associated changes in heat distribution affect the climate on regional scales. Do the models do a good job yet? People are interested in more than just mean sea level; they want to know whether lingonberries will grow in Sweden in 2070, or whether they might want to try growing date palms instead.
In Theo’s comment it needs to be pointed out that the relevant bond is the “hydrogen bond” between the hydrogen atom on one molecule and the lone pair of electrons on another. Hydrogen bonding is what makes water so interesting a fluid (and without it we would not exist either). Therefore his comment about the isotope effect occurring through the alteration of the zero-point energy and the vibrational frequency is correct.
A bit of history: when the Manhattan Project began, one of the sources of deuterium was water collected from large steam boilers that had been operated for decades without ever being flushed out; the water was slightly enriched in “heavy water” (deuterium oxide, D2O) because the lighter ordinary water (H2O) had boiled off slightly faster over those many years.
Tiny effects do always make a difference in reality, if not always in political policy.
A recent paper argues convincingly that forest regrowth in the Americas following the decimation of American Indian societies in the 1500s and 1600s is the reason for the 10 ppm decline in atmospheric CO2 around 1600 deduced from ice cores, which would have led to slightly lower temperatures globally:
Evidence for the Postconquest Demographic Collapse of the Americas in Historical CO2 Levels
Franz X. Faust, Cristobal Gnecco, Hermann Mannstein, and Jurg Stamm
ABSTRACT: This article promotes the hypothesis that the massive demographic collapse of the native populations of the Americas triggered by the European colonization brought about the abandonment of large expanses of agricultural fields soon recovered by forests, which in due turn fixed atmospheric CO2 in significant quantities. This hypothesis is supported by measurements of atmospheric CO2 levels in ice cores from Law Dome, Antarctica. Changing the focus from paleoclimate to global population dynamics and using the same causal chain, the measured drop in historic atmospheric CO2 levels can also be looked upon as further, strong evidence for the postconquest demographic collapse of the Americas.
[Response: The radiative forcing associated with a 10 ppm decline in CO2 from mean pre-industrial levels is quite small, and any temperature change would be dwarfed by the response to other forcings such as explosive volcanism and solar irradiance variations. An alternative argument [e.g. Gerber et al (2003)] holds that the observed small changes in pre-industrial CO2 concentrations, rather than forcing changes in surface temperature, were instead largely forced by increased terrestrial carbon uptake associated with temperature changes caused by a combination of the natural forcing factors (solar irradiance and volcanic forcing) mentioned above. -mike]
That looked like an interesting site, until I read this:
“Different atoms of the same element are called isotopes. All oxygen atoms have 16 protons and 16 electrons, but some oxygen atoms have 16, 17, or 18 neutrons in the nucleus. The most abundant isotopes of oxygen in seawater are oxygen sixteen (16^O) and oxygen eighteen (18^O).”
If I’m not mistaken, oxygen has only 8 protons (atomic number of 8), and 8, 9, or 10 neutrons (atomic mass of 16-18). It appears the author was confusing molecular oxygen and atomic oxygen. I was hoping to learn something from that site, but I think I will look elsewhere.
In any case, an interesting piece of work. If the GS axis depends on the NAO, could we imagine that the first factor you identify (Shindell et al.) influences the second factor (Lund et al.)? Solar forcing > AO/NAO change > GS shift northward/southward > THC response (more/less overturning). I don’t know whether such poleward shifts of the GS could have any influence on the subsequent circulation (the strength of the dense-water sinking).
Gulf Stream variability and ocean-atmosphere interactions
FRANKIGNOUL Claude; DE COËTLOGON Gaelle; JOYCE Terrence M.; DONG Shenfu
Journal of Physical Oceanography (J. Phys. Oceanogr.)
2001, Vol. 31, No. 12, pp. 3516-3529
Abstract: Time series of Gulf Stream position derived from the TOPEX/Poseidon altimeter from October 1992 to November 1998 are used to investigate the lead and lag relation between the Gulf Stream path as it leaves the continental shelf and the changes in sea level pressure, surface wind stress, and sea surface temperature (SST), as given by the NCEP reanalysis. The dominant signal is a northward (southward) displacement of the Gulf Stream axis 11 to 18 months after the North Atlantic Oscillation (NAO) reaches positive (negative) extrema. A SST warming (cooling) peaking north of the Gulf Stream is also seen to precede the latitudinal shifts, but it is a part of the large-scale SST anomaly tripole that is generated by the NAO fluctuations. There is no evidence that the Gulf Stream shifts have a direct impact onto the large-scale atmospheric circulation. A fast, passive response of the Gulf Stream to NAO forcing is also suggested by a corresponding analysis of the yearly mean Gulf Stream position estimated from XBT data at 200 m during 1954-98, where the NAO primarily leads the latitudinal Gulf Stream shifts by 1 yr. The fast Gulf Stream response seems to reflect buoyancy forcing in the recirculation gyres but, as the covariability remains significant when the NAO leads by up to 9 yr, large-scale wind stress forcing may become important after a longer delay. Because of the high NAO index of the last decades, the TOPEX/Poseidon period is one of unprecedented northward excursion of the Gulf Stream in the 45-yr record, with the Gulf Stream 50-100 km north of its climatological mean position.
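The lead-lag relationship described in that abstract can be probed with a simple lagged cross-correlation. A sketch on synthetic data (the 12-step lag is built in by construction, purely to illustrate the method):

```python
import numpy as np

# Lagged cross-correlation, the kind of analysis behind the "NAO leads the
# Gulf Stream shifts" result. Synthetic data: the "position" series is built
# to lag the "NAO" series by 12 steps, purely for illustration.
rng = np.random.default_rng(0)
nao = rng.standard_normal(600)
position = np.roll(nao, 12) + 0.5 * rng.standard_normal(600)

def lag_corr(x, y, lag):
    """Correlation of x(t) with y(t + lag); positive lag means x leads."""
    if lag > 0:
        return np.corrcoef(x[:-lag], y[lag:])[0, 1]
    return np.corrcoef(x, y)[0, 1]

# Scan lags 0-24 and pick the one with the strongest correlation.
best_lag = max(range(25), key=lambda lag: lag_corr(nao, position, lag))
```

On the synthetic series the scan recovers the built-in 12-step lead; on real altimeter and NAO data the same scan is what identifies the 11-18 month lead quoted in the abstract.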
PS : in your text, correction suggested : “An alternative view would suggest that >the< rather than the MOC being a consequence of radiative and subsequent atmospheric change…”
Sometimes new data has us taking a step back, not forward, in our understanding. It is important to find means of identifying the past activity of ocean currents. I also see the merit in an NAO/AO relationship with the LIA, but this does cause warming in some regions and cooling in others. When it comes to the LIA, let us not forget that the cooling signal recorded by glaciers, which are sensitive indicators, was not limited to the North Atlantic region. Glaciers experienced substantial LIA advances in the 17th to early 19th centuries that have been dated from Peru, Bolivia, Chile, Argentina, Nepal, Kazakhstan, New Guinea, New Zealand, Uganda, the Cascades from California to Washington, and the Rockies from Colorado to Alberta, though these advances need not be completely synchronous. Might regional forces still have the capacity to generate more widespread cooling?
Here is an interesting article on the issue of drought in the American West; it briefly touches on the little ice age periods as well, though the ‘megadrought’ precedes the LIA. The factors in the megadrought seem to be the opposite of those in the LIA, i.e. less volcanism and more solar forcing.
“For several hundred years until about 1400AD, at any one time, more of the West was experiencing drought than occurred for centuries after. This has been called a Medieval Megadrought. During Medieval times the American West was an even drier climate than the current already arid one. Apparently this very arid climate greatly stressed the social organization of Indian societies at the time, according to archaeologists. Around about the fifteenth century the climate shifted to being much wetter and remained that way for several hundred years, coinciding with the Little Ice Age, a period when climate was cooler than now around the world. There also appears to have been a return to a more drought-prone climate in the last two centuries.”
The fate of Western water supplies under global warming seems to depend on whether it results in more El Nino (warming of the eastern Pacific) or La Nina conditions (with a warmer western Pacific). Water supplies are already very tight and are a subject of political conflict. To continue from the article:
“Currently climate models are all over the map in how the tropical Pacific Ocean responds to rising greenhouse gases. The climate modeling group at Lamont has argued that rising greenhouse gases will warm the western tropical Pacific Ocean by more than the eastern ocean because, in the west, the increased downward infrared radiation has to be balanced by increased evaporative heat loss but in the east, where there is active upwelling of cold ocean waters from below, it is partially balanced by an increase in the divergence of heat by ocean currents. As such, the east to west temperature gradient increases and a La Nina-like response is induced. This is the same argument for why, during Medieval times, increased solar irradiance and reduced volcanism could have caused a La Nina-like SST response, as seen in coral based SST reconstructions.
If the Medieval period is any guide to how the tropical Pacific Ocean and the global atmosphere circulation responds to positive radiative forcing then the American West could be in for a future in which the climate is more arid than at any time since the advent of European settlement.”
[Response: In fact, a mechanism for this hypothesis was first demonstrated in this article in the Journal of Climate which I co-authored with Mark Cane, Steve Zebiak, and Amy Clement. We showed that the Cane-Zebiak model of the tropical Pacific coupled ocean-atmosphere system responds to estimated radiative forcing changes over the past 1000 years in such a way as to produce a tendency for “El Nino” conditions during the traditionally defined “LIA” and “La Nina” conditions during Medieval times. This appears consistent with coral evidence of El Nino changes in past centuries, and with evidence of drought changes in North America and Africa. -mike]
There appears to be a reasonable historical correlation between the Gulf Stream (and other associated currents), the global temperature signature, and past global warming/ice age patterns. Is there a rough consensus amongst climate scientists about the implications of this for the future, in layman’s terms? Or is it only something that will allow us to explain future changes after they happen? It appears to me that there is likely future “disruption” in the currents, which will have particular (predictable) regional climate outcomes.
Since the Gulf Stream (the narrow and intense western boundary current off the Florida coast) is primarily the result of a wind-driven gyre in the subtropics, I’m not really sure whether a geostrophic framework will provide the correct estimates of its intensity. Or have I missed something important?
[Response: Rasmus–it’s true that the western boundary currents strictly speaking are not geostrophically balanced–for example, non-linear terms appear to be important in explaining the boundary separation that takes place near Cape Hatteras. However, one can nonetheless obtain quite reasonable estimates of the current speeds and directions from geostrophic calculations based on the dynamic topography (the laterally-varying height of the ocean surface, which is the primary determinant of lateral pressure variations in the upper ocean). -mike]
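For readers who want the arithmetic behind such a geostrophic estimate: the surface speed follows from the sea-surface-height slope as v = (g/f) d(eta)/dx. A sketch with assumed, order-of-magnitude values for the Gulf Stream near Cape Hatteras (not measurements):

```python
import math

# Surface geostrophic speed from a sea-surface-height slope:
# v = (g / f) * d(eta)/dx.  The SSH drop and width are assumed,
# order-of-magnitude numbers for the Gulf Stream, not measurements.
g = 9.81                                          # m/s^2
Omega = 7.2921e-5                                 # rad/s, Earth's rotation
f = 2 * Omega * math.sin(math.radians(35.0))      # Coriolis parameter near Cape Hatteras

d_eta = 1.0        # m, assumed cross-stream SSH change
dx = 100e3         # m, assumed current width

v = g * d_eta / (f * dx)    # surface geostrophic speed, m/s
```

A ~1 m drop in sea-surface height across ~100 km gives a surface speed near 1.2 m/s, which is indeed in the right range for the Stream.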
With reference to the above issues, I need help understanding the implications of this paper by Sijp & England in this month’s Journal of Climate, and of the second paper on isopycnal mixing by the same authors & Bates, same journal: http://ams.allenpress.com/perlserv/?request=get-abstract&doi=10.1175%2FJCLI3909.1
In particular, there have been discussions recently on changes in mixing patterns in the Southern Hemisphere, in addition to the above.
I know this is about current rather than historic conditions, but it may also have implications for our understanding of the modelled historic patterns; the second paper in particular, with reference to hypotheses such as Broecker et al.’s freshwater ‘hosing’ as a causal factor in circulation phase shifts.
Hoping this makes sense…
Re 21: I cannot help interjecting that the Cane-Zebiak model, in which the temperature of the upwelling water in the Eastern Tropical Pacific is held fixed, is biased towards creating La Nina conditions as the tropics warm. Studies with full atmosphere-ocean models do not provide much support for this picture. (Of course, the CZ model has been invaluable in studies of El Nino dynamics and predictability — on time scales for which it is reasonable to assume that the temperature of upwelling water in the East is fixed.) In most GCMs, the Western US dries in the 21st century despite, if anything, a tendency towards El-Nino-ish conditions in the Pacific.
[Response: Thanks for dropping by Isaac. As always, your comments are much appreciated here. I don’t entirely disagree. The limitations of the CZ model in this context are well known and accepted, and discussed quite openly in the manuscript I referred to. However, whether or not coupled models reproduce the behavior in question (i.e., the tropical Pacific ocean thermostat mechanism of Clement, Cane, and colleagues) may or may not constitute evidence against that mechanism. If you look back in the RealClimate archives, you’ll see that there has been extensive discussion of this previously here. Cane and colleagues have questioned with some justification how well the Bjerknes feedbacks are actually represented in many of the coupled models that have been used in climate change experiments. Collins et al have shown that the coupled models span the full range of possibilities, with some indeed appearing to show signs of the ocean thermostat mechanism. Matt Collins has provided a thoughtful discussion here. On longer timescales, the thermostat mechanism may be less viable, as extratropical feedbacks such as subduction of extratropical water masses influencing the equatorial thermocline are likely to come into play. Arguably, such influences may undermine the efficacy of the mechanism in the context of anthropogenic climate change, or even the response to centennial-timescale natural (e.g. solar) forcing. On interannual timescales, relevant, for example, to the response to explosive tropical volcanism, and where these decadal-scale feedbacks are less relevant, nature appears to have decided in favor of the mechanism. At least, that is the conclusion now of a number of peer-reviewed studies, including this latest one by Emile-Geay et al (which is a quasi-independent followup to the Mann et al J. Climate article). -mike]
Since I am already here, let me also add Re 22: The Gulf Stream is very geostrophic. This has little to do with how it is forced but with how strong (or rather how weak) it is. The nonlinearity of importance for the separation from the coast is primarily nonlinearity in the vorticity balance, not in the momentum balance. That these two things are different is the essence of the theories that we use to analyze large scale atmospheric and oceanic flows.
[Response: I wonder if this is more than a semantic disagreement? The centripetal acceleration terms in the horizontal momentum equations (which, agreed, are distinct from the advective terms) are necessary to support inertial oscillations, whether one chooses to use the u and v equations alone, or form a vorticity equation from them. I was categorizing these terms in the momentum equations as “non-linear”, though one might instead choose to call them “time-dependent” terms. For a pure rotating flow, the time-dependent and centripetal acceleration terms in the momentum equations are one and the same. Alternatively, one can view this in terms of the Rossby Number(Ro). I believe we both agree that Ro for western boundary currents is not zero, but that it is typically small. In this sense, one can describe the currents as typically in geostrophic balance. That was the point I was making to Rasmus. On the other hand, Ro can locally exceed unity within the western boundary currents under some circumstances. Clearly in such a situation, geostrophic balance fails at least locally. Any elucidation of possible remaining points of disagreement appreciated. -mike]
[Response: Put differently, when there is significant curvature in the velocity field (for either oceanic or atmospheric motions), we know that the centripetal acceleration terms in the momentum equations become important, and the momentum balance is strictly speaking not geostrophic, but instead a gradient wind balance. When inertial oscillations are present in the Gulf Stream, then technically speaking one must use the gradient wind and not geostrophic calculations, to get the right answer. The geostrophic calculations will yield a correct direction, but the magnitude will be in error by some amount. Hopefully, that clarifies the point of contention? -mike]
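The Rossby-number bookkeeping in this exchange is easy to make concrete. With assumed, typical broad-scale values (U ~ 1 m/s across a ~100 km wide current), Ro comes out small, so the flow is nearly geostrophic; the same arithmetic for a tight meander pushes Ro past one:

```python
import math

# Rossby number Ro = U / (f * L): the ratio of advective to Coriolis
# accelerations. Scales below are assumed, typical Gulf Stream values.
Omega = 7.2921e-5                                # rad/s, Earth's rotation
f = 2 * Omega * math.sin(math.radians(35.0))     # Coriolis parameter at 35N

U, L = 1.0, 100e3        # m/s and m: broad-scale current speed and width
Ro = U / (f * L)         # small, so the balance is nearly geostrophic

# In a tight meander (L ~ 10 km, U ~ 2 m/s) the same arithmetic gives Ro > 1,
# and geostrophic balance fails locally.
Ro_meander = 2.0 / (f * 10e3)
```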
Mike — I disagree with your claim in response to comment #14 that a man-made CO2 drop caused by the American pandemic would not have had a significant influence on the cooling into the so-called ‘little ice age’.
—The average Northern Hemisphere cooling from the ‘warm’ centuries (1000-1200) to the LIA (1400-1800) was only ~0.2C in your 1998 (GRL) reconstruction. Some reconstructions show larger N Hem changes, but they tend to over-represent the more reactive high latitudes. A full-global reconstruction will probably reduce this amplitude, as suggested by your preliminary attempt (Rev. of Geophys., 2002), which estimated a global cooling of 0.1C or less.
—A correlative CO2 drop of 6-7 ppm during this same interval is now confirmed in three well-dated ice cores (Law Dome, Dronning Maud Land, and South Pole). This drop was sustained for centuries, so it should have caused an ‘equilibrium’ global cooling of ~0.08C. You cited model results from Gerber et al (2003) that this CO2 drop was natural (solar-volcanic) in origin, but I noted in 2003 (Climatic Change) that the Gerber model cannot explain the full CO2 drop without violating the small size of the N Hem cooling. If you are right that the global cooling was less than 0.1C, the Gerber model will give a CO2 drop of less than 1.2 ppm, a small fraction of the actual 6-7 ppm fall.
—Estimates of the sequestration effect of pandemic-driven reforestation of the Americas are converging on 10-12 GtC. I have estimated how this amount of carbon removal would have affected atmospheric CO2 by using the smoothing function from the Joos et al. carbon model published in GBC 2004. The Black Death and American pandemics account for a CO2 drop of more than 4 ppm, and deep-ocean cooling over many centuries can only have boosted this amount by increasing CO2 solubility. So — pandemics can plausibly account for at least 2/3 of the 6-7 ppm CO2 drop, almost half of the N Hem cooling, and well over half of the global cooling into the LIA.
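Bill’s ~0.08 C equilibrium figure can be checked with the standard simplified CO2 forcing expression, dF = 5.35 ln(C/C0) W/m^2, times a climate sensitivity. Both the mid-range 6.5 ppm drop and the 0.6 K per W/m^2 sensitivity below are assumed round numbers for this back-of-envelope sketch:

```python
import math

# Simplified CO2 forcing, dF = 5.35 * ln(C/C0) W/m^2, times an assumed
# equilibrium sensitivity of 0.6 K per W/m^2. The 6.5 ppm drop is the
# mid-range of the 6-7 ppm figure discussed above.
C0 = 280.0               # ppm, pre-industrial baseline
C = C0 - 6.5             # ppm, after the drop
sensitivity = 0.6        # K / (W/m^2), assumed

dF = 5.35 * math.log(C / C0)    # radiative forcing change, W/m^2 (negative)
dT = sensitivity * dF           # equilibrium temperature change, K
```

The result is a forcing of roughly -0.13 W/m^2 and a cooling of just under 0.1 K, consistent with the ~0.08 C figure quoted.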
[Response: Bill, thanks for the comment. I should have been a bit more careful in how I stated this. The associated forcing is small compared to some external forcings (e.g. volcanic) on multidecadal timescales, at least based on e.g. Crowley (2000) forcing estimates. The main problem is that it is difficult to detect small signals in the reconstructions due to the large uncertainties therein. And while some reconstructions may show greater variability, it is difficult to tell in many cases if this is real, or due to sampling biases such as you suggest. The main point of my comment was that there are at least two schools of thought on this, and I myself am reserving judgement as to which is correct (indeed, it seems reasonable to suspect that there is some truth to both). And let me very clearly state that I believe you’ve had a very important impact on this discussion, and have made a credible argument for the possibility that at least some of the pre-industrial CO2 changes are associated with anthropogenic influences on the environment. -mike]
A lower solar activity cools the stratosphere.
A consequence of this cooling is a more negative NAO.
And it induces a weaker MOC (or GS).
Ok, but now, with GHG effect, there is also a stratospheric cooling.
So my question is:
Are these coolings equivalent and, if this is the case, can we get this effect, plus the refreshening of the waters, to moderate the CW in North Atlantic?
[Response: The impact on the NAO is from the gradients of temperature in the lower stratosphere. With GHGs you get stratospheric cooling and tropical tropospheric warming (increased gradient => increased NAO), while in the decreased solar case the stratospheric cooling is mostly at low latitudes and so the gradient decreases (=> decreased NAO). At least, that is what the model suggests (though there is some observed data to back that up). – gavin]
Thanks for the comments, they were both enlightening and confusing. Obviously, the models are very complex especially in tropical regions where the ocean and atmosphere are closely coupled. These two papers based on observations may have some bearing on the issue.
(1) “Anomaly of heat content in the northern Atlantic in the last 7 years: Is the ocean warming or cooling?”
V. O. Ivchenko, N. C. Wells, and D. L. Aleynik
” Whether the North Atlantic Ocean is warming or cooling is an important question both in physical oceanography and climate change. The Argo profiling buoys provide an accurate and stable instrument for determining the tendencies in heat content from the surface to 2000 m from 1999 to 2005. To calculate temperature and heat content anomalies two reference climatologies are used. These are the well known WOA2001 climatology (Stephens et al., 2002), and a new WOCE Global Hydrographic climatology (Gouretski and Koltermann, 2004). The former climatology is used for our main results, and the latter is used for evaluating the sensitivity of our results to the climatology. Our scheme allows us to estimate the anomaly of heat content (AHC) in the North Atlantic and its smaller sub-domains (i.e. 10° boxes) for the period 1999-2005. We have found a dipole structure in the time averaged AHC: negative values are concentrated in the southern and middle latitudes of the North Atlantic whilst positive values are found north of 50°N. The upper 1500 m of the North Atlantic is warming throughout the period 1999 to 2005.”
There is a news report on this paper here (SMH, au). The reported warming is 0.015 C, but that corresponds to an 8 C increase in the atmosphere above, so it’s a significant amount of energy. This report also includes some quotes from James Lovelock about thermal stratification of the North Atlantic leading to a nutrient-starved surface ecosystem – but I thought that the winter storms that deepen the mixed layer were the primary factor behind the spring plankton bloom in the N. Atlantic, so Lovelock seems to be overstating things a little...?
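A quick back-of-envelope check of that “0.015 C of ocean ~ 8 C of atmosphere” equivalence, comparing column heat capacities per square metre with round textbook values:

```python
# Column heat capacity (J per m^2 per K) of the upper 1500 m of ocean
# versus the whole atmosphere; all values are round textbook numbers.
rho_sw, cp_sw, depth = 1025.0, 3990.0, 1500.0   # kg/m^3, J/(kg K), m
C_ocean = rho_sw * cp_sw * depth                 # heat capacity of the ocean column

m_atm, cp_air = 1.0e4, 1004.0                    # kg/m^2 air column, J/(kg K)
C_atm = m_atm * cp_air                           # heat capacity of the air column

# Warming 1500 m of ocean by 0.015 C stores as much heat per unit area as
# warming the entire overlying atmosphere by roughly this much:
dT_atm_equiv = 0.015 * C_ocean / C_atm
```

The ratio of the two column heat capacities is several hundred, so the 0.015 C comes out to roughly 9 C of atmospheric warming, the same ballpark as the 8 C quoted in the report.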
So, this seems to say that the North Atlantic is warming even though the Gulf Stream is slowing. Is the atmospheric heat transport the dominant factor? Is the Arctic itself warming due to the observed 20% decrease in persistent Arctic sea ice, and is that having a warming influence on the North Atlantic? On the one hand, Britain may not have to worry about freezing, but doesn’t this also point towards accelerated melting of the Greenland Ice Sheet, which is reflected in the freshening and weakening of North Atlantic Deep Water formation?
If I understand the above comments on the El Nino/ La Nina response of the tropical oceans, then regardless of which way it swings the Western US will be drier. I don’t understand why a coupled ocean-atmosphere GCM that predicted more El Nino-like conditions wouldn’t also predict wetter conditions in the Southwest, but maybe the El Nino effects would be more closely confined to coastal areas – or maybe the conditions will be more like a persistent La Nina. It seems that the answer will have significant effects on the future of Southwestern water supplies, in any case.
Here is something interesting (well, I found it interesting) that looks at Pollack and Huang’s recent Australian borehole reconstruction to say something about the correction of borehole proxies for surface conditions, and the global extent and depth of the Little Ice Age. Maybe.
Sorry about changing the subject but I really have to reply to a comment in a closed thread re: How much CO2 is too much #222 But more and more I see in the comments little teasers about how you ought to be doing more in the way of economics. Of course, in general that is not the plan. But also, economicis is heavily contaminated with ideology.
“Economicis?” No! you’re thinking of EconomiTis :). Care must be taken to separate what is economic science and what is economic ideology, just as it is important to separate what is environmental science and ideology. Until there is as good a forum as RC to discuss economic environmental details, I will point out discussions here that demonstrate economic illiteracy with regards to costs of GW and costs of reducing GHG’s.
Your task, it would seem, is difficult enough with quite a raft of deniers still staging attacks on the rear. So it is this writer’s hope that you will ignore such economic grumbles. That way lies the Slough of Despair.
(Comment by garhane)
Just because some deniers have economic backgrounds doesn’t mean that other economist minded people will not attack their assumptions. People in the environment movement must become more economically literate if they are to make wise policy recommendations!
By the way, I think this site is great, and the moderators are doing an excellent job moderating. I still don’t seem to have an answer to my question about a consensus view amongst scientists, in layman’s terms, regarding predictions arising from LIA/Gulf Stream research.
When you speak of a “CO2 drop of 6-7 ppm during the same interval”, do you allude to 1400-1800? If so, as CO2’s lifetime in the atmosphere is approx. 100-150 yrs, shouldn’t we estimate the equilibrium response of the climate for a 1.5-2 ppm fall? I probably miss the point, but it seems you add up the CO2 decline over 4 centuries and then obtain an 1800s sensitivity that includes the very first effects of the 1400s drop.
Another way to put the same question: what is the maximum amplitude of temperatures within the 1400-1800 period, and are these century-scale trends accounted for by some 1.5-2 ppm CO2 drops?
You might want to consider what the error bars are on pre-1950 measurements and estimates of atmospheric CO2. I don’t know what the answer to that question is, but maybe +/- 1%? The Nature web site http://www.nature.com/nature/focus/icecores/index.html has lots of good links on this topic. The Greenland ice cores (GRIP, GRIP2 and NGRIP) also show good agreement on the oxygen isotope record (RE#8, etc.). The coral record from the tropics is also of interest (see mike’s response to my #20, as well as here). The conclusion seems to be that the tropical Pacific is an important driver of global climate, and is sensitive to periods of volcanic activity.
It seems to me that the take home message from the medieval megadrought and little ice age periods is that the climate is fairly sensitive to relatively small forcings; thus we should expect a strong response to the ongoing forcing of anthropogenic greenhouse gases.
“There is a close correlation between Antarctic temperature and atmospheric concentrations of CO2 (Barnola et al. 1987). The extension of the Vostok CO2 record shows that the main trends of CO2 are similar for each glacial cycle. Major transitions from the lowest to the highest values are associated with glacial-interglacial transitions. During these transitions, the atmospheric concentrations of CO2 rises from 180 to 280-300 ppmv (Petit et al. 1999). The extension of the Vostok CO2 record shows the present-day levels of CO2 are unprecedented during the past 420 kyr…”
It’s also worth looking at the Keeling Curve; this figure ends at 367 ppm (2000) while current measurements are at 380 ppm. Increase rates over the past 5 years have been at 2.5%, compared to the pre-2000 rate of 1%.
RE#30, the main argument against ‘economic science’ (particularly the Chicago school of thought) is that consumers aren’t really rational actors who make choices based on the best information available. The $6 billion/year PR industry is bent on deceiving consumers – and if the foundation is rotten, the whole edifice comes tumbling down. Here is a very interesting blog on the issue. For an analysis of the economic benefits of a robust renewable energy industry, see here; for the economic damage of global warming, see The Stern Report.
Re #25: Bill Ruddiman, I also have a question. You claim that pandemic-driven reforestation reduced CO2 by 6-7 ppm, which caused an ‘equilibrium’ global cooling of ~0.08C. But does this take into account the fact that temperate forests have a lower albedo than cropland, especially in winter? I have seen claims that the change in albedo more than compensates for the decrease in carbon dioxide, which would suggest that the reforestation would lead to warming rather than cooling.
>pandemic-driven reforestation of the Americas
That’s not modern bare-dirt-all-winter agricultural farmland that was reverting to forest, remember. Bare dirt gets cold fast and holds snow.
“…. periodic low-intensity fires would have maintained high levels of vegetable and animal food production: vegetable food production for humans and wildlife would have been 20-100% greater than in unburned areas and large mammal and bird production would have been 100-400% greater. ” http://www.daviesand.com/Papers/Tree_Crops/Indian_Agroforestry/
Properly burned prairieland and burned forest are done often and carefully — what’s left behind isn’t toasted dirt, it’s quite a bit of duff and most of the native plants — — the prairie grasses thrive; forests get cleared out between the big trees.
“… for it being the custom of the Indians to burn the wood in November, when the grass is withered, and leaves dried, it consumes all the Underwood, and rubbish, which otherwise would overgrow the country, making it unpassable, and spoil their much affected hunting . . .
“…In 1607 Captain Gilbert described the trees at a point on the Maine coast–probably Point Elizabeth south of Casco Bay. They were “the most part of them ocke and wallnutt (hickory) growing in a great space assoonder on from the other as in our parks in Ingland and no thickett growing under them” . Richmond Island nearby had “fine oaks and nut trees with cleared land and abundance of vines which in their season bear fine grapes.”  ”
[same source, link above]
I don’t think you can assume an albedo change resulting from native American fire practices. Even modern Forest Service ‘prescribed burns’ — they’re trying to reduce the huge excess of fuel Smokey Bear has caused — are usually bigger and hotter and clear more land bare than good routine practice would do.
“Agriculture originated in North America about 10,000 years ago; about the same time it had in the Middle East (Smith 1989). By 1500, tens of millions of acres were cleared for crops. Native peoples everywhere in North America also set fire to hundreds of millions of acres on a regular basis to improve game habitat, facilitate travel, reduce insect pests, remove cover for potential enemies, enhance conditions for berries, drive game, and for other purposes.
“Vast areas of the North American forest landscape in both the West and East were, at the time of European contact, open, park-like stands shaped by short-interval, low intensity fires, often set purposely by humans. In New England, Indians burned the woods twice a year, in the spring and fall. Roger Williams wrote that “this burning of the Wood to them they count a Benefit, both for destroying of vermin, and keeping downe the Weeds and thickets” (Cronon 1985). John Smith commented that in the forests around Jamestown in Virginia “a man may gallop a horse amonst these woods any waie, but where the creeks and Rivers shall hinder” (Williams 1989). Andrew White, on an expedition along the Potomac in 1633, observed that the forest “is not choked with an undergrowth of brambles and bushes, but as if layed out in a manner so open, that you might freely drive a four horse chariot in the midst of the trees” (Williams 1989).”
“So why, in the light of all this evidence, do we continue to cling to the image of the forest primeval? … there was a particularly insightful essay on this subject by M.J. Bowden (1992).
“Bowden writes that popular images are created and perpetuated because they serve the ends of particular groups and opinion leaders. The image of the pristine forest has endured for 300 years or more because it has been useful to a variety of opinion leaders — from the Pilgrim Fathers of 17th century New England to the modern environmental movement.
“Bowden writes that:
“…’The grand invented tradition of American nature as a whole is the pristine wilderness, a succession of imagined environments which have been conceived as far more difficult for settlers to conquer than they were in reality….The ignoble savage, non-agricultural and barely human, was invented to justify dispossession…and to prove that the Indian had no part in transforming America from Wilderness to Garden.'”
“… Old World diseases. By the early 1500s, such diseases were introduced on both coasts, as well as the interior. Dobyns (1983) estimates that native populations collapsed from perhaps 18 millions in 1500 to less than 1 million in 1800, when the first waves of European expansion finally moved west of the Appalachians.
“In 1500, significant portions of the Midwest, Southeast and Atlantic coastal areas were home to highly structured, agricultural societies having high population densities and landscapes which were heavily cleared for cropland (Dobyns 1983; Denevan 1992). In the Midwest and Southeast, these people constructed extensive earthworks, mounds, large earthen pyramids, temples, and extensive areas of ridged agricultural fields (Doolittle 1992). They had a hierarchical social structure similar to that of the Aztecs and Incas.
“While we will never know fully the extent of forest clearing by these people, some indication can be gained from the writings of a Spanish chronicler on the 1539-43 de Soto expedition. For four years, de Soto and his men pillaged, plundered, and inadvertently spread diseases from Florida north across the Appalachians, west to the Mississippi, thence down to the Gulf of Mexico (Thomas 1993). Of Indian agricultural fields in Florida, he wrote that they:
‘…marched on through some great fields of corn, beans, and squash and other vegetables which had been sown on both sides of the road and were spread out as far as the eye could see across two leagues of plain’ (Doolittle 1992).
“Dobyns (1983) has estimated that this single field covered about 16 square miles. These were no small family garden plots!
“The first waves of native depopulation from smallpox occurred shortly after 1500, even before de Soto, and were followed by successive waves as new diseases were introduced and took their horrible toll. This holocaust, which encompassed all of the Americas, took place largely out of sight of Europeans. By 1800, native populations were a shadow of their former numbers, and their social structure had been substantially disrupted (Dobyns 1983; Cronon 1985; Thomas 1993). Landscapes cleared for agriculture had 2-3 centuries to reforest before the first waves of permanent European/American emigrants poured through the Appalachian “gaps” to find landscapes that were more “pristine” than they had been in more than a thousand years (Denevan 1992).”
Re #34: >I have seen claims that the change in albedo more than compensates for the decrease in carbon dioxide, which would suggest that the reforestation would lead to warming rather than cooling.
The temperature of vegetation follows air temperature, the reason being that photosynthesis uses solar energy for growth rather than for heating up the leaves. This simple fact becomes very evident when viewing thermal imagery of vegetated areas over a period of 24 hours. Hence, reforestation does not lead to additional heating as long as the vegetation is alive.
Comment by Jan Sjoerd De Vries — 6 Dec 2006 @ 12:41 PM
Re #37: The photosynthesis reaction is less than 30% efficient, and it only uses less than half the available sunlight (wavelengths of 400-700 nm). I do not think it has much effect on warming. On the other hand, transpiration increases water vapor levels, which could increase cloud cover, thus raising overall albedo. Also, according to this Science paper, “the forest provides an aerosol population of 1000 to 2000 particles of climatically active sizes per cubic centimeter.”
These effects may more than compensate for the reduced surface albedo of forest, but I have never seen any study that quantifies this.
Photosynthesis is a very minor factor. In very rapidly growing species about 1% of the shortwave solar energy is captured. For most plants, it is much less.
The reason why thermal imagery of well vegetated areas comes up cooler is because of water evaporating from the plants, which can cool down an area by over 10C under the right conditions, such as in a rainforest.
Re 32: the main argument against ‘economic science’ (particularly the Chicago school of thought) is that consumers aren’t really rational actors who make choices based on the best information available.
The main argument against ‘climate science’ is that individual weather elements aren’t really predictably acting the way the models assume. Just as in climate science, models should be judged on their power to accurately predict results, not on whether their axioms involve the approximations of behaviour required to actually make calculations.
The $6 billion/year PR industry is bent on deceiving consumers -and if the foundation is rotten, the whole edifice comes tumbling down.
Consumers can be deceived into both excessively alarmist and excessively skeptic views. The approximations of behaviour you are talking about still work in models to give the most accurate predictions possible. Many economists are incorporating closer approximations of actual behaviours into complex models, which may improve predictions further.
for the economic damage of global warming see The Stern Report.
The Stern report is commendable mainly for using best practice in economic analysis. Strange, because in your first paragraph you seem to have disrespected good economic analysis. My criticism of the Stern report is that it hasn’t worked within best practices for climate science. It has taken one of the “coulds” (global warming could increase the overall occurrence of extreme weather events such as hurricanes and droughts) and assumed it to be an accurate consensus scientific prediction. A large proportion of the costs mentioned in its *baseline* cost estimate assume this. The assertion that warming could increase extreme events by a lot is used to make its baseline prediction seem like the most probable outcome. It has also seemed to discount possible savings from less severe winters and frosts.
“possible savings [due to] less severe winters” Due to a series of ‘less severe’ winters, just north of here the bark beetles have ravaged millions of hectares of forest. Perhaps this year it will be cold enough to kill many of the bark beetles…
Western Coal’s PR “co2 is life” advocates will probably claim that a benevolent nature, grateful that the imprisoned fossil coal is being freed and converted into more highly nutritional carbon dioxide, is now removing the Northern boreal forests to prepare them as wheatfields to reward humanity’s efforts:
I’d like to amplify Isaac Held’s point of a few days ago. While it does seem that the Medieval megadroughts over the American West were associated with – i.e. driven by – a series of remarkably persistent La Nina-like states (regardless of what caused them), the US Southwest also dries in future climate projections even though models disagree on whether the tropical Pacific becomes more La Nina-like or El Nino-like. I think the dynamics of future subtropical drying are distinct from those associated with past historical or Medieval droughts. No doubt similar interactions between eddies and the mean flow are involved in all, but the ‘trigger’ is different. For the future the trigger seems to be nothing more than uniform warming, which is capable of causing, by various processes, subtropical drying and high-latitude moistening.
[Response:Thanks for your comment Richard. No disagreement here at all. In fact, there was some discussion along precisely these lines in the previous RC thread which I linked to. – mike]
“Between 2000 and 2005, emissions grew four times faster than in the preceding 10 years, according to researchers at the Global Carbon Project, a consortium of international researchers. Global growth rates were 0.8% from 1990 to 1999. From 2000 to 2005, they reached 3.2%.” The graph from that study is here (a different news report on the same story listed the 2.5% and 1% numbers). I could not find any mention of the study methods, however (i.e. how do they treat deforestation? Are they strictly looking at fossil fuel combustion?).
The relationship between anthropogenic CO2 emissions and the atmospheric concentration of CO2 is tricky, since the fossil-fuel CO2 is feeding into the natural carbon cycle which has many sources and sinks of both chemical and biological natures. Whether or not the increasing rate of anthropogenic CO2 emissions can by itself account for the observed increases in atmospheric CO2 content seems like an important question.
The Mauna Loa Observatory record (at wikipedia) is here
1995 levels were about 358 ppm. That’s an increase of 1.2 ppm/year from 1990 to 1995
2000 levels were about 368 ppm; that’s an increase of 2.0 ppm/year from 1995-2000
2005 levels were around 379 ppm; that’s an increase of 2.2 ppm/year from 2000-2005 (as Grant points out in #33)
Warning: I got those numbers from holding a piece of paper up to the computer screen (the Mauna Loa graph, again). Still, the rate of increase of atmospheric CO2 is itself increasing, which seems to go hand-in-hand with the increased rate of anthropogenic CO2 emissions. That seems like good cause for alarm.
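For what it’s worth, the paper-against-the-screen arithmetic above can be written out explicitly. The ppm values below are my rough graph readings (the 1990 figure is inferred from the 1.2 ppm/year rate quoted above), not official annual means:

```python
# Rough Mauna Loa CO2 readings (ppm), eyeballed from the graph;
# 1990 is back-calculated from the quoted 1.2 ppm/year rate.
levels = {1990: 352, 1995: 358, 2000: 368, 2005: 379}

years = sorted(levels)
for y0, y1 in zip(years, years[1:]):
    rate = (levels[y1] - levels[y0]) / (y1 - y0)
    print(f"{y0}-{y1}: {rate:.1f} ppm/year")
# 1990-1995: 1.2, 1995-2000: 2.0, 2000-2005: 2.2 ppm/year
```

Each successive five-year rate is larger than the last, which is the acceleration being pointed out.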
Anyway, thanks to Grant for pointing out my error.
RE#40, you say “The main argument against ‘climate science’ is that individual weather elements aren’t really predictably acting the way the models assume.” This issue of weather-vs-climate has been extensively discussed on realclimate; see for example short-and-simple-arguments-for-why-climate-can-be-predicted. (Note that physics relies on probabilistic arguments for very accurate calculations of all sorts, from quantum mechanics to statistical mechanics – exact determinism is a 19th century concept!) Regarding your calls for a ‘consensus view amongst scientists in laymans terms’, that’s exactly what the IPCC was put together for.
Regarding comments #31 and #34-36, on my comment #25:
Re #31 (Charles Muller): As I stated in comment #25, the CO2 drop OBSERVED in 3 ice cores from 1100-1300 (averaged) to 1600-1800 (averaged) is 6-7 ppm. That number can’t be ‘smoothed down’ to 1.5-2 ppm; it is what it is.
—Any changes sustained for over a century should have reached thermal equilibrium with the climate system in the sense Jim Hansen used in the late 1980’s and early 1990’s, with equilibrium times measured in decades.
—My argument is that the major reforestation that occurred in the century following the Black Death (1350-1450) was soon followed by even more massive reforestation from 1492 into the 1700’s during the American pandemic.
—The often-quoted CO2 lifetime in the atmosphere of 100-150 years is actually a hybrid of two very different responses to imposed changes: a very quick response for the first 50 years, followed by a very slow response for thousands of years afterward. The Joos et al. model I used to create the smoothing function incorporates all of this.
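This two-timescale behavior is often summarized as a sum of exponentials. A minimal sketch, using illustrative coefficients in the spirit of Bern-type carbon-cycle fits (these are not the exact parameters of the Joos et al. model referred to above):

```python
import math

# Illustrative multi-exponential impulse response, in the spirit of
# Bern-type carbon-cycle fits; coefficients are for illustration only.
# Each (amplitude, timescale-in-years) pair; None = effectively permanent.
A = [(0.217, None), (0.259, 172.9), (0.338, 18.51), (0.186, 1.186)]

def airborne_fraction(t_years):
    """Fraction of a CO2 pulse remaining in the atmosphere after t years."""
    total = 0.0
    for a, tau in A:
        total += a if tau is None else a * math.exp(-t_years / tau)
    return total

# Fast initial drawdown, then a very long tail:
for t in (0, 50, 500):
    print(t, round(airborne_fraction(t), 2))
```

The airborne fraction drops quickly over the first half-century, then decays only very slowly for centuries, which is why a single “100-150 year lifetime” can mislead.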
Re #34 (Blair Dowden): My calculations were meant to estimate only the effect of greenhouse gases. As I understand land-use studies, deforestation causes a net winter cooling at middle-high latitudes by creating surfaces with higher (snow) albedo, but it also causes a net summer warming at low latitudes by reducing evapotranspiration and baking dry, deforested soils.
Most pre-pandemic forest clearance (including the Americas) occurred at low latitudes, because that is where most humans have always lived. So — REforestation of the tropics should have caused a net summer cooling and thereby added to the cooling imposed by falling greenhouse gases. That would have increased the anthropogenic contribution.
Re #35 and #36 (Hank Roberts): Most of the information provided here agrees with the sources I have found. One point to add — only ~7 million out of ~50 million Native Americans lived in the U.S. and Canada, so a U.S.-centric view is only marginally relevant to this story.
Thanks for some interesting and well-based comments on my comment.
you say “The main argument against ‘climate science’ is that individual weather elements aren’t really predictably acting the way the models assume.” This issue of weather-vs-climate has been extensively discussed on realclimate
Yes, I agree, but I was contrasting your point on economic science against your point on climate science. You disrespect economists because their models assume predictable elements (humans assumed to have certain behaviours) but don’t see that as analogous to climate science models assuming predictable elements (weather, as in temperature and wind speed, having certain behaviours). The truth is (and believe me, I’ve thoroughly read many of those weather vs climate discussions) that scientific models pass or fail based on their predictive powers in real situations, not on whether their assumptions are strictly what really happens case by case (human by human, or weather event by weather event).
(Note that physics relies on probabilistic arguments for very accurate calculations of all sorts, from quantum mechanics to statistical mechanics
Both economic and climate science rely on probabilistic arguments for very accurate calculations. Again it is the reliability of the calculations matching measured numbers into the future unknown which is important.
Regarding your calls for a ‘consensus view amongst scientists in laymans terms’, that’s exactly what the IPCC was put together for.
Here I was trying to get the thread back to what it was originally about (Gulf Stream and LIA connection). I cannot find any mention of it in the IPCC as such (maybe I’m not looking hard enough), so I will assert this is their consensus:
It is quite probable that disruptions in sea currents such as the Gulf Stream can cause quite significant and abrupt *regional* changes in temperature (both up and down). It is not quite clear at this stage exactly what causes this and when to expect future disruptions. It will also presumably change the “temperature signature” caused by AGW. How this could affect global average indicators is uncertain, but it is unlikely to nullify current predictions of rising temperatures and sea levels.
#45 Thank you very much Bill for reply and explanations.
I posted another comment on this subject, but the recent crash made it disappear. It deals with the equivalence of a 6-7 ppm CO2 drop to 0.08 °C.
In a recent article on RC, Gavin told us CO2 accounts for approx. 30-40% of modern global warming. So, from the pre-industrial period, we have 100 ppm CO2 > 0.4*0.8 °C = 0.3 °C. That’s the transient sensitivity. For equilibrium sensitivity, assuming 0.6 °C in the pipeline (Hansen 2005, but see Lyman 2006), we have 0.3*1.4 °C = 0.4 °C. With such values, and with 6-7 ppm CO2, we would get no more than 0.03 °C. Should we use a much higher climate sensitivity?
Don’t forget that the impact of increasing CO2 is logarithmic. I get this back-of-the-envelope: CO2 drops from 280 to 273 (a drop of 7 ppmv); log(280/273) / log(2) = 0.0365 = logarithmic fraction of a doubling (or in this case, halving) of CO2. Multiply this by the climate sensitivity to 2xCO2 to get the equilibrium response. Using a sensitivity of 3 deg.C gives me a response of 0.11 deg.C — Ruddiman’s 0.08 would imply a sensitivity of only about 2.2 deg.C/2xCO2
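Spelling out that back-of-the-envelope calculation (the sensitivity values are illustrative, and the purely logarithmic forcing-response relation is the usual simplifying assumption):

```python
import math

def equilibrium_response(c_new, c_old, sensitivity):
    """Equilibrium temperature change (deg C) for a CO2 change, assuming
    forcing scales with the logarithm of concentration."""
    return sensitivity * math.log(c_new / c_old, 2)

frac = math.log(280 / 273, 2)               # ~0.0365 of a doubling
dT = equilibrium_response(273, 280, 3.0)    # ~ -0.11 deg C at 3 C per 2xCO2
implied = 0.08 / frac                       # ~2.2 C per 2xCO2 to explain 0.08 C
```

The 2.2 figure simply inverts the same relation: a 0.08 C equilibrium cooling for 0.0365 of a halving implies a sensitivity of about 0.08/0.0365 per doubling.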
Re #42: The graphic does not do justice to Eastern Washington generally and the Palouse in particular. As the soils of the Palouse are glacial loess, the wheat yields are higher than anywhere else in the U.S. (Indeed, due to continued plant breeding programs, higher than anywhere, including the Ukraine, which is similarly blessed with soils but not with plant breeders.)
The Hadley Centre outlook for the Pacific Northwest calls for somewhat more precipitation combined with noticeably longer dry seasons. This means that the wheat breeders here are already attempting to devise new varieties which will yield well under the new conditions. This will be, they state, very difficult. Establishing a new variety usable as a crop now requires 8 to 15 years with a stable climate. So they appear to anticipate needing quite a bit longer while the climate is changing. Other than that, they appear to be rather optimistic regarding their chances for success.
So I opine that the Palouse will continue to grow important quantities of dry land, winter wheat. But the yields may have to go down as the wheat breeders produce varieties which put more energy into winter survival, resistance against new diseases and pests, etc.
Response #48 to comment #47 explains my estimate (#45) of a ~0.08C equilibrium temperature response to the 6-7 ppm CO2 drop caused by reforestation following the pandemics. I assumed a 2.5C global-mean sensitivity to a CO2 doubling, and then backed off that estimate by a bit.
I didn’t bother to mention that Ferretti et al. (Science, 9 Sept 2005) attributed the several-tens-of-ppb methane drop near/after 1500 to reduced biomass burning brought on by mortality during the American pandemic. That would add to the anthropogenic cooling effect. So my estimate was a minimum.
Re:41 “possible savings [due to] less severe winters” Due to a series of ‘less severe’ winters, just north of here the bark beetles have ravaged millions of hectares of forest. Perhaps this year it will be cold enough to kill many of the bark beetles…
and 42: Chuckle.
Western Coal’s PR “co2 is life” advocates will probably claim that a benevolent nature, grateful that the imprisoned fossil coal is being freed and converted into more highly nutritional carbon dioxide, is now removing the Northern boreal forests to prepare them as wheatfields to reward humanity’s efforts:
Thanks Hank, I always like your comments :). However, I would like to bring up how I believe this fits into the big picture. We must always mourn the loss of a habitat or species, especially when it is felt that it was “preventable”. However, economically speaking, change is both a threat and an opportunity in equal measure. This shouldn’t be seen as heartless economic thinking, but as the only way to prioritise action on things humans care about. It should be noted that, similarly, in an economy we should mourn businesses going bankrupt and people losing their jobs, but it is unrealistic to “save” them all, and counterproductive to prop up failing businesses or prevent redundancies. If people are willing to put in measurable quantities of money to “save” species or habitats, this could be what is used in economic models to decide the cost to humanity of losing those same things. It is wrong to give them “infinite” value because they can never again be what they were.
This shouldn’t be seen as heartless economic thinking, but as the only way to prioritise action on things humans care about.
I agree with Marco Parigi that objective measures are needed in order that rational prioritization occur. I do not agree that the “only way” to do this is known. Measuring things by market value has the virtue of simplicity, but it is not guaranteed as far as I know to yield any sort of long term optimum. Just because we want to think quantitatively (I agree here) doesn’t necessarily mean we must think financially.
I agree that assigning infinite value makes for a mathematically pathological problem. This is why the “precautionary principle” isn’t a practical guide to policy. Assigning market value to things, on the other hand, is mathematically tractable, but that doesn’t make it appropriate.
Perhaps it is exactly the mismatch between the time scales of the marketplace and of certain issues relating to the common good that makes those issues environmental issues rather than marketplace issues.
In any case, waving this issue away on the grounds that the only rational way to inform policy is through economics is circular reasoning. If “economics” wishes to claim that territory it must include matters other than the marketplace in its domain of discourse.
In any case, waving this issue away on the grounds that the only rational way to inform policy is through economics is circular reasoning. If “economics” wishes to claim that territory it must include matters other than the marketplace in its domain of discourse.
I am not attempting to wave away the issue, and I am not elevating economics to be a basis for *all* policy decisions. I am not sure what you mean about there being “other” possible numerical value systems useful in making priorities objectively. Call it a “human value quotient”, which we are trying to maximise for humanity – call the study of it the “science of human values”. The semantics will sound all new-age, life-spiritual and green, but the mathematical calculations would look oddly like those of money and economic science no matter what. Sure – things like future discounting have to be carefully addressed for such long time scales, but economics already deals with comparably long horizons, such as long-term demography and the time taken for some kinds of crops and trees to grow.
“a study called ‘Going to the Extremes,’ coming out in the December issue of the journal Climatic Change, researchers from the National Center for Atmospheric Research (NCAR) and Texas Tech University…
“… Just how drastic will the extremes be? The report doesn’t give a single answer, but plots out three alternate pictures of the future. The most optimistic of the scenarios assumes rapid introduction of clean, efficient technologies to reduce greenhouse-gas emissions. The least optimistic shows a future where we more or less muddle along as we are today, but with a much larger population consuming resources. The effects vary accordingly. The numbers defy simple translation into lay language, as they’re calculated in what statisticians call ‘standard deviations’ (two standard deviations being the amount of variation you would expect under normal conditions). When variability gets up to four or five standard deviations, that’s really big. But consider this: under the worst of the three scenarios, by the year 2099, the length of heat waves will increase by a whopping 12 standard deviations, which is statistician-speak for ‘holy cow!’ ‘The lengths are so much longer, it’s not even on the same scale as today,’ says Tebaldi.
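For a sense of why multi-sigma excursions impress statisticians, here is the one-sided exceedance probability under an idealized normal distribution (real weather statistics are only approximately normal, so this is purely illustrative):

```python
import math

def exceedance_prob(n_sigma):
    """P(X > n_sigma) for a standard normal variable."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

print(exceedance_prob(2))   # ~0.023: not unusual
print(exceedance_prob(5))   # ~2.9e-7: extraordinarily rare
```

By two standard deviations you still expect an exceedance a couple of percent of the time; by five, odds are already below one in a million, and the probability keeps collapsing super-exponentially from there.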
I do believe something of the sort is necessary. I am not confident it can be sensibly denominated in dollars. (Also, there is a nontrivial moral question of whether other life forms should have some standing in the calculation.)
The semantics will sound all new-age, life-spiritual and green, but the mathematical calculations would look oddly like those of money and economic science no matter what.
Maybe. I’m trying to think about this.
The biggest concern I have is that economics is growth oriented, and the planet is finite. Also, there is a fundamental difference between the time scales on which marketplace analysis is germane and the time scales on which environmental disruption occurs, a fact which I believe is finally being widely recognized. Finally, I have a concern which, as far as I know, is original to me: on long time scales and across alternative scenarios, the value of currency is ill-defined. This leads me to suspect that the long range objective function that we need might be incommensurable with money, and that therefore a great deal of mainstream economic thought is much less relevant to environmental issues than is widely believed.
The growth issue is a key. Optimizing for wealth is equivalent to optimizing for ever-increasing human activity. In a very real physical and biological sense, however, human impact on the planet must asymptote to a finite value on the longest time scales. It also seems likely that we have reached the point in human history where we must begin to grapple with this limitation. This raises the question whether wealth is a useful objective metric on long time scales. If it isn’t, I would think a great deal of economics needs to be reconsidered before it can be applicable to the sorts of long-range planning that are becoming necessary.
This leads me to suspect that the long range objective function that we need might be incommensurable with money, and that therefore a great deal of mainstream economic thought is much less relevant to environmental issues than is widely believed.
The biggest concern I have is that economics is growth oriented, and the planet is finite.
Start by changing the semantics, because at heart, economic scientists truly believe that money is just an objective representation of human values. If you say that the science of human values is concerned with the attainment of an overall increase in these values, you realise that it matters little whether the planet is finite, and you can have “value growth” even with a shrinking physical base. Human value scientists know this, and that is why the Stern report correctly states that gradual, but steadily more stringent, tax and carbon trade measures can be set up to cause virtually no “human value nullification” in the long run while considerably curtailing emissions and resource usage.
On your other point re: long time scales, “religious” values seem to do well on intergenerational timelines (societies with solid moral belief systems do better in the really long term). I think this is why the Alarmists’ message is so effective at “converting” people to the cause. “The end of the world is nigh” is standard religious fare, and although I think fear is a blunt instrument for getting people to do things that “actually” help, it certainly bumps up the average priority of the issue several notches. It also explains why mildly dissenting climate scientists get attacked by the orthodoxy (oops, I meant those signatory to the consensus) :)
There is a reasonable number of studies that, for example, compare the fortunes of religious societies with secular ones, or trace the predictable long-term effects of demographic changes (e.g. the baby boomers). The results of those, however, are still rarely used in conjunction with other long-term predictions (such as climate change).