What is the long term sensitivity to increasing CO2? What, indeed, does long term sensitivity even mean? Jim Hansen and some colleagues (not including me) have a preprint available that claims that it is around 6ºC based on paleo-climate evidence. Since that is significantly larger than the ‘standard’ climate sensitivity we’ve often talked about, it’s worth looking at in more detail.
We need to start with some definitions. Sensitivity is defined as the global mean surface temperature anomaly response to a doubling of CO2 with other boundary conditions staying the same. However, depending on what the boundary conditions include, you can get very different numbers. The standard definition (sometimes called the Charney sensitivity) assumes the land surface, ice sheets and atmospheric composition (chemistry and aerosols) stay the same. Hansen’s long term sensitivity (which might be better described as the Earth System sensitivity) allows all of these to vary and feed back on the temperature response. Indeed, one can imagine a whole range of different sensitivities that could be clearly defined by successively including additional feedbacks. The Earth System sensitivity may be the more appropriate measure because it determines the eventual consequences of any particular CO2 stabilization scenario.
Traditionally, the decision to include or exclude a feedback from consideration has been based on the relevant timescales and complexity. The faster a feedback is, the more usual it is to include it. Thus, changes in clouds (~hours) or in water vapour (~10 days) are undoubtedly fast and get included as feedbacks in all definitions of the sensitivity. But changes in vegetation (decades to centuries) or in ice sheets (decades(?) to centuries to millennia) are slower and are usually left out. But there are other fast feedbacks that don’t get included in the standard definition for complexity reasons – such as the changes in ozone or aerosols (dust and sulphates for instance), which are also affected by patterns of rainfall, water vapour, temperature, soil moisture, transport and clouds (etc.).
Not coincidentally, the Charney sensitivity corresponds exactly to the sensitivity one gets with a standard atmospheric GCM with a simple mixed-layer ocean, while the Earth System sensitivity would correspond to the response in a (as yet non-existent) model that included interactive components for the cryosphere, biosphere, ocean, atmospheric chemistry and aerosols. Intermediate sensitivities could however be assessed using the Earth System models that we do have.
In principle, many of these sensitivities can be deduced from paleo-climate records. What is required is a good enough estimate of the global temperature change and measures of the various forcings. However, there are a few twists in the tale. Firstly, getting ‘good enough’ estimates for global temperature changes is hard – this has been done well for the last century or so, reasonably for a few centuries earlier, and potentially well enough for the really big changes associated with the glacial-interglacial cycle. While sufficient accuracy in the last few centuries is a couple of tenths of a degree, this is unobtainable for the last glacial maximum or the Pliocene (3 million years ago). However, since the signal is much larger in the earlier periods (many degrees), the signal to noise ratio is similar.
Secondly, although many forcings can be derived from paleo-records (long-lived greenhouse gases from bubbles in the ice cores most notably), many cannot. The distribution of sulphate aerosols even today is somewhat uncertain, and at the last glacial maximum, almost completely unconstrained. This is due in large part to the heterogenity of their distribution and there are similar problems for dust and vegetation. In some sense, it is the availability of suitable forcing records that suggests what kind of sensitivity one can define from the record. A more subtle point is that the ‘efficacy’ of different forcings might vary, especially ones that have very different regional signatures, making it more difficult to add up different terms that might be important at any one time.
Lastly, and by no means least, Earth System sensitivity is not stable over geologic time. How much it might vary is very difficult to tell, but, for instance, it is clear that from the Pliocene to the Quaternary (the last ~2.5 million years of ice age cycles), the climate has become more sensitive to orbital forcing. It is therefore conceivable (but not proven) that any sensitivity derived from paleo-climate will not (in the end) apply to the future.
We’ve often gone over the Charney sensitivity constraint for the Last Glacial Maximum. There is information about the greenhouse gases (CO2, CH4 and N2O), reconstructions of the ice sheets and vegetation change, and estimates of the dust forcing. A recent estimate of the magnitude of these forcings is around 8 +/- 2 W/m2 (Schneider von Deimling et al, 2006). This implicitly includes other aerosol changes or atmospheric chemistry changes in with the sensitivity (or equivalently, assumes that their changes are negligible). So given a temperature change of about 5 to 6ºC, this gives a Charney sensitivity of around 3ºC (ranging from 1.5 to 6 if you do the uncertainty sums).
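The arithmetic behind that central estimate is easy to sketch. A minimal back-of-envelope version, using the canonical ~3.7 W/m2 forcing for a CO2 doubling (note this uses central values only; the quoted 1.5–6ºC range comes from propagating the full uncertainties, which this sketch does not do):

```python
# Back-of-envelope Charney sensitivity from LGM numbers (central values only).
F_2X = 3.7   # W/m2: radiative forcing from a CO2 doubling (standard value)
dT = 5.5     # K: assumed LGM global cooling, midpoint of the 5-6 K range above
dF = 8.0     # W/m2: central LGM forcing estimate (quoted above as 8 +/- 2)

sensitivity = (dT / dF) * F_2X   # K per CO2 doubling
print(f"Charney sensitivity ~ {sensitivity:.1f} K per doubling")
```

Sliding dT over 5–6 K and dF over 6–10 W/m2 gives roughly 1.8–3.7 K, consistent with (though narrower than) the quoted range once structural uncertainties are folded in.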
Hansen suggests that the dust changes should be considered a fast feedback as well (as could the CH4 changes?) and that certainly makes sense if vegetation changes are included on the feedback side of the equation. Since all of these LGM forcings are the same sign (i.e. they are all positive feedbacks for the long term temperature change), that implies that the Earth System sensitivity must be larger than the Charney sensitivity on these timescales (and for this current geologic period). So far so good.
Hansen’s first estimate of the Earth System sensitivity is based on an assumption that GHG changes over the long term control the amount of ice. That gives a scaling of 6ºC for a doubling of CO2. This is however problematic for two reasons: first, most of the power of this relationship is derived from periods when there were large N. American and European ice sheets. It is quite conceivable that, now that we are left with only Greenland and Antarctica, the sensitivity of the temperature to the ice sheets is less. Secondly, it subsumes the very special nature of orbital forcing – extreme regional and seasonal impacts but very little impact on the global mean radiation. Hansen’s estimate assumes that an overall cooling of the same magnitude as the LGM would produce the same extent of ice sheets as was seen then. That may be the case, but it is not a priori obvious that it must be. Hansen rightly acknowledges these issues, and suggests a second constraint based on longer term changes.
Unfortunately, prior to the ice core record, our knowledge of CO2 changes is much poorer. Thus while it seems likely that CO2 decreased from the Eocene (~50 million years ago) to the Quaternary through variations related to tectonics, the exact magnitude is uncertain. For reasonable values based on the various estimates, Hansen estimates a ~10 W/m2 forcing change over the Cenozoic from this alone (including a temperature-related CH4 change). The calculation in the paper is however a little more subtle. Hansen posits that the long term trend in the deep ocean temperature in the early Cenozoic period (before there was substantial ice) was purely due to CO2 (using the Charney sensitivity). He then plays around with the value of the CO2 concentration at the initiation of the Antarctic ice sheets (around 34 million years ago) to get the best fit with the CO2 reconstructions over the whole period. What he ends up with is a critical value of ~425 ppm for the initiation of glaciation. To be sure, this is fraught with uncertainties – in the temperature records, in the CO2 reconstructions, and in the reasonable (but unproven) assumption concerning the dominance of CO2. However, the bottom line is that you really don’t need a big change in CO2 to end up with a big change in ice sheet extent, and hence that the Earth System sensitivity is high.
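Readers who want to sanity-check the ~10 W/m2 figure can use the standard simplified CO2 forcing expression (Myhre et al., 1998). The 1500 ppm Eocene concentration below is purely an illustrative assumption, since proxy estimates span a wide range:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing relative to c0_ppm, in W/m2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

print(co2_forcing(560))    # a doubling: the canonical ~3.7 W/m2
print(co2_forcing(1500))   # illustrative Eocene value: ~9 W/m2, most of
                           # Hansen's ~10 W/m2 (the rest is the CH4 term)
```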
So what does this mean for the future? In the short term, not much. Even if this is all correct, these effects are for eventual changes – that might take centuries or millennia to realise. However, even with the (substantial) uncertainties in the calculations and underlying assumptions, the conclusion that the Earth System sensitivity is greater than the Charney sensitivity is probably robust. And that is a concern for any policy based on a stabilization scenario significantly above where we are now.
157 Responses to "Target CO2"
Ferdinand Engelbeen says
There are a few observations in this which makes the high sensitivity for GHGs rather questionable.
In the first place, GHGs in the pre-industrial past follow orbital/temperature changes, don’t lead. This is clear from the detailed Epica Dome C ice core, where CO2 trends follow the d18O (temperature proxy) trend in the LGM-Holocene transition, with no measurable feedback from increasing CO2 levels.
See here (with thanks to André van den Berg for the graph).
Secondly, orbital changes have their effect mainly in the (sub)tropics and may have a different effect on e.g. cloud cover (which are in reverse correlation with total solar irradiance) than more evenly distributed GHGs. Thus the effect of one W/m2 from GHGs doesn’t necessary equals the effect of one W/m2 from orbital changes…
[Response: There is plenty of evidence showing that the GHG changes contribute to the amplitude of the ice-age cycles. The graph you show simply demonstrates that the deglaciation was complex. The impacts of adding 120 meters of global mean sea level equivalent of freshwater to the system are unlikely to be negligible for ocean circulation and biological activity. Similarly, the different timescales for land biospheric changes and ice sheet changes will almost certainly give a complex transient scenario. That is very interesting, and indeed largely uncertain, but as a demonstration that there is no CO2 impact on climate it falls very short.
As to orbital forcing regionally having a different effect to global mean forcings, I specifically state that above and discuss the implications for Hansen’s argument. – gavin]
A. Simmons says
This paper has been picked up by this morning’s Guardian newspaper here in the UK, which ran it as the front page lead.
That report was in turn picked up by The Register, a well-known UK IT (and related issues) site, which has recently been sullying its generally excellent record for debunking popular myths or the hype-du-jour by jumping to entirely the wrong side of the popular debate on AGW.
John Reimann says
I am on an international social commentary email list. We have had discussions on global warming on that list, and I must admit that a few of those on the list have been taken in by the deniers. I have had some very sharp debates with them. In any case, one person posted the following note, which quotes an article from the “Guardian” newspaper. Can somebody explain this for me? Thank you.
“Hansen’s team studied core samples taken from the bottom of the ocean, which allow CO2 levels to be tracked millions of years ago. They show that when the world began to glaciate at the start of the Ice age about 35m years ago, the concentration of CO2 in the atmosphere stood at about 450ppm.”
This makes no sense as it stands in the light of present targets for CO2 emissions; I read it as saying that we should be aiming for a CO2 concentration which would be the SAME as at the start of the last Ice Age, at the time when the earth started to glaciate.
But then it goes on:
“If you leave us at 450ppm for long enough it will probably melt all the ice – that’s a sea rise of 75 metres. What we have found is that the target we have all been aiming for is a disaster – a guaranteed […]”

If 450ppm were compatible with a long, long Ice Age and global glaciation 35m years ago, how is it that exactly the same level today or in the future would be expected to cause global warming?
Chris Rison says
If someone could give a brief explanation about what an ‘orbital forcing’ is opposed to simply a ‘forcing’ then I’d be very grateful. I’ve got a fair understanding of the latter but don’t understand what the ‘orbital’ part means. Thank you.
[Response: It refers specifically to the changes in the distribution of solar radiation over latitude and season that occur because of variations in the precession, obliquity and eccentricity of the Earth’s orbit. Otherwise known as Milankovitch forcings. – gavin]
Keith Clarke says
“A more subtle point is that the ‘efficacy’ of different forcings, especially ones that have very regional signatures might vary, making it more difficult to add up different terms that might be important at any one time.”
Should this be: A more subtle point is that the ‘efficacy’ of different forcings might vary, especially ones that have very *different* regional signatures, making….? A bit hard to parse as well as a missing word.
[Response: thanks. I made the correction. – gavin]
#3 – I think the language the Guardian uses is a bit confused. If you look at the ice cores, then over the recent ice-age cycles CO2 goes up to a high-point of about 280ppm in the interglacials and is lower than that during the glacial maximums.
I think the 450ppm 35 million years ago is referring to a time *before* the current geologic period of ice-age cycles. What they argue is that if the 450ppm is sustained over geologic timescales ie >1000-10000 years then it will lead to the loss of all ice.
That seems reasonable to me, but I don’t think it has any bearing on policy, since CO2 levels are already above 380ppm and rising rapidly. We have to work out a way of stabilising those levels during this century and *then* we can work out how we’re going to reduce them back down again (or whether the carbon cycle will do it for us quickly enough).
The East Antarctic Ice Sheet is going to give us more than a century of breathing space to sort things out, and if it doesn’t, well, there’s not much we can do to avoid that, so we’d better hope that it does.
pete best says
Re #3: The first x million years of Antarctic glaciation, from 34 million years ago, were unstable; the ice sheet did not become stable until 18 million years ago, when CO2 levels had fallen slightly lower, I believe. Worryingly, Northern Hemisphere ice sheets did not begin to form until 400 ppmv, but I do not know at what CO2 levels they became stable. If we take into account the differences in orbital forcings 34 million years ago, the levels of other greenhouse gases, and the amount of vegetation on the planet, I would agree with James Hansen that climate sensitivity is possibly higher now than it was way back then.
Blair Dowden says
The six degree long term climate sensitivity seems to largely depend on the CO2 level when Antarctica began to be glaciated about 35 million years ago. I am having trouble understanding how a 425 ppm level of CO2 at that time was arrived at. From looking at the curves, it does not appear any of them really fits very well, so this data is not adequate to arrive at such a conclusion.
I also want to question the assumption that the temperature (thus CO2 level) at which glaciation begins is the same one that will melt the ice cap entirely. An ice cap creates its own regional climate, as can be seen by the contrast between Antarctica and the Arctic. Therefore I would assume that the temperature required to melt an ice cap is significantly higher than the temperature it formed at.
On the other hand, this view seems to contradict the observation that the warming period for interglacial periods is shorter than the cooling period of re-entering the ice age.
“I would assume that the temperature required to melt an ice cap is significantly higher than the temperature it formed at.”
I believe this is the state the Greenland ice cap is in now – ie it is only there because it is already there, and if it melts there’s no way to get it back without another ice age. Indeed, if it starts to melt, then presumably its height will decrease and the melting will accelerate.
So, yes, I think I would now agree with you that it is a bit of a leap of faith to assume that 450 ppm would see the eventual demise of the Antarctic Ice Sheet.
Nick Gotts says
“So what does this mean for the future? In the short term, not very much”.
Well, if Hansen is right, that depends on what you mean by “short-term”. To quote from the preprint:
“Sea level changes of several meters per century occur in the paleoclimate record, in response to forcings slower and weaker than the present human-made forcing. It seems likely that large ice sheet response will occur within centuries, if human-made forcings continue to increase. Once ice sheet disintegration is underway, decadal changes of sea level may be substantial.”
The Guardian article A. Simmons referred to quotes him as saying: “We are talking about a sea-level rise of at least a couple of metres this century.”
So far as I can see, he could be right about that, without being right about the long-term/Earth System climate sensitivity being as high as 6 degrees C. He’s suggesting that previous ice-sheet melting episodes were slow because the forcing due to orbital change was slow, not due to inherent inertia, and that recent observation suggests ice-sheets can melt much faster if the forcing is faster.
[Response: I wasn’t talking about sea level – but just to make it clear, I share Jim’s concern that serious sea level changes are a real possibility. None of the long term sensitivity issues deal with the problem of rates of change, and so the preprint or this post cannot really address that. – gavin]
Jim Bullis says
Do you mean by, “simple mixed layer ocean” that the variations of ocean temperature with depth are not part of the analysis?
[Response: In the standard estimate of the Charney sensitivity, no. Using fully coupled OAGCMs takes much, much longer and has not yet become standard practice. In the GISS models, the difference in eventual temperatures (after hundreds of years) is on the order of a few tenths of a degree. – gavin]
Arthur Smith says
On the issue of the orbital forcing changes though – Hansen isn’t even looking at the forcing going all the way back to the orbital effects in this paper, he’s just looking at the CO2 levels as if they are the forcing. The actual forcing from the orbital changes is presumably smaller (or is it effectively large when you account for latitudinal dependence?) – i.e. if CO2 levels themselves are a positive feedback of a factor 3 or so on the orbital forcings, then doesn’t that further multiply the actual sensitivity to fundamental forcing by a factor of 3, i.e. 18 C instead of 6?
The point is: it’s widely recognized that CO2 levels in the atmosphere increase as a result of temperature increases, a positive feedback effect. Climate models and Charney analysis effectively ignore this because they’re looking at the effect of increased CO2 itself.
But if our interest is in the effect of *anthropogenic* CO2 only, which is a real forcing, then the feedback CO2 could be tremendously additive. I.e. if you look at the target for CO2 emissions by humans, rather than just the target CO2 level in the atmosphere, the constraint is that much more stringent. To keep to 350 ppm, which already means a long-term warmer world, we may have to go to zero or less-than-zero human emission levels. But I’d like to see that quantified a bit better…
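For what it’s worth, in the standard linear-feedback framework the factors do not multiply the way the 3 × 6 = 18ºC arithmetic in the comment above suggests; they add in the denominator of the gain. A sketch with made-up illustrative feedback factors (none of these numbers come from the paper):

```python
# Linear feedback framework: T = T0 / (1 - sum(f_i)), where T0 is the
# no-feedback (Planck-only) response and the f_i are dimensionless factors.
T0 = 1.2        # K: approximate no-feedback response to doubled CO2
f_fast = 0.6    # combined fast feedbacks, chosen here so the result is 3 K
f_co2 = 0.1     # hypothetical slow carbon-cycle feedback (illustrative)

charney = T0 / (1 - f_fast)                # 3.0 K
earth_system = T0 / (1 - f_fast - f_co2)   # 4.0 K, not 3 x 3 = 9 K
print(charney, earth_system)
```

Adding a feedback always increases the response, but by shrinking the denominator, not by multiplying the whole sensitivity.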
“If 450ppm were compatible with a long, long Ice Age and global glaciation 35m years ago, how is it that exactly the same level today or in the future would be expected to cause global warming?”
Prior to 35m years ago, the earth did not have large ice sheets like today. As the temperature sinks below a critical value, the Antarctic ice sheet grows.
If the temperature rises in the future above a similar (but somewhat higher, see post #8) value for a long time, the ice sheet will melt.
The sun is now somewhat brighter than 35m ago, which lowers the CO2 level needed to reach this temperature.
On the other hand, the higher albedo of the ice sheets will raise the level of CO2 needed to reach this temperature.
The problem is that the exact temperature is not well known.
I am a little confused. You mention that slower feedbacks are excluded from definitions of sensitivity, which is reasonable; sensitivity must be defined on a relevant time scale.
However, does this mean that those slower feedbacks are excluded from coupled model formulations? I thought I’d seen mentions of afforestation or predictive vegetation in the IPCC report.
Could somebody address the sign, magnitude and time scale of feedbacks related to vegetation changes? I’d imagine that increased vegetation would result in a larger carbon sink, but there would be changes in surface albedo and transpiration along with it.
[Response: Most GCMs do not consider vegetation feedbacks (and all ‘sensitivity’ numbers you will have seen do not include them either) – but it is becoming more common and most groups have, or are working on, such modules. Some of the early experiments (by Peter Cox, Pierre Friedlingstein etc.) show positive feedbacks on CO2 levels (of varying degrees) and, presumably, some impact on temperature due to albedo/ET effects. I am not however aware of what they are (maybe someone more knowledgeable could comment?). – gavin]
Chris Dudley says
While I appreciate your effort to keep the long millennial timescale in consideration, I think it worth stressing that Hansen et al. have been looking hard at the timescale for the response of icesheets and find evidence that it is centuries rather than millennia. And they stress this also now:
If so, then slow feedbacks may not be all that slow and the present day policy implications rather large.
It is worth noting that the preprint calls for stabilization below the current concentration of CO2 in the atmosphere and for active measures to remove CO2 from the atmosphere, while providing a scenario for how this might happen.
If a policy of adaptation rather than mitigation is pursued, measures must also be taken now for projects that require 100-year-plus planning horizons, such as building dykes or sea walls or siting nuclear reactors. And, without a firm commitment to mitigation, adaptation measures can’t be left unconsidered, doubling or more the present day cost of response. For example, the NRC is beginning to entertain an application for a new reactor at Calvert Cliffs on the Chesapeake Bay. This might be considered an aspect of mitigation, but it would be foolhardy not to consider what impact 5 meters of sea level rise before the planned end of decommissioning might have, and to build in adaptation measures now, such as siting the plant further inland where the foundation would not be subject to erosion. A dyke to protect Long Island and its neighbors might need its foundation laid within a decade and its cost planned for in tax policy within a few years. Transportation planning also has long term aspects, such as the interstate highway system. How soon must we start to elevate I-10 even higher to get the job done in time, and at what cost?
So, I would argue that in the short term this work means quite a lot.
[Response: Please don’t misunderstand me – the eventual long term temperature change being 6 deg is the thing that I’m not sure has short term implications. But that is a statement about how changes in T affect ice sheets (etc.) which then affect T again. That is very different from saying that T affects ice sheets (and sea level) which is a much more serious short term issue and would be even if the long term sensitivity was the same as the Charney one. – gavin]
John E. Pearson says
I am no expert in this and don’t know much about terrestrial ecosystems except that they are generally considered to be substantially more complex than ocean ecosystems. I’ve read a few papers on ocean ecosystem modeling and would argue that our understanding of the crucial processes is still quite primitive. For example one ocean ecosystem model that I know is used in at least one GCM does not include any sort of parameterization related to ocean acidity. If large scale changes in the ocean ecology occur because of acidification the model cannot reasonably be expected to capture the effects. Moreover the key organisms and the dynamics between them is poorly understood at best.
I would also argue that the state of our understanding of the carbon cycle as it operates now is still pretty uncertain. There were papers published last year on the missing sink (the gigaton of CO2 that is taken up by unknown processes) like this one: Stephens, B. et al. Science 316, 1732–1735 (2007).
Jim Bullis says
Re 11 Thanks gavin,
So we know that there is no accounting for a controlling process whereby an increase in ocean surface temperature causes an increase in storm activity. Hurricane formation is a particularly strong example of this. That increase in storm activity causes mixing of the ocean such that colder waters from deeper regions are brought up such that the “mixed layer” is cooled. This reverses the initial temperature increase of the ocean surface. There is abundant cold water to handle this for a long time.
If this is the governing process, global warming will be seen as a very slight increase in ocean surface temperature. Only over a long time will there be a measurable increase in surface temperature. Before that happens, there should be a measurable increase in the “mixed layer depth”.
Since deeper waters will be warmer, there is a possible link to the global ocean circulating currents that results in warmer water in polar regions. This could be related to loss of ice, especially ice that is in ocean contact.
It seems that a quantitative study might result in better understanding of temperature changes. It does not appear that this is being done. From what you say, it would be prohibitively time consuming?
I’ve been keen to read the Real Climate response to Hansen’s paper for a couple of weeks, and thanks very much for it. I suppose, however, that I remain confused about the implications, even having read Gavin’s response to #15. Please forgive me if I am slow!
The battle against climate change is based in no small part because of the alaming scientific understanding that we are potentially beginning long-term future processes that cannot be stopped. Short term concerns drive the headlines and insurance companies (and have the most emotive impact), but of course they are not the whole story. The bottom line is – what does the best science now suggest for a target that can be considered safe with reasonable probability? Is 450ppm becoming an inherently unsafe target for a long term habitable Earth? Is 350ppm the new 450ppm? Does the best advice to policymakers need an urgent review?
[Response: Nobody can say what is safe (in the long term), though we aren’t talking about making Earth uninhabitable, just seriously uncomfortable for its existing inhabitants! As the levels rise, the consequences will get worse, and within those consequences will be dozens of thresholds and tipping points related to local or regional ecosystems and climate regimes – some of which could have global consequences. We don’t know exactly where they are or what their effects will be, so the only prudent course is to try and avoid as many as possible. Thus in my opinion emissions should be lowered as fast and as soon as is technologically and economically feasible. Setting a target now doesn’t imply that people will not revise it downward later on in the process, and so I don’t tend to get hung up on a precise target. – gavin]
Ray Ladbury says
Jim #17, I think you are jumping to a lot of conclusions here. First, storms – even Cat 5 hurricanes – are local and short-lived processes, whereas greenhouse forcing is operative 24/7/365. Second, while such processes could well be important in the Southern Hemisphere, they will be less so in the Northern Hemisphere. Third, unless storm intensity increases drastically, the storms will foster exchange only down to a certain depth, and as waters at this depth warm, the effectiveness of this mechanism will decrease. Moreover, CO2 solubility will decrease. Thus, one could argue that the mechanism you posit is in fact hiding the warming that is going on, and that in the future (and CO2 persists for hundreds to thousands of years), the warming will be much more severe. In short, given that we already see significant warming occurring, I don’t think we can count on the oceans to save us.
Richard Pauli says
We might want to know that the discussion is moving to the courts:
Comer, et al. v. Murphy Oil USA, et al., in which a class of Mississippi property owners blame the environmental impact of energy and chemical companies for Hurricane Katrina damage. The case was dismissed during summary judgment in Mississippi federal court, but has since been appealed to the U.S. Court of Appeals for the 5th Circuit.
Lynn Vincentnathan says
RE #3 &
That 450 ppm also jarred me and my initial thought was the same as yours. Then I started thinking (aside from the points already raised), the earth was already in an initial cooling phase from some higher global temp. As denialists so like to point out, CO2 follows temp, and so when the temp (initially triggered by reduced solar irradiation??) lowered to that point at which ice sheets start to form, the CO2 was at 450 ppm. I understand from Lovelock’s REVENGE OF GAIA that plants & life in general do much better in cooler climates, so the plants started flourishing more and absorbing CO2 once the temp started decreasing from its high point, and the ice & snow increased the albedo and reduced temp more, and plants liked it even more with reduced temp from lower CO2 & increasing albedo, and the earth kept getting cooler with these cooling positive feedbacks (or swinging up and down, but on the whole headed down). I’m sort of imagining it as a direction and momentum thing — in the cooling direction.
Now we are adding CO2, so the effect is a warming trend, which reduces ice and snow albedo and releases carbon from frozen permafrost and ocean hydrates, and the direction and momentum are in the warming direction. And I think plants don’t like very hot temps as much, especially with the land being desiccated by the warmer air sucking out the moisture, and harm from droughts, floods, and heat stress.
So perhaps there’s this issue of global temperature direction (cooling or warming) and system momentum.
Maybe that’s what “sensitivity” really has to do with — how much you have to poke the dragon to wake him from his sleep. Very little, it seems.
Blair Dowden says
I do not understand the second part of this statement from the Hansen paper:
First, I thought a warming climate reduced the temperature difference between the equator and poles, which is what drives most of the winds and ocean currents that cause ocean mixing. I would not think a small increase in hurricanes would compensate. In the deep past it is during warmer climates that you get a stratified and stagnant ocean.
Second, I did not think there was an excess of CO2 in the deep ocean, rather it would be in equilibrium with the lower CO2 levels of the past few million years. I would think that today’s higher CO2 levels would eventually increase levels in the deep ocean, though of course not quickly enough to solve our problems here on the surface.
David B. Benson says
Arthur Smith (12) wrote “To keep to 350 ppm, which already means a long-term warmer world, we may have to go to zero or less-than-zero human emission levels.” The world is already at 385 ppm and adding 1–2 ppm per year. So to reach 350 ppm from just the current level requires removing about 185 GtC from the active carbon cycle.
Proposed methods to accomplish this include burning biofuels with CCS, as well as direct sequestration from the atmosphere, considered in a previous thread (down two).
And just briefly, without the reasoning, I suggest that even just 315 ppm is too high for long term climate stability.
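The ~185 GtC figure can be roughly sanity-checked. A standard conversion is about 2.13 GtC per ppm of atmospheric CO2, so the airborne 35 ppm alone is only ~75 GtC; the larger number presumably allows for the ocean and biosphere re-releasing CO2 as the atmosphere is drawn down. A minimal sketch (the 2.13 factor and the rebound multiplier are illustrative assumptions, not figures from the comment):

```python
# Rough sanity check of the carbon-removal estimate.
GTC_PER_PPM = 2.13  # approx. GtC of carbon per ppm of atmospheric CO2

current_ppm = 385
target_ppm = 350

airborne_removal = (current_ppm - target_ppm) * GTC_PER_PPM
print(f"Airborne carbon to remove: {airborne_removal:.0f} GtC")  # ~75 GtC

# If the ocean and biosphere re-release CO2 as the atmosphere is drawn
# down, the total removal needed is larger. With a hypothetical rebound
# factor of ~2.5, the total approaches the ~185 GtC quoted above.
rebound_factor = 2.5
total_removal = airborne_removal * rebound_factor
print(f"Total removal with rebound: {total_removal:.0f} GtC")
```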
Chris Dudley says
Gavin (in #15),
I think I see your distinction: we must wait for bare earth in the Antarctic for it to warm the Antarctic through albedo change, but since ice sheets have been grouped in as slow, I would still say that slow may not be so slow. And this has a large effect. The total amplitude of the sensitivity revises the stabilization target downward, but a not-so-slow slow feedback also limits the amount of time we can be above that target. So, according to the preprint, we need to be cutting emissions very quickly now to avoid the extra effort in cleaning up the atmosphere that would result from delay. And we must perform that cleanup fairly quickly as well to reach the stabilization target soon. Letting the ecosystem do it on its own is not fast enough. If we had thousands of years for the slow feedbacks to come into effect, we might just wait, after cutting emissions, for the atmospheric concentration to fall, just as we are doing with CFCs; but with a faster slow feedback, this would not be an option. So, at least for planning, the preprint has two large present-day implications: a target below the current concentration, and a need to take positive action beyond cutting emissions to meet that target.
Interesting read, but…
We tend to focus on singular issues instead of the whole. Sure, discussing CO2 is good, but consider some of the causes, say factories. What are they needed for? A consumption lifestyle. For whom? 6 billion people, who grow and consume more each day. Their lifestyle creates alienating neighbourhoods where we adapt to having cars etc. Can you walk 12 miles to the hospital? Maybe overpopulation is a legitimate concern, since it is not just consumption, but the scale of it, shaping our eco-system and society in a manner dependent on oil/consumption.
For a discussion on this, see this interview: http://www.corrupt.org/act/interviews/john_feeney
“Earth System sensitivity is not stable over geologic time. How much it might vary is very difficult to tell, but for instance, it is clear that from the Pliocene to the Quaternary (the last ~2.5 million years of ice age cycles), the climate has become more sensitive to orbital forcing.”
I seem to recall GCMs that attempted to model the closing and opening of ocean channels such as the Drake Passage between Antarctica and S. America.
Are there any (loooong-range) models that take continental drift into account?
Thanks for another good column.
Alastair McDonald says
Re #3 I have an explanation of what was meant by the glaciation starting 35 million years ago. That was when ice first started appearing on the mountain tops of Antarctica.
It was only about 3 million years ago when ice sheets started forming in the Northern Hemisphere. It is their waxing and waning that produces the glacial/interglacials driven by Milankovitch cycles, which taken together are called the Great Ice Age.
In other words ice started forming in Antarctica 35 Ma ago, but the Ice Age did not start until the Isthmus of Panama closed 3 Ma ago, and ice sheets started forming in the Northern Hemisphere.
The level of CO2 at the start of the present inter-glacial 10 ka ago (the Holocene) was 260 ppm, and most of the Laurentian and Scandinavian ice sheets disappeared. See the Antarctic ice cores here for the CO2 level:
During the previous inter-glacial (the Eemian) at around 130 ka ago the level of CO2 was at around 290 ppm and the Greenland ice began to melt as well, raising the sea level to about 5 m higher than today.
Now, with CO2 at 380 ppm both Greenland and the Antarctic Peninsula are melting, but not the Antarctic mountain ice sheets.
If all of Greenland melts we get a rise of 7 metres (20 feet). If ALL of Antarctica goes we get a rise of 70 m (200 feet), but CO2 levels are not high enough for that to happen yet!
Alastair McDonald says
You wrote “So what does this mean for the future? In the short term, not much. Even if this is all correct, these effects are for eventual changes – that might take centuries or millennia to realise.”
It seems to me that there is a major flaw in the thinking about ice sheets, which means an imminent danger is being ignored. Only the melting of continental ice sheets is being considered as a serious threat, because only they raise sea level. If the Arctic sea ice disappears, sea level will change by only a few millimetres, and so it is not considered a danger.
But it will have a large effect on albedo! And that seems to be what this thread is about. Roughly a third of the multi-year Arctic sea ice disappeared last year, and some Russians are predicting it will all be gone in another couple of years’ time. Whereas the greenhouse effect increases only with the log of gas concentration, albedo has a linear effect.
If melting ice sheets are a danger because of their effect on albedo, then the melting Arctic sea ice rather belies your claim of what this means for the future when you wrote “In the short term, not much.”
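The log-versus-linear point can be illustrated with the usual simplified CO2 forcing expression, ΔF = 5.35·ln(C/C0) W/m2 (the standard Myhre et al. approximation, my choice rather than something stated in the comment): equal ppm increments produce diminishing forcing, whereas lost sea-ice area darkens the surface roughly in proportion to the area exposed.

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing in W/m^2 (Myhre et al. form)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Equal 100 ppm increments give progressively smaller forcing steps:
for c in (380, 480, 580):
    step = co2_forcing(c + 100) - co2_forcing(c)
    print(f"{c} -> {c + 100} ppm: +{step:.2f} W/m^2")
```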
Jim Bullis says
Thanks for the response. I am trying not to come to any conclusions, and intended my words to be looking for comment.
But to clarify, I used the hurricane as a strong example, not meaning to exclude general wind processes.
Are you saying that ocean circulation patterns in the Northern Hemisphere do not reach Antarctic regions where ice is more in the water?
Also, why would CO2 solubility decrease if surface water warmed slightly and deeper water warmed more?
My premise is indeed that storms will foster exchange down to a depth such that the effectiveness of this mechanism will decrease, but then the level of storm activity would increase until the exchange went deeper. And yes, this would be hiding the warming that is going on. So I am suggesting that, maybe, without this process the ocean surface would be warmer than it is. I realize that the ARGO data discussed elsewhere is said to have been misinterpreted, but the actual numbers for surface temperature are not changing that much.
I also think I have heard that Antarctic ice is breaking up faster than expected.
Now back to hurricanes. Yes they are local, and short-lived, but they have powerful mixing effects with long-lasting consequences. We have actual knowledge of the strong deep currents in the Gulf due to Katrina, from the fact that ocean floor pipelines were seriously damaged. I believe they were in water 1500 meters deep. The impact of this, via the Gulf Stream, would be spread over the entire North Atlantic, and it would last there quite some time.
But my point of view goes back to the practice of indirectly measuring the thermocline with a velocimeter as a part of underwater sound studies. This was the best way to relate temperatures to underwater sound propagation effects. More directly, Navy sonar operations used to include, and probably still do, a bathythermograph scan, “BT”, as a part of understanding how to best operate the sonar in a particular region to detect submarines, and the key depth of increased safety was the depth of the “layer.” The layer depth is thus known very well to vary significantly with wind effects, much less in strength than hurricanes. And we are talking about depths around 70 meters. That number is the transition between mixed conditions and rapidly cooling water depths. Below 600 meters, the oceans are almost universally cooler than 5 degrees C. About 85% of the oceans are 3000 to 6000 meters in depth. So we know very well that the reservoir of cold is large.
But my premise is not that the oceans will save us. If taken into account, it may be seen that they will slow the process as far as ocean surface temperature goes, and hence general air temperature. Further, the process I describe, while making surface temperature slow to change, could relate to ice melting activity.
Eventually, as the depth necessary to tap into the cold gets lower, the storms will increase. Without benefit of analysis, I venture to guess this would be noticeable much sooner than a hundred thousand years. However, it might not be as soon as some are now saying.
Lynn Vincentnathan says
RE #22, & the quote from Hansen’s study, “The most direct GHG feedback is release of CO2 by the ocean due to temperature dependence of CO2 solubility and increased ocean mixing in a warmer climate that flushes the deep ocean of CO2 that accumulated through biologic and physical processes.”
When I read it I had in mind the frozen methane hydrates at the bottom of the ocean (from “biologic processes”). But I could be wrong. That’s what it seemed like to me with my limited understanding.
Also, do warmer oceans tend to turn the CH4 that is being released from the hydrates into CO2? And, if so, wouldn’t that actually be a bit better than the ocean releasing CH4 into the atmosphere, where it is a much more potent GHG for the ~10 years until it decomposes into CO2?
Then there is the issue of warming, anoxic oceans turning this into hydrogen sulfide and poisoning life on earth. I imagine that might be a possibility in the distant future if we don’t mitigate GW enough. But that wouldn’t be a positive feedback, I don’t think, just a nasty effect of GW.
Mark A. York says
I don’t know. Can you refute a man’s science if you can’t spell his name?
George Darroch says
Just to clarify, are the figures being used in this post and Hansen et al C02 ppm, or C02 equivalent (GHG forcings)?
pete best says
I cannot at present see a major issue with Hansen’s analysis, but I see a big problem with politicians being able to swallow it. In recent talks I have seen him give, he speaks of leaving coal in the ground or using CCS technology within a decade; but CCS technology is either still in development for new power plants, or would have to be retrofitted to existing power plants, which is unlikely to happen soon.
He then states that existing oil and gas reserves can be used so long as we leave unconventional oil alone. Anyone been to Athabasca recently? The oil sands there are being developed at a frantic pace, 1 mb/d now rising to 5 mb/d come 2020 and other shale and heavy oils are likely to be developed also once conventional oil begins to peak come 2015 as the head of shell announced recently.
If we really want to max out our CO2 levels at 450 ppmv (oil and gas usage only) then we had better start campaigning now. I am sure that the world can do it, but lead times are so long for R&D, prototyping, and full commercial release that it is doubtful that any significant dent in CO2 emissions can be made for 30 years.
I don’t understand Hansen’s assumption.
How do we reconcile global temperature being driven by CO2 with the lag between Antarctic temperature and CO2?
The initial “forcing” for glaciations isn’t, as you know, CO2, but regional summer insolation, even if the corresponding global forcing is weak.
So can you explain, again, why Hansen considers only GHG in his sensitivity calculation?
1) On a 6degC climate sensitivity in policy relevant timescales or not, having read this paper twice – I am very very unconvinced.
I’m still reading this para (Page 4) as a gaffe:
“Paleoclimate data permit evaluation of long-term sensitivity to specified GHG change. We only assume that the area of ice is a function of global temperature. Plotting GHG forcing (7) from ice core data (27) against temperature shows that global climate sensitivity including the slow surface albedo feedback is 1.5°C per W/m2 or 6°C for doubled CO2 (Fig. 2), twice as large as the Charney fast-feedback sensitivity.”
I still see roughly equal contributions from GHG and ice sheet albedo (fig 1b), i.e. 1+1=2. And in fig 2 after time=0, I see what is in effect a double count of forcing, because the GHG alone seems to be equated with the full 3 degC excursion; so, because there aren’t the extensive ice sheets now, 6 degC/2 = Charney’s ~3 degC.
I still subscribe to a 3 degC sensitivity, but am willing to be persuaded.
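For what it’s worth, the arithmetic in the quoted paragraph is easy to lay out side by side (a sketch only; the ~4 W/m2 doubling forcing is inferred from the quoted figures, the more commonly cited value being ~3.7 W/m2):

```python
# Sensitivity in degC per W/m^2, times the forcing from doubled CO2.
doubling_forcing = 4.0  # W/m^2 implied by the quoted numbers (often ~3.7)

charney = 0.75 * doubling_forcing      # fast feedbacks only -> ~3 degC
earth_system = 1.5 * doubling_forcing  # + slow albedo feedback -> ~6 degC

print(f"Charney sensitivity:      {charney:.1f} degC per doubling")
print(f"Earth System sensitivity: {earth_system:.1f} degC per doubling")
```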
2) On SLR, since Hansen’s earlier paper where his team looked at the seasonal insolation peak correlations with ice-volume changes, I am now concerned (previously I didn’t count SLR as a major policy timescale threat). That’s especially the case given the Arctic and the potential for secondary impacts on Greenland and news from the Antarctic.
I asked William Connolley about this over at Stoat, as he modelled the impacts of the loss of Arctic sea ice a while back; his answer: “Well, it would lower the albedo, though perhaps not by as much as you might expect due to clouds and sun angle. Our study found little *long term* impact, because the ice largely regrew each winter. That included albedo effects.”
Hope that’s of interest.
pete best says
This article appears to demonstrate that the WAIS has been losing mass for a long time at around 2 cm per year, but recently that has jumped to 1.6 m per year, a significant increase. And this is with only 0.8 C of GW. With 0.6 C in the pipeline from ocean thermal lag and another 0.4 from existing infrastructure already in place, I really fear for the ice sheets’ flow rates into the oceans. If we are releasing GHGs at 30x any known historical rate, and 1.6 m becomes 2 or 2.5 m per year, then things are only going to accelerate for the next 60 years at least, even if we cut CO2 emissions significantly in 20 years’ time.
Andy Revkin says
Just so I’m sure I understand, it seems you’re basically saying that the “resting state” of climate in the long run after doubling of pre-industrial CO2 is likely warmer than the shorter-term (100 years or so) response of the system. But you’re also saying that uncertainties in feedbacks etc are very large (perhaps an understatement, actually).
That would seem to leave high long-term sensitivity in the realm of what Princeton’s Steve Pacala has called “the monsters behind the door” — possible, but not necessarily clear threats. That’s different than coming to a conclusion that 350 ppm is an appropriate long-term limit for CO2.
For us non-scientists, does this question, because of the uncertainties, still rest in that simple precautionary place (“do no harm”) or has this paper truly narrowed the scope of what’s possible from the human greenhouse intervention?
[Response: The key point to take home from this study is that the initiation of glaciation likely didn’t happen at 550 ppm or 650 ppm or similar – it was likely less. That implies that if you wait long enough with levels that high, large scale glaciation is probably unsustainable. Undefined in all of this is what ‘long enough’ means, and whether you can get a better lower bound. Those are unfortunately exactly the issues that matter in the short term. I think that it is difficult to rule out definitively that it is as low as 425 ppm or so. However, as I stated above, initial targets that are being set now are not set in stone and are not yet really being worked towards. So does this add to pressure to move ahead on emission reductions? Yes, but there is no shortage of reasons to be working on that already. – gavin]
[Response: We should also recall the implications of Dave Archer’s “long tail” of anthropogenic CO2. Something like 20% of the anthropogenic CO2 will stick around for much more than 1000 years. That tells us something about how much peak CO2 we can tolerate without committing the planet to extreme consequence of the very long term warming. For example, if we reached a peak CO2 of 1200 ppm, then the long term tail contains about 280 + (1200-280)/5 or 464 ppm CO2. That would mean that even if the deglaciation took a very long time to set in, the CO2 would indeed stay above the threshold level for a long time. This is only a crude illustrative calculation, and to do it right would take a proper ocean carbon cycle model, but I hope it gives the idea of what Hansen’s calculation implies even if the catastrophic changes require a very long time to act. –raypierre]
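Raypierre’s illustrative long-tail calculation can be written out directly (a crude rule of thumb, as he notes; a proper answer needs an ocean carbon cycle model):

```python
PREINDUSTRIAL_PPM = 280.0

def long_tail_ppm(peak_ppm, airborne_tail_fraction=0.2):
    """Crude estimate of the long-term CO2 level: the preindustrial
    level plus the ~20% of the anthropogenic excess that sticks
    around for much more than 1000 years."""
    return PREINDUSTRIAL_PPM + (peak_ppm - PREINDUSTRIAL_PPM) * airborne_tail_fraction

print(long_tail_ppm(1200))  # -> 464.0, matching 280 + (1200-280)/5
```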
[Response: I’m very pleased I can disagree with Gavin for once (this is hard). The CO2 level at which glaciation started (in a phase of declining CO2 concentration) is very likely not the same as the CO2 level where the ice sheets will vanish again (in a situation of rising CO2 concentration) – the latter very likely is quite a bit higher. This is because of the positive ice albedo feedback which leads to hysteresis – the basic concept is well known since way back (I think Budyko) and this is also found in coupled climate – ice sheet models like our CLIMBER-2 model – see Calov and Ganopolski 2005. -stefan]
Blair Dowden says
I would like to look at this statement from the Hansen paper:
The calculation above is for a long period of time and includes all the slow feedbacks, so I do not know why Hansen calls it a fast feedback. Since the greenhouse gas portion is about half of the total forcing (i.e. 1 + 1 = 2), if you consider it the cause (or at least the main feedback) then you get a GHG sensitivity of about 6 degrees. With ice sheets pushing below 45 degrees of latitude, this looks close to a snowball-Earth condition, which has a high climate sensitivity. Luckily the same thing was not also happening in Asia, or we might not be here to write about it. A look at the increasing temperature response to the same orbital forcings as average temperature dropped shows that climate sensitivity became significantly larger during the Pleistocene ice ages.
I am surprised that Hansen did not use the Pliocene (about 3 million years ago) as a benchmark. Here we have CO2 levels around 400 ppm, global average temperature about 2 or 3 degrees higher, and sea levels 25 to 35 meters higher (think ten-story building). The carbon dioxide forcing is about the same (280 ppm Interglacial / 180 ppm LGM is close to 400 ppm Pliocene / 280 ppm Interglacial) for a temperature change about half as much, implying a much lower climate sensitivity, closer to three degrees rather than six, for the period we are about to enter.
I still do not think you can take the temperature (or CO2 level) for the initiation of glaciation to be the same as that sufficient to melt the ice cap. For a large continent like Antarctica I think it will take a few extra degrees to overcome the thermal inertia of all that ice, unless someone can demonstrate why this is wrong.
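Blair’s “about the same” forcing comparison can be checked with the standard simplified formula (the 5.35·ln(C/C0) form is my choice of approximation, not his):

```python
import math

def co2_forcing(c_ppm, c0_ppm):
    """Simplified CO2 radiative forcing in W/m^2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

lgm_to_interglacial = co2_forcing(280, 180)       # ~2.4 W/m^2
interglacial_to_pliocene = co2_forcing(400, 280)  # ~1.9 W/m^2

print(f"LGM -> interglacial:      {lgm_to_interglacial:.2f} W/m^2")
print(f"Interglacial -> Pliocene: {interglacial_to_pliocene:.2f} W/m^2")
```

The two steps are comparable but not equal: the Pliocene step is roughly 20% smaller, so the implied sensitivity is somewhat more than a straight halving would suggest.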
#22 Blair Dowden,
This might help:
“Carbon dioxide is more soluble in cold water, so at high latitudes where surface cooling occurs, carbon dioxide laden water sinks to the deep ocean and becomes part of the deep ocean circulation “conveyor belt”, where it stays for hundreds of years. Eventually mixing brings the water back to the surface at the opposite end of the conveyor belt in regions distant from where the carbon dioxide was first absorbed, e.g., the tropics. In the tropical regions, warm waters cannot retain as much carbon dioxide and carbon dioxide is transferred back into the atmosphere” http://science.hq.nasa.gov/oceans/system/carbon.html
Scroll down to the first bullet point.
With regard to stronger winds on a warming planet, see Toggweiler & Russell, “Ocean Circulation in a Warming Climate,” Nature 17/1/08. That suggests a role for vertical temperature gradients in strengthening winds; such vertical gradients should increase with more global warming. Therefore regionally stronger winds could occur, and in the Southern Ocean that can impact overturning/mixing through Ekman transport; the Southern Ocean is big! (I wonder if such enhanced wind-driven mixing would in effect be a negative feedback on warming rates, enhancing ocean thermal damping.)
#32 George Darroch,
Where people don’t put units, you can safely assume that if the figure is in the hundreds it’s probably atmospheric concentration in parts per million (ppm), and where it’s below or around 10 it’s probably a forcing in watts per square metre (W/m2).
Why not read the actual paper?
Hansen’s team explain why they consider CO2 to be reasonably considered the primary forcing where they do so. But they also consider the other primary forcing as albedo changes.
#36 Outeast, thanks for that.
Hank Roberts says
Andy Revkin, think about doubling times in other areas. Pondweed on a lake. Mussels in cooling water intakes. Scum in a clean water system. Radiation in a fission pile. Epidemic spread compared to background existence of a pathogen.
Those are all simple compared to climate. All are places where increasing some kind of moderating influence early on prevents a world of trouble by impacting the doubling time, lessening or halting the growth of the problem.
We have lots of experience with this. We know we don’t notice problems until it’s almost too late — and the more profitable the system externalizing a cost, the more pressure not to notice those costs.
This might be our final examination.
John Lang says
Antarctica has been glaciated during several different periods over the last 600 million years. Glaciation didn’t just start 34 million years ago; that is only the start of the latest episode.
Antarctica was likely glaciated 580 million years ago, 450 million years ago, 300 million years ago and 150 million years ago. CO2 levels could have been as high as 3000 ppm to 5000 ppm in some of those periods.
Antarctica is an unusual case in plate tectonics, since continental drift has kept Antarctica around the south pole for most of the past 600 million years, including various times when it was locked together with other continental plates.
So, I don’t think the 6.0C sensitivity estimate is robust throughout the entire geologic record.
If you say:
“Hansen’s team explain why they consider CO2 to be reasonably considered the primary forcing where they do so. But they also consider the other primary forcing as albedo changes.”
there are 2 forcings and their sum is 3.5 + 3.0 W/m2 = 6.5 W/m2,
so the sensitivity is 5°C / 6.5 W/m2 = about 0.75 °C per W/m2.
No, Hansen considers the CO2 forcing alone, to find about 1.5 °C per W/m2.
My question was: why to consider CO2 alone?
But I think Hansen uses the particular case of glaciation to determine the impact of ice sheets as a slow feedback.
He assumes in his paper that the ice-sheet feedback depends only on temperature and global forcing.
And the only truly global forcing is GHGs, even in the case of the Pleistocene.
This is what I understand now.
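The two bookkeeping choices discussed above can be put side by side (the numbers are taken from the comments; this is a sketch of the accounting, not Hansen’s actual calculation):

```python
delta_t = 5.0         # degC, glacial-interglacial temperature change
ghg_forcing = 3.5     # W/m^2, greenhouse gas forcing
albedo_forcing = 3.0  # W/m^2, ice-sheet albedo contribution

# Treating ice-sheet albedo as a second *forcing*:
sens_both = delta_t / (ghg_forcing + albedo_forcing)  # ~0.77 degC per W/m^2

# Treating ice-sheet albedo as a slow *feedback*, so only GHG counts:
sens_ghg_only = delta_t / ghg_forcing                 # ~1.43 degC per W/m^2

print(f"Albedo as forcing:  {sens_both:.2f} degC per W/m^2 (Charney-like)")
print(f"Albedo as feedback: {sens_ghg_only:.2f} degC per W/m^2 (Earth System)")
```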
Daniel C. Goodwin says
from Gavin’s article:
“… it is clear that from the Pliocene to the Quaternary … the climate has become more sensitive to orbital forcing. It is therefore conceivable (but not proven) that any sensitivity derived from paleo-climate will not (in the end) apply to the future.”
This statement seems to permit opposite interpretations:
1) The earth has gradually become more sensitive to orbital forcings. This could indicate a greater overall sensitivity, and thus a tendency for paleoclimate-based projections to underestimate the current sensitivity.
2) Because the earth is more sensitive to orbital forcings, and because orbital forcings are headed in the cooler direction, paleoclimate-based projections could overestimate the sum of forcings we’re likely to experience.
Are either or both of these readings in error?
Chris Dudley says
I think the precautionary level is 280 ppm. What this preprint (and it is a preprint, not a paper) is arguing is that the sensitivity is larger than has been assumed up until now, and so a “safe” temperature (a la Exeter) corresponds to a lower concentration (350 rather than 450). It takes this seriously enough that a scenario is developed to show how 350 ppm might be achieved (fig. 6). So the preprint takes the risk quite seriously. We don’t know what it will look like once it is a paper. Hansen has a habit of being right, but there may be some flaw in the analysis that a referee catches.
To some extent, the argument has already been published in Philosophical Transactions. There is more detail here though.
Gavin (also #38),
I think that we are working up to meeting targets. Some countries will meet Kyoto compliance, though if they do this through biofuels, this will be a mirage. So there is some progress. Your point that all we need to know for sure right now is the direction (emissions cuts) makes some sense, but the manner in which we cut emissions might be influenced by the present work. Greater efficiency cuts emissions, but it does not leave energy to do sequestration, while shifting energy sources might make a difference. Solar power is expected to be cheaper than coal in less than a decade, but biochar may never be, as an energy source. Yet we might plan to absorb the opportunity cost and promote biochar production in order to make progress on sequestration, even though it might slow emissions cuts a little. Right now, that would mean supporting research into biochar, its effectiveness in sequestration and its utility as an energy source.
Sorry, I had thought you were talking about the Cenozoic.
I think Hansen is only fixing on CO2 because that is the major issue the paper addresses. There are statements throughout the paper about other, mainly albedo-based, changes, and there’s a reference to methane with regard to the PETM (page 25).
#44 Daniel Goodwin
The variation in climate sensitivity could in theory be higher or lower. From what I read (and I am only an amateur) orbital forcings are too slow to be a factor in sub-centennial processes. Orbital factors can initiate, but the insolation rate of change due to orbital factors would not drive the detailed processes at a centennial scale (Dansgaard Oeschger etc).
As an aside:
Hansen’s earlier paper “Climate change and trace gases” refers to the spring as a key factor in melt (page 8, under figure 3): http://pubs.giss.nasa.gov/abstracts/2007/Hansen_etal_2.html It then proceeds to outline the claim that ice-sheet melt is actually a rapid process.
In Gerard Roe’s “In Defence of Milankovitch”, http://earthweb.ess.washington.edu/roe/Publications/MilanDefense_GRL.pdf he shows that the rate of change of global ice volume tracks beautifully with 65degN insolation.
Hansen’s team have been building up a case that ice-sheet loss is in fact a relatively rapid process so is relevant to policy timescales. I think they have a strong point on this, but am still not convinced about this 6degC sensitivity. I’ll butt out and see what others have to say.
Ike Solem says
Blair, I would guess that the last glacial maximum, LGM, some 19-30 kyr ago, was chosen instead of the Pliocene, 3 million years ago, because there is a lot more data on conditions during the LGM (no ice core measurements from the Pliocene, for example).
The whole notion of the surface temperature response to 2X CO2 climate sensitivity as the main publicly discussed benchmark seems misleading, however. It was originally just a climate model benchmark, wasn’t it? Mostly for the convenience of modelers when comparing their different models.
An equally important variable is the ocean temperature response to increased CO2, which has suffered from large uncertainties. According to Levitus et al (GRL 2004):
Compare this to a more dated estimate (Science 2000):
That is a large change in estimates. A number like 0.06C might seem tiny, but it represents far more energy storage in the oceans than a 0.6C increase in the surface temperature field. So, what’s the current best estimate of the ocean temperature sensitivity? What’s the best estimate of the amount of heat the oceans have absorbed?
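The relative sizes follow from heat capacity, Q = m·cp·ΔT. A rough check (the masses and specific heats below are standard round numbers I am assuming, not figures from Levitus):

```python
# Heat content change Q = m * c_p * dT
ocean_mass = 1.4e21  # kg, total ocean mass
atmos_mass = 5.1e18  # kg, total atmospheric mass
cp_water = 3990.0    # J/(kg K), seawater
cp_air = 1004.0      # J/(kg K), air at constant pressure

q_ocean = ocean_mass * cp_water * 0.06  # 0.06 degC whole-ocean warming
q_atmos = atmos_mass * cp_air * 0.6     # 0.6 degC whole-atmosphere warming

print(f"Ocean  (0.06 degC): {q_ocean:.2e} J")
print(f"Atmos. (0.6 degC):  {q_atmos:.2e} J")
print(f"Ratio: ~{q_ocean / q_atmos:.0f}x")
```

So a seemingly tiny whole-ocean warming stores on the order of a hundred times more energy than a much larger warming of the atmosphere.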
Based on the evidence in polar regions, where warming rates and ice melt rates have outstripped model predictions, (not much net albedo change yet, is there?), it seems that polewards heat transport by oceans (and atmosphere) might be accelerating the rate of response (not the equilibrium response, but the rate of approach to equilibrium).
This from Levitus 2001:
So, if that’s correct, the ocean is what we want to watch. If we had launched the Triana/DSCOVR climate satellite ten years ago, instead of mothballing it, we’d probably have robust answers to the energy budget question, and we could get the ocean heat change by calculating the (total energy change)-(atmospheric warming).
Regarding the “monsters in the box”, what that points to is the “labile carbon store” which is currently in soils, sediments, and permafrost: the carbon in the freezer. Warm up the freezer, and all of a sudden microbes can start converting it all to CO2, H2S, etc. Stinky. Obviously the biosphere will play a huge role in that – but we shouldn’t expect the biosphere to absorb that excess carbon. It seems that during stable climate periods the biosphere operates mostly in steady state, pushing 100 gigatons of carbon through the system each year via photosynthesis and respiration, but in balance. That’s the big looming issue.
Lynn Vincentnathan says
RE #36 & “Our study found little *long term* impact, because the ice largely regrew each winter. That included albedo effects.”
I’ve sort of always thought (reducing) albedo might not be much of a long term issue, esp once all the ice/snow is melted.
However, it seems to me that ice forming in winter in the Arctic wouldn’t have a tremendous +albedo (cooling) effect either, since there’s not as much sun up there in the winter as in the summer. OTOH, I think that the sun’s energy gets used up a lot in the melting process of that ice each summer, rather than in warming the planet, so if there were no or little ice formation in the winter and not much to melt in the summer, that would have an impact (like a jump up in temp). Not a continual one, once equilibrium is reached.
I’d think it’s the lesser or greater ice cover during those long summer days that would show the greater albedo effect.
Thomas Lee Elifritz says
Albedo is the one thing we will have great control over in the future, with 10 billion people all wanting a modern and comfortable lifestyle. That’s going to take a lot of solar panels and other two-dimensional surfaces; in fact, combining agriculture, land use and biodiversity preservation, one can assume it is going to take ALL of the available surfaces. Here is where materials science and condensed matter physics again come into play – these surfaces can be exceedingly thin, and they can be engineered to be reflective, transparent, transmissive or emissive, with electronically controlled and very fast response times.
Work that into your global models. We need a very large area for energy anyways, and if you add up the dead areas we have created already, there’s plenty of room here.
D Price says
Re #49, solar power would not take all that much space. A concentrated solar system in a desert area 250 km by 250 km would supply all the World’s current electricity demand.
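That claim is easy to sanity-check (the insolation, efficiency, and demand figures below are my own round-number assumptions, not D Price’s):

```python
side_m = 250e3          # 250 km
area = side_m ** 2      # m^2, the proposed desert footprint
avg_insolation = 250.0  # W/m^2, good desert site averaged over 24h (assumed)
efficiency = 0.15       # end-to-end conversion efficiency (assumed)

avg_power_w = area * avg_insolation * efficiency
print(f"Average output: {avg_power_w / 1e12:.1f} TW")  # ~2.3 TW
```

World average electricity consumption is roughly 2 TW, so the claim is at least the right order of magnitude.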