Guest Commentary from Patrick Brown and Wenhong Li, Duke University
We recently published a study in Scientific Reports titled Comparing the model-simulated global warming signal to observations using empirical estimates of unforced noise. Our study seemed to generate a lot of interest and we have received many inquiries regarding its findings. We were pleased with some of the coverage of our study (e.g., here) but we were disappointed that some outlets published particularly misleading articles (e.g., here, here, and here). Since there appears to be some confusion regarding our study’s findings, we would like to clarify some points (see also MM4A’s discussion).
Our study is mainly about unforced (i.e., internally generated) variability in global mean surface temperature. This is the type of variability that comes from natural interactions between the ocean and the atmosphere (i.e., that due to phenomena like the El-Niño/Southern Oscillation or perhaps the Atlantic Multidecadal Oscillation). Most previous estimates of the magnitude of unforced variability have come from physical climate models, because data directly from the real world is always a mix of internal variability and (potentially) externally driven change. In our study, we created an alternative statistical estimate of unforced variability that was derived from reconstructed and instrumental surface temperature records corrected for externally forced changes. We then used this new estimate of unforced variability to aid in our interpretation of observed global mean temperature variability since 1900.
We found that unforced variability is large enough so that it could have accounted for multidecadal changes in the rate-of-increase of global average surface temperature over the 20th century. However, our estimate of unforced variability was NOT large enough to account for the total warming observed over the 20th century. Therefore, our results confirm that positive radiative forcings (e.g., from human-caused increases in greenhouse gas concentrations) are necessary in order for the Earth to have warmed as much as it did over the 20th century.
We also found that over the most recent decade or so, it is unlikely that the underlying global warming signal (the long-term externally forced component of temperature change) has been increasing at a rate characteristic of the worst-case IPCC emissions scenario (more on this below).
This last finding is what generated most of the attention for our article but it would appear that the finding has been largely misinterpreted. Part of the confusion stems from the Duke University press release which used the headline “Global Warming More Moderate Than Worst-Case Models”. The news department created this headline as a replacement for our suggested headline of “Global Warming Progressing at Moderate Rate, Empirical Data Suggest”. The news department wanted a shorter headline that was easier for the public to understand. Unfortunately, the simplification led many to believe that our study made a forecast for moderate global warming in the future, when in fact our conclusion only applied to the recent past.
Below are some clarifications on specific questions that we have received:
Question: What does your study conclude about Climate Sensitivity (e.g., how much warming we expect for a given change in greenhouse gases)?
Answer: Nothing. Our study was not concerned with assessing climate sensitivity and we have no particular reason to doubt the assessments of climate sensitivity from the IPCC (see this post and previous discussions).
Question: Does your study conclude that all climate models used by the IPCC are useless?
Answer: No. Results from our previous study indicated that the magnitude of unforced variability simulated by climate models may be underestimated on decadal and longer timescales and our new empirical estimate of unforced variability largely supports this conclusion. However, our new estimate is not radically different from that simulated by climate models and for the most part we find that climate models seem to get the ‘big picture’ correct.
Question: Does your study indicate the warming from the 1970s to the present may be natural rather than human caused?
Answer: Our study is not explicitly an attribution study and we do not attempt to quantify the anthropogenic contribution to warming over the past ~40 years. However, we were interested in how much influence unforced variability might have had on changes in the rate of warming over the instrumental period. Specifically, we wanted to know if the real climate system is more like panel-a or panel-b below. In panel-a, the magnitude of unforced variability is small (represented by the narrow range between the blue lines), thus changes in the multidecadal rate of warming would necessarily be due to corresponding changes in the externally forced component of warming. In panel-b the magnitude of unforced variability is large (wide range between the blue lines) and thus changes in the multidecadal rate of warming could come about due to unforced variability.
The results of our study indicate that multidecadal changes in the rate of global warming could indeed be attributed to unforced variability and thus the climate system may be more like panel-b than panel-a. This means that the accelerated warming over the last quarter of the 20th century would not ipso facto require an acceleration in the forced component of warming. Instead, this accelerated warming could have come about due to a combination of anthropogenic forcing and unforced variability. This interpretation of the temperature record is consistent with some previous studies like DelSole et al. (2011), Tung and Zhou (2013), Wu et al. (2011), Fu et al. (2011) and Kravtsov et al. (2014). To reiterate, our results indicate that the cumulative warming since the beginning of the 20th century was not possible without positive radiative forcing.
[Editorial Note: On a similar topic, a recent post by Mike discussed problems that arise if one doesn’t properly account for the forced component when estimating internal multidecadal variability, and suggested that it has a rather modest amplitude (~0.1ºC peak). He concludes that internal variability has likely partially offset large-scale anthropogenic warming in recent decades, in line with an independent recent study by Dai et al. (2015). Schurer et al. (2013) also used the CMIP5 forced last millennium simulations to conclude that internal variability on the multidecadal (50 year) timescale was unlikely to have explained any more than about 25% of the observed temperature trend from 1960.]
Question: Does your study rule out the rate of warming associated with the IPCC’s RCP8.5 emissions scenario?
Answer: No. We used the multi-model mean warming associated with the RCP 8.5 emissions scenario (out to 2050) as a representation of the quickest rate of forced warming that could conceivably be occurring currently. We then asked the question “how likely is it that the forced component of global warming has been this steep, given recent temperature trends?” We found that it was not very likely to observe an 11-year warming hiatus (2002-2013 in GISTEMP) if the underlying forced warming signal was progressing at a rate characteristic of the RCP8.5 scenario. Since the mean radiative forcing progression in RCP8.5 is likely steeper than the radiative forcing progression of the recent past, this finding cannot be used to suggest that models are overestimating the response to forcings and it cannot be used to infer anything about future rates of warming.
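The flavor of this likelihood question can be illustrated with a toy Monte Carlo. This is a hedged sketch, not the paper's actual method: the 0.03 C/yr forced rate approximates the RCP8.5 multi-model mean through 2050, but the AR(1) noise parameters below are illustrative assumptions. Simulate many 11-year stretches of forced trend plus red noise, and count how often a negative linear trend appears:

```python
import numpy as np

rng = np.random.default_rng(42)

forced_rate = 0.03      # C/yr, roughly the RCP8.5 multi-model-mean rate
n_years = 11            # length of the 2002-2013 "hiatus" window
phi, sigma = 0.6, 0.1   # assumed AR(1) noise parameters (illustrative only)

def trend(y):
    """Least-squares linear trend of an annual series (C/yr)."""
    x = np.arange(len(y))
    return np.polyfit(x, y, 1)[0]

n_trials = 20_000
negative = 0
for _ in range(n_trials):
    noise = np.zeros(n_years)
    for i in range(1, n_years):
        noise[i] = phi * noise[i - 1] + rng.normal(0, sigma)
    series = forced_rate * np.arange(n_years) + noise
    if trend(series) < 0:
        negative += 1

p_negative = negative / n_trials
print(f"P(11-yr negative trend | {forced_rate} C/yr forced rate) ~ {p_negative:.3f}")
```

The smaller this probability, the less likely it is that the underlying forced signal has recently been progressing at the assumed rate, given the observed hiatus; that is the logic of the comparison, even though the specific numbers here are not the paper's.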
We would invite all interested people to read the full paper (it is Open Access) for a more complete explanation and discussion of this complicated subject.
- P.T. Brown, W. Li, E.C. Cordero, and S.A. Mauget, "Comparing the model-simulated global warming signal to observations using empirical estimates of unforced noise", Scientific Reports, vol. 5, 2015. http://dx.doi.org/10.1038/srep09957
- P.T. Brown, W. Li, and S. Xie, "Regions of significant influence on unforced global mean surface air temperature variability in climate models", Journal of Geophysical Research: Atmospheres, vol. 120, pp. 480-494, 2015. http://dx.doi.org/10.1002/2014JD022576
- T. DelSole, M.K. Tippett, and J. Shukla, "A Significant Component of Unforced Multidecadal Variability in the Recent Acceleration of Global Warming", Journal of Climate, vol. 24, pp. 909-926, 2011. http://dx.doi.org/10.1175/2010JCLI3659.1
- K. Tung, and J. Zhou, "Using data to attribute episodes of warming and cooling in instrumental records", Proceedings of the National Academy of Sciences, vol. 110, pp. 2058-2063, 2013. http://dx.doi.org/10.1073/pnas.1212471110
- Z. Wu, N.E. Huang, J.M. Wallace, B.V. Smoliak, and X. Chen, "On the time-varying trend in global-mean surface temperature", Climate Dynamics, vol. 37, 2011. http://dx.doi.org/10.1007/s00382-011-1128-8
- C. Fu, C. Qian, and Z. Wu, "Projection of global mean surface air temperature changes in next 40 years: Uncertainties of climate models and an alternative approach", Science China Earth Sciences, vol. 54, pp. 1400-1406, 2011. http://dx.doi.org/10.1007/s11430-011-4235-9
- S. Kravtsov, M.G. Wyatt, J.A. Curry, and A.A. Tsonis, "Two contrasting views of multidecadal climate variability in the twentieth century", Geophysical Research Letters, vol. 41, pp. 6881-6888, 2014. http://dx.doi.org/10.1002/2014GL061416
36 Responses to "Global warming and unforced variability: Clarifications on recent Duke study"
Elmar Veerman says
My compliments to your News Department! “Global Warming Progressing at Moderate Rate, Empirical Data Suggest” would have been a highly misleading headline, so I’m glad they changed it. There is nothing ‘moderate’ about the current rate of global warming if you compare it to the natural situation, or typical rates in the past thousands of years. Which is the comparison most readers would implicitly make, I think. And so your headline would probably lead them to think: ah, so there’s nothing to worry about.
Patrick Brown says
Thanks for the comment. That is a reasonable point. The word ‘moderate’ was used relative to the steepest rates of warming projected by climate models. It was not necessarily intended to mean ‘moderate’ compared to past climate changes. Also, we were specifically looking at temperatures over the ‘hiatus’ period where the meme has been “no global warming”. Our results showed that despite the hiatus, the underlying global warming signal could very well be progressing at a pace characteristic of a middle-of-the-road (or ‘moderate’) emissions scenario. There is only so much that can be communicated in a short headline…
Thanks for a very interesting effort to characterize the range of natural variability, and for blogging here and elsewhere to clear up misconceptions.
Having read your paper shortly after it appeared, and made modest attempts to address online misunderstandings of its implications, I do still find the RCP 8.5 comparison confusing, though, perhaps inherently so. At first glance, I actually thought, “but they can’t rule out RCP 8.5 as a future pathway”… and quickly realized that, of course, that wasn’t what you were doing at all.
But it also seems a bit confusing as a way of characterizing the recent past forcing, because the relatively low probability of the recent slowdown occurring at any point over the 1993–2050 period under RCP 8.5 seems chiefly due to developments that have not yet taken place. As your figure 2 makes clear, the forced signal from the RCPs doesn’t diverge much up to the present day, but in RCP 8.5 it accelerates noticeably towards 2050.
Guido van der Werf says
We published last year a related study building on the work of Forster, Chylek, Zhou, Tung, etc. and coming to similar conclusions as this study by Brown and colleagues. It highlights uncertainties related to the AMO characterisation and points towards a transient climate response of 1.6 (1.0-3.3).
The thing I do not grasp is how realistic a strong role of the AMO in moderating global temperatures is (via teleconnections?), curious to hear the opinion of the climate modellers here.
Edward Greisch says
Thank you for this excellent article. There are 3 types of misinterpretations: Intentionally hostile, intentional comedians and those who truly don’t understand. You are doing well to have gotten the message across to as many people as you have. What to do about the rest takes more psychology than I can muster.
To explain the “18 year gap,” we may have to wait until it gets hotter. Very few people have the mathematical talent and education required to understand even simple things that require math. The questions are not unexpected, but they are not intentionally hostile either. You are doing very well at being patient enough to answer them.
MA Rodger says
My misgivings with the work of KK Tung appear also with the paper reviewed here.
Tung is wedded to some cyclic variability, defending his attribution of the mid-20th century wobble to the AMO by seeking similar wobbles within proxy data & sparse instrument records of previous times. This has led me to ask what scale of phenomenon would be required to give global temperatures a 0.2ºC wobble that lasts 60 years or more peak-to-peak. I cannot see there is a promising answer to such a question as the phenomenon would have to be unduly large and thus evident within our modern climate data, unless ECS is high.
While there is no such cyclic variability being proposed here, the conclusion (that “the climate system may be more like panel-b than panel-a”) steps into what I consider difficult territory. The paper’s Figure 2a portrays global temperature depressed by natural variation and Figure 2b shows it elevated for 20-odd years. My contention is that, certainly in the case of Figure 2b, a natural variation of such a size should be noticeable within the various climatic data available for recent decades but when I look it isn’t evident.
The misrepresentation of Brown & Li’s work by Rush Limbaugh comes as no surprise–
even Roy Spencer has made similar complaints, and Roy was Rush’s erstwhile Science Adviser!
Thanks for this article. I admit to being confused by the comparison of past observations with RCPs, when the RCPs haven’t started to deviate from one another yet.
I probably misunderstood your paper (and should have asked you directly at the time), but I was puzzled about why you compared actuals from 1993 to 2013 with projections from 1993 to 2050, and then drew conclusions about 2002 to 2013.
Unless I misunderstood the paper. Could you clarify?
(I’ll add a link to the article I wrote at the time.)
Patrick Brown says
I understand your confusion – we could have been more explicit about the meaning of the calculation.
The idea was to attempt to quantify how likely various rates of forced warming might be, given recent observed trends. We used multi-model mean rates of warming associated with the different emissions scenarios to represent 3 different forced rates of warming. So for example, the multi-model mean rate of warming for the historical simulation + RCP 8.5 from 1993-2050 was ~0.03 C/yr. We then quantified how likely it might be that the forced rate of warming had been 0.03 C/yr, given that we had just seen an 11-year negative linear trend. Therefore, this comparison gives us information on the likelihood of the rate of forced warming (0.03 C/yr) over the recent past but it does not give us information on the likelihood of the RCP 8.5 progression in the future. We simply labeled the 0.03 C/yr forced rate of warming “RCP 8.5” since 0.03 C/yr approximates the characteristic rate of warming for RCP 8.5 (at least through 2050). Instead of labeling it “RCP 8.5” we could have labeled it “0.03 C/yr forced warming” or something similar.
Patrick, thanks for the reply. Yes, the different labeling would have made things clearer for me.
The other thing I noticed was that your trend calculations all used 2013 as an end point. To what extent does the choice of 2013 as an end point to your calculations affect your results? For example, would 2014, or if 2015 ends up being still hotter, make a large difference overall? (Or 2010 or 2005.)
Rob Ellison says
There are three possible modes of climate change. The traditional view of a gradual change with changes in climate forcing. Periodic oscillation around an equilibrium climate state. Abrupt climate shifts to a new mean and variance.
The third is the reality and temperature is more or less the result of chaotic responses in the climate system. The ubiquitous multi-decadal ‘oscillations’ are – in this view – chaotic climate shifts in a dynamically complex system at 20 to 30 year intervals with shifts in the state of the Pacific Ocean and in surface temperature. The current cool Pacific Ocean state seems more likely than not to persist for 20 to 30 years from 2002. The flip side is that – beyond the next decade or so – the evolution of the global mean surface temperature may hold surprises on both the warm and cold ends of the spectrum.
‘The global climate system is composed of a number of subsystems — atmosphere, biosphere, cryosphere, hydrosphere and lithosphere — each of which has distinct characteristic times, from days and weeks to centuries and millennia. Each subsystem, moreover, has its own internal variability, all other things being constant, over a fairly broad range of time scales. These ranges overlap between one subsystem and another. The interactions between the subsystems thus give rise to climate variability on all time scales.’ Michael Ghil
Complexity science suggests that small changes in control variables – including greenhouse gases – push the climate system past a threshold at which stage the balance between the components of the system shifts abruptly into a new balance. It is in the jargon known as unforced internal variability. The US National Academy of Sciences (NAS) defined abrupt climate change as a new climate paradigm as long ago as 2002. A paradigm in the scientific sense is a theory that explains observations. A new science paradigm is one that better explains data – in this case climate data – than the old theory. Like the doyens of climate science who wrote Abrupt Climate Change: Inevitable Surprises – we can expect surprises that may in theory be faster and more extreme – either cold or warm – than a steady rise in global temperatures. It does suggest that mitigation of greenhouse gases – and of land use changes and aerosol emissions – is prudent in an inherently unstable system. One that has seen many major shifts in temperature and hydrology over the past 2.58 million years. But it does make climate an intractable problem over the longer term. As James McWilliams wrote ‘sensitive dependence and structural instability are humbling twin properties for chaotic dynamical systems, indicating limits about which kinds of questions are theoretically answerable.’
On the other hand – proponents of modes 1 and 2 – from either side of the climate lines – know no humility. There is little in the way of complexity and nothing of uncertainty.
But if you do get past mode 2 – and it took forever to get from mode 1 to mode 2 – due warning there are a few questions that might come up. How much of 20th century warming was natural and how much of that might we lose in the 21st? How might that be related to a 20th century millennial high in El Nino frequency and intensity? How much can climate shift – warm or cool – in as little as a decade?
Patrick Brown says
2013 wasn’t really ‘chosen’ as an endpoint. 2013 was the most recent annual datapoint when the study was submitted so it was the natural endpoint to use. The results of the study applied to past temperature progression so future data updates will not change the conclusions about the past. For example, no matter how warm 2015 ends up being, there was still an 11-year negative linear trend from 2002-2013 and we can still quantify how likely various rates of forced warming were, given that negative linear trend occurred. Having said that, I would like to keep this analysis up-to-date and I plan on re-running the calculation for 2014 (and 2015, 2016 etc. when the data comes in) and publishing the results on my website.
Ellison, ENSO contributes to the variability in global temperature but since ENSO reverts to a mean of zero, it does not have any impact on the long-term temperature trend.
This is not that hard to understand, and I am not sure what your issue is.
Barton Paul Levenson says
An 11-year sample won’t tell you anything of significance. You want at least 30 years for that.
It was a nice paper Patrick, and well written (after digesting the alphabet soup!). Figure 3 should be required viewing for “pausers”…
A challenge with the approach must be heteroscedasticity, which you don’t appear to mention(?) — is past unforced variation a guide to future unforced variation in a greatly perturbed climate system? Or even, is past unforced variation a good guide to recent unforced variation? Probably it is, but I doubt we’re sure, which may be what Rob Ellison is trying to say.
Also, just looking at panels a) and b), one can lose sight of the scale. Warming to date is pretty minor given what we face. It does happen to fit a simplistic model with a constant increase in the rate of change (a 3rd order polynomial), which also happens to neatly extrapolate to the mid points of the RCP6.0 projections. Higher order acceleration is required to get to RCP8.5, so it would be surprising if your approach found recent warming to be consistent with an “RCP8.5 rate through 2050”:
Natural variability: what is it, where does it come from? Why is it different in the Atlantic from the Pacific?
Another look at the North Atlantic’s SST trend lines could open a path to resolution of the enigma.
Ideas toward further understanding start HERE
Rob Ellison says
‘The fact that around 1910, 1940, and in the late 1970s climate shifted to a completely new state indicates that synchronization followed by an increase in coupling between the modes leads to the destruction of the synchronous state and the emergence of a new state.’ http://onlinelibrary.wiley.com/wol1/doi/10.1029/2007GL030288/full
The latest climate shift was in 1998/2001 – and although changes in the intensity and frequency of ENSO are implicated in changes in global surface temperature it is a mistake to ignore resonant changes in the global climate system. It involves shifts in the balance of atmosphere, biosphere, cryosphere, hydrosphere and lithosphere – physical mechanisms of unforced variability in a complex dynamical system. The mechanism of unforced variability is dynamical complexity.
ENSO frequency and intensity varies over decades to millennia. The 1000 year peak in El Nino frequency last century – https://watertechbyrie.files.wordpress.com/2014/06/vance2012-antartica-law-dome-ice-core-salt-content.png – http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-12-00003.1?journalCode=clim
Over the Holocene – Moy et al (2002) present the record of sedimentation shown below which is strongly influenced by ENSO variability. It is based on the presence of greater and less red sediment in a lake core. More sedimentation is associated with El Niño. It has continuous high resolution coverage over 11,000 years. It shows periods of high and low ENSO activity alternating with a period of about 2,000 years. There was a shift from La Niña dominance to El Niño dominance that was identified by Tsonis 2009 as a chaotic bifurcation – and is associated with the drying of the Sahel. There is a period around 3,500 years ago of high ENSO activity associated with the demise of the Minoan civilisation (Tsonis et al, 2010). Red intensity was in excess of 200. For comparison – red intensity in 1997/98 was 99. It shows ENSO variability considerably in excess of that seen in the modern period.
Unforced variability over the much longer term than the instrumental record is the best guide to the potential for abrupt shift in climate. This is not remotely +/- 0.1 degrees C. It is 16 degrees C in places in as little as a decade. – http://www.nap.edu/openbook.php?record_id=10136&page=1 –
Complexity theory suggests that the system is pushed by greenhouse gas changes and warming – as well as solar intensity and Earth orbital eccentricities – past a threshold at which stage the components start to interact chaotically in multiple and changing negative and positive feedbacks – as tremendous energies cascade through powerful subsystems. Some of these changes have a regularity within broad limits and the planet responds with a broad regularity in changes of ice, cloud, Atlantic thermohaline circulation and ocean and atmospheric circulation.
Given these recurrent dynamics, the current cool Pacific Ocean state seems more likely than not to persist for 20 to 30 years from 2002. The flip side is that – beyond the next few decades – the evolution of the global mean surface temperature may hold surprises on both the warm and cold ends of the spectrum (Swanson and Tsonis, 2009).
The criticism of the paper is that it takes a short view and is based on mode 2 – see comment 11. The reality of unforced variability is mode 3 – dynamical complexity – and exploring the implications of this is the future of climate science.
Rob Ellison says
‘Or even, is past unforced variation a good guide to recent unforced variation? Probably it is, but I doubt we’re sure, which may be what Rob Ellison is trying to say.’
Past unforced variability is well outside the +/- 0.1 degree C. The scope of future climate shift – timing and extent – is not remotely predictable. If we assume the system is ergodic then it may stay within the limits of the Quaternary.
Well, the cool phase of the Pacific obviously did not last 20 to 30 years. Ellison, your heroes married the wrong ocean cycle:
Ellison, There is no such thing as “complexity theory” apart from the sense of algorithmic complexity in the computational sciences. ENSO does not arise because of any notions of “complexity theory”.
You continue to add FUD to the discussion by making all these pretentious statements that when it comes down to it are simply laughable.
Rob Ellison says
‘This study uses proxy climate records derived from paleoclimate data to investigate the long-term behaviour of the Pacific Decadal Oscillation (PDO) and the El Niño Southern Oscillation (ENSO). During the past 400 years, climate shifts associated with changes in the PDO are shown to have occurred with a similar frequency to those documented in the 20th Century. Importantly, phase changes in the PDO have a propensity to coincide with changes in the relative frequency of ENSO events, where the positive phase of the PDO is associated with an enhanced frequency of El Niño events, while the negative phase is shown to be more favourable for the development of La Niña events.’ http://onlinelibrary.wiley.com/wol1/doi/10.1029/2005GL025052/full
Blue is dominant. I’m not sure that putting a trend line to the PDO means anything at all. We are looking at a tendency to increased upwelling in the north eastern Pacific this century. There are 20 to 30 year regimes in the data and abrupt changes between.
Similarly with ENSO. Moderate and less frequent El Nino this century.
JCH shows the PDO trending down since 1980. The reality is that it was positive to 1998 and shifted negative in 1998/2001. He then shows a recent trend line up. There is no suggestion that there are not periods of warm surface water in cool PDO and vice versa. Just a tendency for enhanced upwelling in cool modes.
It is certainly too early too tell whether the PDO has shifted again. What is important – however – is the underlying dynamical mechanism of abrupt climate change in the globally resonant system.
JCH should take a long view based on mode 3 (comment 11) thinking.
@ 19, 20 & 21
Pacific ocean floor is a thin membrane oscillating under influence of the magma pressure points (tectonics !) with its multidecadal oscillations directly reflected in the SOI (southern oscillation index).
Hank Roberts says
This stuff isn’t news. It’s interesting as science.
Hank Roberts says
Do these words mean anything? I’d ask for a cite, but I’ve watched this dance play out on many blogs over several years. If you’d got something publishable, somewhere, written up, why would you keep flogblogging the basic message, which appears to be we don’t know, we can’t know, and we shouldn’t spend any money to stop burning fossil carbon. I think that sums up the message, eh?
chris korda says
Given that the RCPs have barely begun to diverge (as this graph clearly shows), would someone please explain the significance of the finding? Isn’t it a bit premature to judge “that recently observed GMT values, as well as trends, are near the lower bounds of the EUN for a forced signal corresponding to the RCP 8.5 emissions scenario”? We’ll be in a much better position to judge that in 2030. Of course by then we’ll be toast regardless of whether it’s 8.5 or 6.0. Guess we should act as if we’re on RCP 8.5 just in case! I ask because several commenters here have opined in the fairly recent past that RCP 8.5 is reality and the rest mere wishful thinking.
Why do you say confusion? From people and websites like that it is not confusion, it is outright lies. If it’s a fear of legal issues, well I suppose OK, but not calling them out just helps their case IMHO.
Consider a planet with an atmosphere and an ice cap over a liquid ocean. Suppose natural variability causes the ice cap to vary in thickness from ~20 m to 2 m on a time frame of 100k years. This causes only a small signal in the air temperature because the air is always in contact with ice.
Suppose then, changes in the atmosphere cause warming, such that a meter of ice melts on a time frame of 50 years.
If this occurs when the ice is thick, heat from the air is transferred to the ice, and there is very little signal of warming in the air as the ice melts over an extended period that depends on the exact phase of the natural ice thickening/thinning process.
On the other hand, if the atmospheric changes occur when the natural cycle causes the ice to be thin, then there will be dramatic changes in the temperature of the air as it is suddenly exposed to liquid water.
Thus, at one part of the cycle, a given amount of additional warming is trivial, and at another point in the cycle, it causes catastrophic loss of sea ice – all because of natural variability in the system that may not be detectable from an air temperature signal.
If we intend to be around for a while, we need to understand natural variation with absolute certainty. Natural variation can either hide AGW, or magnify AGW’s effects. Just because we see a warming signal from natural variation, does not mean that we can ignore a signal from AGW – what counts is always the total signal.
MA Rodger says
I feel your comment dances round my own take on the paper under discussion so forgive me my ten penn’orth.
It is entirely correct that we need to understand natural variation. That was the reason for the work that led to the hockey stick and which also identified the AMO.
The likes of the AMO or PDO lead people to attribute certain wobbles in global surface temperature to such phenomena. Yet if the likes of the AMO or PDO did cause a 50-year or so wobble in surface temperature of a couple of tenths of a degree C peak to peak, this temperature imbalance is a deviation from the equilibrium global temperature and has to be sustained in some manner. Unless there is some physical process at work, a hotter global surface will be leaking energy up into space. And unless there is another physical process at work, a hotter global surface will be leaking energy down into the oceans.
Over the hot part of a long 50-odd year cycle, that is a lot of energy. If ECS is small, something many advocates of such cycles tell us is the implications of such cycles, that is a massive amount of energy.
My question to those advocating such cycles is “Where is this energy? It has to be evident within today’s climatic data.”
The post here doesn’t venture to advocate a Big Natural Oscillation, but in its “more like panel-b than panel-a” it does suggest that over the recent few decades global temperatures have been pushed high above equilibrium by natural variation. Even with ECS = 3.0ºC we would require some 60 ZJ to maintain such a two-decade imbalance without some physical process in place, and that is potentially 60 ZJ up and 60 ZJ down. That is a lot of energy to go missing.
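The back-of-envelope arithmetic behind a figure of that order can be checked directly. The sketch below uses my own illustrative numbers (a 0.2 C excursion, ECS of 3.0 C per doubling against the canonical 3.7 W/m² forcing); it is not taken from the comment or the paper:

```python
# Back-of-envelope check of the energy implied by a sustained unforced
# surface-temperature excursion. All numbers are illustrative assumptions.
SECONDS_PER_YEAR = 3.156e7
EARTH_AREA_M2 = 5.1e14      # total surface area of the Earth
ECS_C = 3.0                 # assumed equilibrium climate sensitivity (C per CO2 doubling)
F_2XCO2 = 3.7               # W/m^2 radiative forcing for doubled CO2

feedback = F_2XCO2 / ECS_C  # restoring flux per degree of imbalance, W/m^2 per C
dT = 0.2                    # assumed unforced excursion above equilibrium, C
years = 20.0

flux = feedback * dT        # W/m^2 leaked to space while the excursion persists
energy_zj = flux * EARTH_AREA_M2 * years * SECONDS_PER_YEAR / 1e21
print(f"~{energy_zj:.0f} ZJ")   # same order as the ~60 ZJ quoted above
```

With these assumptions the excursion leaks roughly 80 ZJ over two decades, which is indeed a lot of energy to go unaccounted for in the observational record.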
Global warming is real. We need to do something about it individually and collectively. Thanks for the incisive article.
Patrick Brown says
@ 14 Barton Paul Levenson,
I agree that the longer the sample, the more you can infer about the underlying global warming signal – this is the point of our Figures 3a, 3b and 3c. However, 11 years doesn’t tell you nothing: it can be used to make statistical inferences about the likelihood of various rates of forced warming – this is the point of our Figures 3d and 3e.
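The intuition that longer windows pin down the forced trend more tightly, while short windows remain informative, can be illustrated with a small Monte Carlo. The trend and AR(1) noise parameters below are my own illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
TRUE_TREND = 0.02          # assumed forced warming rate, C/yr (illustrative)
PHI, SIGMA = 0.6, 0.1      # assumed AR(1) unforced-noise parameters (illustrative)

def trend_spread(n_years, n_sims=2000):
    """1-sigma spread of OLS trend estimates computed from n_years of
    a fixed forced trend plus AR(1) unforced noise."""
    t = np.arange(n_years)
    trends = np.empty(n_sims)
    for i in range(n_sims):
        noise = np.empty(n_years)
        noise[0] = rng.normal(0, SIGMA)
        for k in range(1, n_years):
            noise[k] = PHI * noise[k - 1] + rng.normal(0, SIGMA)
        y = TRUE_TREND * t + noise
        trends[i] = np.polyfit(t, y, 1)[0]   # OLS slope estimate
    return trends.std()

for n in (11, 30, 60):
    print(f"{n:2d}-yr window: trend spread = {trend_spread(n):.4f} C/yr")
```

The spread shrinks as the window lengthens, but even the 11-year distribution rules out many candidate forced rates – it is noisy, not useless.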
@ 15 GlenFergus,
You are correct that the stationarity assumption may prove problematic. I have looked into this issue in the CMIP5 models and they do not seem to show an obvious change in the magnitude/character of unforced variability as the climate warms. For example, if you look at the magnitude of low-frequency unforced variability in each model’s preindustrial control run and compare it to that model’s unforced variability at the end of its RCP8.5 run, they seem to be consistent. Obviously the baseline climate will be quite different between these two time periods (e.g., there is less sea ice at the end of each model’s RCP8.5 run than there is in its preindustrial control run), so this result is interesting. This could be explained by offsetting effects from different parts of the climate system, and it is something that I am currently looking into.
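A comparison of that kind can be sketched as follows: detrend each series to remove the forced signal (to first order), low-pass filter the residual, and compare the remaining spread. The synthetic series below are stand-ins for model output; real input would be CMIP5 global-mean temperature, and the filter choice is my own illustrative one:

```python
import numpy as np

def lowfreq_std(series, window=10):
    """Std. dev. of the low-frequency part of an annual-mean series:
    remove a linear fit (the forced trend, to first order), then
    smooth the residual with a running mean of the given width."""
    t = np.arange(len(series))
    resid = series - np.polyval(np.polyfit(t, series, 1), t)
    smooth = np.convolve(resid, np.ones(window) / window, mode="valid")
    return smooth.std()

# Synthetic stand-ins for model output:
rng = np.random.default_rng(1)
def ar1(n, phi=0.7, sigma=0.08):
    """AR(1) noise standing in for unforced variability."""
    x = np.zeros(n)
    for k in range(1, n):
        x[k] = phi * x[k - 1] + rng.normal(0, sigma)
    return x

control = ar1(500)                          # "preindustrial control run"
warming = ar1(100) + 0.02 * np.arange(100)  # "end of RCP8.5": same noise + trend
print(lowfreq_std(control), lowfreq_std(warming))
```

Because the linear fit absorbs the imposed trend, the two numbers measure comparable low-frequency residual variability; applied to real model output, a large mismatch between them would flag non-stationarity.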
@ 25 chris korda
You are correct about the divergence of the RCP scenarios. The question that you bring up has been asked by many others and it was one of the reasons we wanted to write this blog post. Please see our answer to the last question in the main post as well as my comment #9 above.
@ 28 MA Rodger,
Positive feedbacks within the climate system can act to enhance the magnitude and duration of unforced changes in surface temperature (Brown et al. 2014), and several climate models do simulate large-magnitude, long-timescale, unforced fluctuations away from their equilibrium value (see for example Drijfhout et al., 2013 for an interesting case study). Because we do not have a full account of the climate system’s energy inventory over the 20th century, it is difficult to confidently attribute observed decadal variability to forced vs. unforced causes. The best we can usually do is infer forced vs. unforced from surface temperature patterns, but even that can be controversial. For example, the multidecadal variability in SSTs in the North Atlantic (which correlates with the multidecadal variability in global mean temperature) has been attributed both to unforced variability (e.g., Wu et al., 2011) and to forced variability (e.g., Booth et al., 2012).
Barton Paul Levenson says
PB @ 30:
MA Rodger says
Patrick Brown @30
Taking your last point first concerning controversy – I see it differently.
Attributing forced/unforced elements of climate variation is, I think, highly controversial when using patterns in the manner of Wu et al. (2011). Their finding that the unforced contribution (which they link strongly to the AMO) is less than 33% of recent global surface temperature change (the last 25 years) amounts, in my mind, to no more than curve-fitting. Yes, the wobbles within regional temperatures can be diminished by using the methods of Wu et al. to subtract their Multi-Decadal Variation signal. But I’m not convinced they have properly shown this diminution to be attributable to this MDV. And how much of their MDV is forced/unforced remains unanswered. I don’t see any work to check these uncertainties within the Wu et al. findings.
To expand on these ideas, consider the bottom left-hand panel of Wu et al. (2011) Figure 9. It shows strong positive MDV signals in the temperature record across much of the continental USA. There are indeed wobbles, but I am concerned that the MDV subtraction is invalid as a means of attribution (whatever the cause of this MDV). From the figure, the eastern coast of the US should show a strong +MDV signal, but is this true? Comparing the regional temperature wobble with the AMO (which is not far off Wu et al.’s MDV) doesn’t convince me. (See graph here.) The subtraction results in smaller but sharper wobbles that themselves need explanation to validate the attribution of the MDV. The same is true farther across the continental USA (graphic), yet Wu et al. show this land area as a significant contributor to their global MDV.
I would add that I think it is wrong to see Wu et al. (2011) as incompatible with Booth et al. (2012). The former make no judgement on the likely contributors to MDV, while the latter attribute 76% or more to forced effects, specifically concerning North Atlantic SSTs. Such a hefty forced proportion would suggest the unforced portion of the MDV of Wu et al. could be greatly different in shape as well as having less amplitude. Perhaps here lies an incompatibility between the two studies, as Wu et al. do make rather a big thing of natural variability, concluding that “we estimate that up to one third of the late twentieth century warming could have been a consequence of natural variability.”
So presenting these two studies as “the best we can usually do” must be a call to up the game. And that would lead me to the first part of your reply to me.
Roscoe Shaw says
Recent warming has been very moderate (over a ~15-year time frame), and warming in the “AGW era” (~1950-present) is not unusual. It is similar in magnitude to the 1908-1941 warming, which was most likely natural, not AGW. All recent changes are very likely much, much less than the changes from 12,000 to 8,000 years ago, when sea level likely rose ~2 m per century for many centuries and temperatures shot up.
Also, no matter how you crunch the numbers, “forecast minus observed” has been getting worse and worse this century. Another Duke study (Scafetta) finds the best fit from “model forecasts times 0.5”. In other words, the models have been forecasting twice the observed warming.
It’s important to get to the source of the model errors if we are to have more success going forward. But I wouldn’t ever let some bad press from the Daily Mail or Rush Limbaugh get me down…LOL
Kevin McKinney says
#33–OK, I know this isn’t the most recent work:
But that graphic appears to show the rise from ~12k BP to ~10k BP as around 1.5 C. The current warming from pre-Industrial may be less than that – I wouldn’t say “much less” – but it’s certainly happening at a much faster rate.
And that, to me, would indeed qualify as ‘unusual.’
Jim Eager says
Roscoe Shaw is peddling the usual denialist tripe.
The fact is once actual observed variations in volcanic, ENSO and solar forcings are taken into account, none of which are predictable by _any_ model, observed warming is well within the range of climate model ensemble projections (Foster & Rahmstorf 2011).
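The Foster & Rahmstorf approach is essentially a multiple regression: fit global temperature against a trend plus ENSO, volcanic and solar indices, and read off the adjusted trend. The sketch below runs it on synthetic data; the index responses, noise level and record length are my own illustrative numbers (the real analysis uses indices such as MEI, volcanic aerosol optical depth and TSI):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 33                        # years of record (illustrative)
t = np.arange(n, dtype=float)

# Synthetic stand-ins for the exogenous natural-variability indices:
enso = rng.normal(size=n)
volc = rng.normal(size=n)
solar = rng.normal(size=n)

# Synthetic "observed" temperature: a known trend plus assumed responses
# to each index plus residual noise.
TRUE_TREND = 0.017            # C/yr, illustrative
temp = (TRUE_TREND * t + 0.10 * enso - 0.05 * volc
        + 0.02 * solar + rng.normal(0, 0.05, n))

# Multiple regression: intercept, linear trend, and the three indices.
X = np.column_stack([np.ones(n), t, enso, volc, solar])
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
print(f"adjusted trend: {coef[1]:.3f} C/yr")   # close to the 0.017 put in
```

Regressing out the known natural influences recovers the underlying trend far more precisely than a raw fit to the noisy series would, which is why the adjusted observations sit comfortably within the model ensemble range.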
Mal Adapted says
It’s not clear that Roscoe is an AGW-denier, but many deniers think that the scientific case for AGW depends on computer models, when it is actually supported by multiple lines of evidence. Scientists use models to test their understanding of the physical processes that determine climate. Discrepancies between model output and observations, rather than being fatal to the AGW consensus, are opportunities to improve the models by incorporating physical factors that were previously overlooked. For example: Reconciling warming trends, published last year by Gavin Schmidt et al., shows that building in new understanding of “volcanic and solar inputs, representations of aerosols, and El Niño evolution” brings model results back into close agreement with observations.