

Climate Change and Extreme Summer Weather Events – The Future is still in Our Hands


Summer 2018 saw an unprecedented spate of extreme weather events, from the floods in Japan, to the record heat waves across North America, Europe and Asia, to wildfires that threatened Greece and even parts of the Arctic. The heat and drought in the western U.S. culminated in the worst California wildfire on record. "This is the face of climate change," I commented at the time.

Some of the connections with climate change here are pretty straightforward. One of the simplest relationships in all of atmospheric science tells us that the atmosphere holds exponentially more moisture as temperatures increase. Increased moisture means the potential for greater amounts of rainfall in short periods of time, i.e. worse floods. The same thermodynamic relationship, ironically, also explains why soils evaporate exponentially more moisture as ground temperatures increase, favoring more extreme drought in many regions. Summer heat waves increase in frequency and intensity with even modest overall warming (e.g. the roughly 2F observed so far) owing to the behavior of the positive "tail" of the bell curve when you shift the center of the curve even a small amount. Combine extreme heat and drought and you get more massive, faster-spreading wildfires. It's not rocket science.
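Both effects are easy to illustrate numerically. Below is a back-of-the-envelope sketch in Python (the 15C baseline, the 2-sigma heat threshold and the 0.5-sigma warm shift are illustrative assumptions, not numbers from any of the studies discussed): it computes the Clausius-Clapeyron growth in the atmosphere's moisture-holding capacity per degree of warming, and how much a small shift in the center of a temperature bell curve inflates the odds of crossing a fixed extreme-heat threshold.

```python
# Back-of-the-envelope sketch (illustrative numbers, not from the studies
# discussed above) of the two effects described in this paragraph.
import math

def saturation_vapor_pressure(t_celsius):
    """Approximate saturation vapor pressure (hPa) via the Magnus formula."""
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

# (1) Clausius-Clapeyron: the atmosphere's moisture-holding capacity grows
# roughly exponentially with temperature, about 6-7% per degree C.
ratio = saturation_vapor_pressure(16.0) / saturation_vapor_pressure(15.0)
print(f"moisture capacity increase per 1 C of warming: {100 * (ratio - 1):.1f}%")

# (2) Shifting the bell curve: probability of exceeding a fixed 2-sigma
# "extreme heat" threshold before and after a modest 0.5-sigma warm shift
# (0.5 sigma is an assumed, illustrative magnitude of warming).
def tail_prob(threshold, mean_shift=0.0):
    """P(X > threshold) for X ~ Normal(mean_shift, 1), via the error function."""
    return 0.5 * math.erfc((threshold - mean_shift) / math.sqrt(2))

p_before, p_after = tail_prob(2.0), tail_prob(2.0, mean_shift=0.5)
print(f"P(extreme heat): {p_before:.4f} -> {p_after:.4f} "
      f"(a factor of {p_after / p_before:.1f} increase)")
```

A shift of just half a standard deviation in the mean roughly triples the frequency of two-sigma heat extremes: small changes in the average produce large changes in the tail.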

But there is more to the story. Because what made these events so devastating was not just the extreme nature of the meteorological episodes but their persistence. When a low-pressure center stalls and lingers over the same location for days at a time, you get record accumulation of rainfall and unprecedented flooding. That’s what happened with Hurricane Harvey last year and Hurricane Florence this year. It is also what happened with the floods in Japan earlier this summer and the record summer rainfall we experienced this summer here in Pennsylvania. Conversely, when a high-pressure center stalls over the same location, as happened in California, Europe, Asia and even up into the European Arctic this past summer, you get record heat, drought and wildfires.

Scientists such as Jennifer Francis have linked climate change to an increase in extreme weather events, especially during the winter season when the jet stream and "polar vortex" are relatively strong and energetic. The northern hemisphere jet stream owes its existence to the steep contrast in temperature in the middle latitudes (centered around 45N) between the warm equator and the cold Arctic. Since the Arctic is warming faster than the rest of the planet due to the melting of ice and other factors that amplify polar warming, that contrast is decreasing and the jet stream is getting slower. Just like a river traveling over gently sloping territory tends to exhibit wide meanders as it snakes its way toward the ocean, so too do the eastward-migrating wiggles in the jet stream (known as Rossby waves) tend to get larger in amplitude when the temperature contrast decreases. The larger the wiggles in the jet stream, the more extreme the weather, with the peaks corresponding to high pressure at the surface and the troughs to low pressure at the surface. The slower the jet stream, the longer these extremes in weather linger in the same locations, giving us more persistent weather extremes.

Something else happens in addition during summer, when the poleward temperature contrast is especially weak. The atmosphere can behave like a "wave guide", confining the shorter-wavelength Rossby waves (those that can fit 6 to 8 full wavelengths in a complete circuit around the Northern Hemisphere) to a relatively narrow range of latitudes centered in the mid-latitudes, preventing them from radiating energy away toward lower and higher latitudes. That allows the generally weak disturbances in this wavelength range to intensify through the physical process of resonance, yielding very large peaks and troughs at the sub-continental scale, i.e. unusually extreme regional weather anomalies. The phenomenon is known as Quasi-Resonant Amplification or "QRA" (see Figure below).
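For readers who want the mechanism in equation form, the trapping condition can be sketched with the standard textbook expression for the stationary Rossby wavenumber (a simplified, barotropic stand-in for the fuller diagnostic used in the QRA literature):

```latex
% Textbook barotropic waveguide condition for stationary Rossby waves;
% a simplified stand-in for the fuller QRA diagnostic.
\[
  K_s^2 = \frac{\beta_M}{\bar{u}},
  \qquad
  \beta_M = \beta - \frac{\partial^2 \bar{u}}{\partial y^2},
\]
```

where ū(y) is the zonal-mean westerly wind, β is the planetary vorticity gradient and K_s is the stationary wavenumber. A wave of zonal wavenumber k (k ≈ 6 to 8 for QRA) is trapped wherever K_s > k between two "turning latitudes" at which K_s = k; a trapped wave cannot radiate its energy away in latitude, which sets the stage for resonance. Note the second derivative of the wind profile inside β_M: that term reappears below as the quantity climate models have the most trouble getting right.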

Many of the most damaging extreme summer weather events in recent decades have been associated with QRA, including the 2003 European heatwave, the 2010 Russian heatwave and wildfires and Pakistan floods (see below), and the 2011 Texas/Oklahoma droughts. More recent examples include the 2013 European floods, the 2015 California wildfires, the 2016 Alberta wildfires and, indeed, the unprecedented array of extreme summer weather events we witnessed this past summer.

The increase in the frequency of these events over time is seen to coincide with an index of Arctic amplification (the difference between warming in the Arctic and the rest of the Northern Hemisphere), suggestive of a connection (see Figure below).

Last year, a team of collaborators and I (including RealClimate colleague Stefan Rahmstorf) published an article in the Nature journal Scientific Reports demonstrating that the same pattern of amplified Arctic warming ("Arctic Amplification") that is slowing down the jet stream is indeed also increasing the frequency of QRA episodes. That means regional weather extremes that persist longer during summer, when the jet stream is already at its weakest. Based on an analysis of climate observations and historical climate simulations, we concluded that the "signal" of human influence on QRA has likely emerged from the "noise" of natural variability over the past decade and a half. In summer 2018, I would argue, that signal was no longer subtle. It played out in real time on our television screens and in our newspaper headlines in the form of an unprecedented hemisphere-wide pattern of extreme floods, droughts, heat waves and wildfires.

In a follow-up article just published in the AAAS journal Science Advances, we look at future projections of QRA using state-of-the-art climate model simulations. It is important to note that one cannot directly analyze QRA behavior in a climate model simulation, for technical reasons. Most climate models are run at grid resolutions of a degree in latitude or more, and the physics that characterizes QRA behavior of Rossby waves poses a stiff challenge for them because it involves the second mathematical derivative of the jet stream wind with respect to latitude. Errors increase dramatically when you calculate a numerical first derivative from gridded fields, and even more so when you calculate a second derivative. Our calculations show that the critical term mentioned above suffers from an average climate model error of more than 300% relative to observations. By contrast, the average error of the models is less than a percent when it comes to latitudinal temperature averages, and still only about 30% when it comes to the latitudinal derivative of temperature.
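The error amplification is a generic property of finite differencing, easy to demonstrate with a toy calculation (an illustration of the numerical phenomenon, not our paper's actual error analysis; the idealized jet profile and the ~1% "model error" are assumptions chosen for the demonstration):

```python
# Toy demonstration (not the paper's error analysis) of how small errors in a
# gridded field are amplified by each successive numerical derivative.
# The idealized jet profile and the ~1% "model error" are assumptions.
import numpy as np

rng = np.random.default_rng(0)

lat = np.arange(0.0, 90.5, 1.0)                   # 1-degree grid, as in many GCMs
y = np.deg2rad(lat) * 6.371e6                     # meridional distance in meters

u_true = 40.0 * np.exp(-((lat - 45.0) / 10.0) ** 2)      # idealized jet (m/s)
u_model = u_true + 0.4 * rng.standard_normal(lat.size)   # ~1% grid-scale error

def rel_rms_err(estimate, truth):
    """Relative root-mean-square error of `estimate` against `truth`."""
    return np.sqrt(np.mean((estimate - truth) ** 2) / np.mean(truth ** 2))

t, m = u_true, u_model
for label in ("u itself", "du/dy", "d2u/dy2"):
    print(f"{label:9s} relative error: {100 * rel_rms_err(m, t):7.1f}%")
    t, m = np.gradient(t, y), np.gradient(m, y)   # differentiate both for next round
```

Each derivative divides progressively smaller differences between neighboring grid values by the grid spacing, so grid-scale noise that is nearly invisible in the wind field itself comes to dominate the second derivative; the qualitative progression (a few percent, tens of percent, hundreds of percent) mirrors the error hierarchy quoted above.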

That last quantity is especially relevant because QRA events have been shown to have a well-defined signature in terms of the latitudinal variation in temperature in the lower atmosphere. Through a well-established meteorological relationship known as the thermal wind, the magnitude of the jet stream winds is in fact largely determined by the average of that quantity over the lower atmosphere. And as we have seen above, this quantity is well captured by the models (in large part because the change in temperature with latitude, and how it responds to increasing greenhouse gas concentrations, depends on physics that are well understood and well represented by the climate models).
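For reference, the thermal wind relationship invoked here can be written in its standard textbook form (pressure coordinates; the notation is generic rather than taken from our paper):

```latex
% Standard thermal wind relation in pressure coordinates (textbook form).
\[
  f\,\frac{\partial u_g}{\partial \ln p} \;=\; R\,\frac{\partial T}{\partial y},
\]
```

where u_g is the geostrophic (jet stream) wind, f is the Coriolis parameter, R is the gas constant for dry air, p is pressure and ∂T/∂y is the latitudinal temperature gradient. Integrating in ln p through the lower atmosphere shows that the jet speed aloft is controlled by the vertically averaged ∂T/∂y, which is precisely the quantity the models capture to within about 30%.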

These findings, incidentally, have broader implications. First of all, climate model-based studies used to assess the degree to which current extreme weather events can be attributed to climate change are likely underestimating the climate change influence. One model-based study, for example, suggested that climate change only doubled the likelihood of the extreme European heat wave this summer. As I commented at the time, that estimate is likely too low because it doesn't account for the role that, as we happen to know in this case, QRA played in that event. Similarly, climate models used to project future changes in extreme weather behavior likely underestimate the impact that future climate changes could have on the incidence of persistent summer weather extremes like those we witnessed this past summer.

So what does our study have to say about the future? We find that the incidence of QRA events would likely continue to increase at the same rate it has in recent decades if we continue to simply add carbon dioxide to the atmosphere. But there's a catch: the future emissions scenarios used in making climate projections must also account for factors other than greenhouse gases. Historically, for example, the use of old coal technology that predates the clean air acts produced sulphur dioxide gas, which escaped into the atmosphere where it reacted with other atmospheric constituents to form what are known as aerosols.

These aerosols caused acid rain and other environmental problems in the U.S. before factories were required, in the 1970s, to install "scrubbers" that remove the sulphur dioxide before it leaves the smokestack. These aerosols also reflect incoming sunlight and so have a cooling effect on the surface in the industrial middle latitudes where they are produced. Some countries, like China, are still engaged in the older, dirtier form of coal burning. If we continue with business-as-usual burning of fossil fuels, but countries like China transition to more modern, "cleaner" coal burning to avoid air pollution problems, we are likely to see a substantial drop in aerosols over the next half century. Such an assumption is made in the Intergovernmental Panel on Climate Change (IPCC)'s "RCP 8.5" scenario—basically, a "business as usual" future emissions scenario which results in more than a tripling of carbon dioxide concentrations relative to pre-industrial levels (280 parts per million) and roughly 4-5C (7-9F) of planetary warming by the end of the century.

As a result, the projected disappearance of cooling aerosols in the decades ahead produces an especially large amount of warming in the middle latitudes in summer (when there is the most incoming sunlight to begin with, and thus the most sunlight to reflect back to space). Averaged across the various IPCC climate models, there is then even more warming in the mid-latitudes than in the Arctic—in other words, the opposite of Arctic Amplification, i.e. Arctic De-amplification (see Figure below). Later in the century, after the aerosols disappear, greenhouse warming once again dominates and we again see an increase in QRA events.

So, is there any hope to avoid future summers like the summer of 2018? Probably not. But in the scenario where we rapidly move away from fossil fuels and stabilize greenhouse gas concentrations below 450 parts per million, giving us a roughly 50% chance of averting 2C/3.6F planetary warming (the so-called "RCP 2.6" IPCC scenario), we find that the frequency of QRA events remains roughly constant at current levels.

While we will presumably have to contend with many more summers like 2018 in the future, we could likely prevent any further increase in persistent summer weather extremes. In other words, the future is still very much in our hands when it comes to dangerous and damaging summer weather extremes. It’s simply a matter of our willpower to transition quickly from fossil fuels to renewable energy.

Pre-industrial anthropogenic CO2 emissions: How large?

Filed under: — mike @ 11 October 2018

Guest article by William Ruddiman

Fifteen years after publication of Ruddiman (2003), the early anthropogenic hypothesis is still debated, with relevant evidence from many disciplines continuing to emerge. Recent findings summarized here lend support to the claim that greenhouse-gas emissions from early agriculture (before 1850) were large enough to alter atmospheric composition and global climate substantially.

Marine isotopic stage (MIS) 19 is the closest orbital analog to the current MIS 1 interglaciation (Tzedakis et al., 2012), with similarly small changes in precession (εsinω) and nearly synchronous peaks in εsinω and obliquity (Fig. 1a, b). MIS 11 was once claimed to be the closest MIS 1 analog (for example, Broecker and Stocker, 2006), but that claim is now rejected because obliquity and precession peaks in MIS 11 were far offset.


Figure 1. Comparison of (a) obliquity and (b) precession (εsinω) trends during MIS 19 (green), MIS 11 (black) and MIS 1 (red), based on Tzedakis et al. (2012). (c) CO2 trends during MIS 19 (black) and MIS 1 (red). CO2 data for MIS 19 are from Dome C (Bereiter et al., 2015); CO2 data for MIS 1 are from Law Dome (MacFarling Meure et al., 2006) and Dome C (Monnin et al., 2001, 2004).

 

With MIS 11 eliminated as an analog, the focus is on MIS 19. The CO2 signals early in MIS 1 and MIS 19 (Fig. 1c) reached nearly identical peaks of 270 and 269 ppm, after which the MIS 1 value fell for 4000 years but then rose by 20 ppm to a late pre-industrial 280-285 ppm. In contrast, the MIS 19 CO2 trend continued downward for more than 10,000 years to 245-250 ppm by the time equivalent to the present day. This value is consistent with the 240-245 ppm level proposed in the early anthropogenic hypothesis for a natural Holocene world (with no human overprint). The 35-ppm difference between the two interglaciations is close to the 40-ppm Holocene anomaly inferred by Ruddiman (2003).

A GCM simulation of the MIS 19 time equivalent to today by Vavrus et al. (2018) indicates that the low CO2 values would have caused year-round snow cover (indicative of incipient glaciation) in the Canadian Archipelago and over Baffin Island (an area roughly the size of Greenland), as well as other Arctic regions (see also Ganopolski et al., 2016).

Ruddiman (2003) estimated pre-industrial carbon emissions of 300-320 Gt, based on a back-of-the-envelope compilation of the incomplete forest clearance histories then available (Table 1). [One Gt is one billion tons.] That estimate was for a while rejected as too high by a factor of 5 to 10 (Joos et al., 2004; Pongratz et al., 2008; Stocker et al., 2011). However, Kaplan et al. (2011) found that those estimates had been biased downward because they assumed much smaller early per-capita clearance than the large amounts shown by actual historical data. Those estimates also ignored areas that had been cleared and were no longer in active agricultural use, but had not yet reforested. Adjusting for these factors, Kaplan and colleagues estimated pre-industrial emissions of 343 GtC.

Erb et al. (2018) averaged 7 estimates of the amount of carbon that would currently be stored in Earth's potential natural vegetation had there been no human activities (910 GtC), compared to the 460 GtC actually stored there today. They attributed the difference of 450 GtC to cumulative vegetation removal by humans (mostly deforestation). With ~140 GtC of clearance having occurred during the industrial era, that left an estimated 310 GtC as the total removed and emitted to the atmosphere during pre-industrial time. In a similar analysis, Lorenz and Lal (2018) estimated pre-industrial carbon emissions of 'up to' 357 GtC.

Studies in other disciplines have begun adding direct ground-truth evidence about early clearance. Analyses of pollen in hundreds of European lake cores (Fyfe et al., 2015; Roberts et al., 2018) show that forest vegetation began to decrease after 6000 years ago and reached near-modern levels before the start of the industrial era (Fig. 2). In China, compilations of over 50,000 archaeological sites by Li et al. (2008) and Hosner et al. (2016) show major increases of farming settlements in previously forested areas beginning 7,000 years ago. These extensive compilations support the above estimates of large early anthropogenic clearance and C emissions.


Figure 2. Evidence of early forest clearance in Europe. (A) Locations of cores in the European pollen database. Cores used for pollen summary in B are shown in red (Fyfe et al., 2015). (B) Changes in forest, open, and semi-open (mixed forest and open) vegetation plotted as ‘pseudobiome’ sums.

 

As this wide-ranging multi-disciplinary evidence has emerged, some scientists continue to reject the early anthropogenic hypothesis. Most of the opposition is based on a geochemical index (δ13CO2) measured in CO2 contained in air bubbles trapped in ice cores. The δ13CO2 index shows the relative balance through time between inputs of 12C-rich terrestrial carbon from the land and 13C-neutral carbon from the ocean. The small 13C decrease in atmospheric CO2 during the last 7000 years has been interpreted as indicating minimal input of 12C-rich terrestrial carbon during that time (Broecker and Stocker, 2006; Elsig et al., 2009). In a July 20, 2018 Scienceonline.org post, Jeff Severinghaus estimated the early human contribution to the observed CO2 rise as "1 to 2 ppm at the most", or just 5-10% of the recent estimates reviewed in Table 1.

Other scientists (Stocker et al., 2018; Ruddiman et al., 2016) have pointed out that the δ13CO2 index cannot be used to isolate the amount of deforestation carbon unless all significant carbon sources and sinks are well constrained. The compilation by Yu (2011) indicating that ~300 Gt of terrestrial (12C-rich) carbon were buried in boreal peats during the last 7000 years shows that this constraint had not been satisfied in previous studies. Burial of ~300 GtC in boreal peats requires a counter-balancing emission of more than 300 GtC of terrestrial carbon during the last 7000 years, and the discussion above summarizes evidence that pre-industrial deforestation can fill that deficit. Even now, however, carbon exchanges (whether sources or sinks) in non-peat permafrost areas and in river floodplains and deltas during the last 7000 years remain poorly known.
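The underlying bookkeeping is a simple isotopic mass balance, sketched schematically below (the per-mil values are typical textbook numbers, not figures from the cited papers, and ocean exchange, which further damps the atmospheric signal, is ignored):

```latex
% Schematic first-order mass balance for the atmospheric delta-13C budget.
% Typical textbook isotopic values; ocean exchange is neglected.
\[
  \Delta\delta_{\mathrm{atm}}
  \;\approx\;
  \frac{\left(M_{\mathrm{defor}} - M_{\mathrm{peat}}\right)
        \left(\delta_{\mathrm{land}} - \delta_{\mathrm{atm}}\right)}
       {M_{\mathrm{atm}}},
  \qquad
  \delta_{\mathrm{land}} \approx -25\ \text{to}\ {-27}\ \text{per mil}.
\]
```

Because the deforestation source (M_defor) and the peat-burial sink (M_peat) carry essentially the same isotopically light terrestrial signature, comparable gross fluxes of order 300 GtC largely cancel, so a small observed δ13CO2 decline is compatible with large pre-industrial deforestation emissions.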

Scientists trying to make up their minds about this still-ongoing debate can now weigh wide-ranging multi-disciplinary evidence for large early forest clearance against reliance on the as-yet poorly constrained δ13CO2 index.

References

Bereiter, B., S. Eggleston, J. Schmitt, C. Nehrbass-Ahles, T. F. Stocker, et al. (2015), Revision of the EPICA Dome C CO2 record from 800 to 600 kyr before present, Geophys. Res. Lett., 42, 542–549.

Broecker, W. S. and T. F. Stocker (2006), The Holocene CO2 rise: Anthropogenic or natural? EOS Trans. Amer. Geophysical Union 87, 27.

Erb, K.-H., T. Kastner, C. Plutzar, A. L. S. Bais, N. Carvalhais, et al. (2018), Unexpectedly large impact of forest management on global vegetation biomass. Nature 553, 73-76.

Elsig J., J. Schmitt, D. Leuenberger, R. Schneider, M. Eyer, et al. (2009), Stable isotope constraints on Holocene carbon cycle changes from an Antarctic ice core. Nature 461, 507-510.

Fyfe, R. M., J. Woodbridge, and N. Roberts (2015), From forest to farmland: pollen-inferred land cover changes across Europe using the pseudobiomization approach. Global Change Biology 20, 1197-1212.

Ganopolski, A., R. Winkelmann and H. J. Schellnhuber (2016), Critical insolation-CO2 relation for diagnosing past and future glacial inception. Nature 529, 200-203.

Hosner, D., M. Wagner, P. E. Tarasov, X. Chen, and C. Leipe (2016), Spatiotemporal distribution patterns of archaeological sites in China during the Neolithic and Bronze Age: An overview. The Holocene 26, 1576-1583.

Joos, F., S. Gerber, I. C. Prentice, et al. (2004), Transient simulations of Holocene atmospheric carbon dioxide and terrestrial carbon since the last glacial maximum. Global Biogeochemical Cycles 18, doi:10.1029/2003GB002156.

Kaplan, J. O., K. M. Krumhardt, E. C. Ellis, W. F. Ruddiman, C. Lemmen, and K. Klein Goldewijk (2011), Holocene carbon emissions as a result of anthropogenic land cover change. The Holocene 21, 775-792.

Li, X., J. Dodson, J. Zhou, and X. Zhou (2008), Increases of population and expansion of rice agriculture in Asia, and anthropogenic methane emissions since 5000 BP. Quat. Int. 202, 41-50.

Lorenz, K. and R. Lal (2018), Agricultural land use and the global carbon cycle. In: Carbon sequestration in agricultural systems, p. 1-37.

MacFarling Meure, C., D. Etheridge, C. Trudinger, P. Steele, R. Langenfelds, et al. (2006), Law Dome CO2, CH4 and N2O ice core records extended to 2000 years BP. Geophys. Res. Lett., 33, L14810, doi:10.1029/2006GL026152.

Monnin, E., A. Indermühle, A. Dällenbach, J. Flückiger, B. Stauffer, et al. (2001), Atmospheric CO2 concentrations over the Last Glacial Termination. Science 291, 112-114.

Pongratz, J., C. Reick, T. Raddatz, and M. Claussen (2008), A reconstruction of global agricultural areas and land cover for the last millennium. Global Biogeochemical Cycles 22, GB3018, doi:10.1029/2007GB003153.

Roberts, N., R. M. Fyfe, J. Woodbridge, et al. (2018), Europe's forests: A pollen-based synthesis for the last 11,000 years. Nature Scientific Reports, doi:10.1038/s41598-017-18646-7.

Ruddiman, W. F. (2003), The anthropogenic greenhouse era began thousands of years ago. Climatic Change 61, 261-293.

Ruddiman, W. F., D. Q. Fuller, J. E Kutzbach, P. C. Tzedakis, J. O. Kaplan et al. (2016), Late Holocene climate: Natural or anthropogenic? Rev. of Geophys. 54, 93-118.

Stocker, B. D., K. Strassmann, and F. Joos (2011), Sensitivity of Holocene atmospheric CO2 and the modern carbon budget to early human land use: analyses with a process-based model. Biogeosciences 8, 69-88.

Stocker, B. D., Z. Yu, and F. Joos (2018), Constraining CO2 emissions from different Holocene land-use histories: does the carbon budget add up? PAGES 26, 6-7.

Tzedakis, P. C., J. E. T. Channell, D. A. Hodell, H. F. Kleiven, and L. C. Skinner (2012), Determining the natural length of the current interglacial. Nature Geoscience 5, 138-141.

Vavrus, S. J., F. He, J. E. Kutzbach, W. F. Ruddiman, and P. C. Tzedakis (2018), Glacial inception in marine isotope stage 19: An orbital analog for a natural Holocene. Nature Scientific Reports 8, doi:10.1038/s41598-018-28419-5.

Non-condensable Cynicism in Santa Fe

Filed under: — mike @ 17 January 2017

Guest Post by Mark Boslough

The Fourth Santa Fe Conference on Global & Regional Climate Change will be held on Feb 5-10, 2017. It is the fourth in a series organized and chaired by Petr Chylek of Los Alamos National Laboratory (LANL) and takes place at intervals of 5 years or thereabouts. It is sponsored this year by LANL's Center for Earth and Space Science and co-sponsored by the American Meteorological Society. I attended the Third in the series, which was held the week of Oct 31, 2011. I reported on it here in my essay "Climate cynicism at the Santa Fe conference".

In that report, I described my experiences and interactions with other attendees, whose opinions and scientific competence spanned the entire spectrum of possibility. Christopher Monckton represented one extreme end-member, with no scientific credibility, total denial of facts, zero acknowledgment of uncertainty in his position, and complete belief in a global conspiracy to promote a global warming fraud. At the opposite end were respected professional climate scientists at the top of their fields, such as Richard Peltier and Gerald North. Others, such as Fred Singer and Bill Gray, occupied different parts of the multi-dimensional phase space, having credentials but also having embraced denial—each for their own reasons that probably didn’t intersect.

2011 conference participants share a “Christmas in the trenches” moment on the Santa Fe plaza (author on the upper right; Monckton to his immediate left, with Singer just below)

For me, the Third Conference represented an opportunity to talk to people who held contrary opinions and who promoted factually incorrect information for reasons I did not understand. My main motivation for attending was to engage in dialogue with the contrarians and deniers, to try to understand them, and to try to get them to understand me. I came away on good terms with some (Bill Gray and I bonded over our common connection to Colorado State University, where I was an undergraduate physics student in the 1970s) but not so much with others.

I was ambitious and submitted four abstracts. My colleagues and I were pursuing uncertainty quantification for climate change in collaboration with other DOE labs. I had been collaborating on several approaches to it, including betting markets, expert elicitation, and statistical surrogate models, so I submitted an abstract for each of those methods. I had also been working with Lloyd Keigwin, a senior scientist and oceanographer at Woods Hole Oceanographic Institution and another top-of-his-field researcher. We submitted an abstract together about his paleotemperature reconstruction of Sargasso Sea surface temperature, which is probably the most widely reproduced paleoclimate time series other than the Mann et al. "Hockey Stick" graph. I had updated it with modern SST measurements, and in our abstract we pointed out that it had been misused by contrarians who had removed some of the data, replotted it, and mislabeled it to falsely claim that it was a global temperature record showing a cooling trend.

The graph continues to make appearances. On March 23, 2000, ExxonMobil took out an advertisement in the New York Times claiming that global warming was "Unsettled Science". The ad was illustrated with a doctored version of Lloyd's graph (the inconvenient modern temperature data showing a warming trend had been removed). This drawing was very similar to one that had been generated by climate denier Art Robinson and his son for a Wall Street Journal editorial a couple of months earlier. It wasn't long before other distorted versions started showing up elsewhere, such as on the Albuquerque Journal opinion page. The 2000 ExxonMobil version was just entered into the Congressional Record last week by Senator Tim Kaine during the Tillerson confirmation hearings.

Original Keigwin (1996) graph as it appeared in the journal Science.


Doctored version of Keigwin (1996) graph that appeared in Robinson et al. (1998)



Doctored version of Keigwin (1996) graph used in ExxonMobil advertisement.



In 2011, my abstracts on betting, expert elicitation, and statistical models were all accepted, and I presented them. But the abstract that Lloyd and I submitted was unilaterally rejected by Chylek, who said, "This Conference is not a suitable forum for [the] type of presentations described in [the] submitted abstract. We would accept a paper that spoke to the science, the measurements, the interpretation, but not simply an attempted refutation of someone else's assertions (especially when made in unpublished reports and blog site)." The unpublished report he spoke of was the NIPCC/Heartland Institute report, which Fred Singer was there to discuss. After the conference, I spoke to one of the co-chairs about the reasons for the rejection. He said that he hadn't seen our abstract and did not agree with the reasons for the rejection. He encouraged Lloyd and me to resubmit it for the 4th conference. So we did. Lloyd sent the following slightly revised version on January 4.

Misrepresentations of Sargasso Sea Temperatures by Global Warming Doubters

Lloyd Keigwin (Woods Hole Oceanographic Institution) and Mark Boslough (Sandia National Laboratories)

Keigwin (Science 274:1504–1508, 1996) reconstructed the SST record in the northern Sargasso Sea to document natural climate variability in recent millennia. The annual average SST proxy used δ18O in planktonic foraminifera in a radiocarbon-dated 1990 Bermuda Rise box core. Keigwin’s Fig. 4B (K4B) shows a 50-year-averaged time series along with four decades of SST measurements from Station S near Bermuda, demonstrating that at the time of publication, the Sargasso Sea was at its warmest in more than 400 years, and well above the most recent box-core temperature. Taken together, Station S and paleotemperatures suggest there was an acceleration of warming in the 20th century, though this was not an explicit conclusion of the paper. Keigwin concluded that anthropogenic warming may be superposed on a natural warming trend.

In a paper circulated with the anti-Kyoto “Oregon Petition,” Robinson et al. (“Environmental Effects of Increased Atmospheric Carbon Dioxide,” 1998) reproduced K4B but (1) omitted Station S data, (2) incorrectly stated that the time series ended in 1975, (3) conflated Sargasso Sea data with global temperature, and (4) falsely claimed that Keigwin showed global temperatures “are still a little below the average for the past 3,000 years.” Slight variations of Robinson et al. (1998) have been repeatedly published with different author rotations. Various mislabeled, improperly-drawn, and distorted versions of K4B have appeared in the Wall Street Journal, in weblogs, and even as an editorial cartoon—all supporting baseless claims that current temperatures are lower than the long term mean, and traceable to Robinson’s misrepresentation with Station S data removed. In 2007, Robinson added a fictitious 2006 temperature that is significantly lower than the measured data. This doctored version of K4B with fabricated data was reprinted in a 2008 Heartland Institute advocacy report, “Nature, Not Human Activity, Rules the Climate.”

On Jan. 9, Lloyd and I got a terse rejection from Chylek: “Not accepted. The committee finding was that the abstract did not indicate that the presentation would provide additional science that would be appropriate for the conference.”

I had also submitted an abstract with Stephen Lewandowsky and James Risbey called “Bets reveal people’s opinions on climate change and illustrate the statistics of climate change,” and a companion poster entitled “Forty years of expert opinion on global warming: 1977-2017” in which we proposed to survey the conference attendees:

Forecasts of anthropogenic global warming in the 1970s (e.g. Broecker, 1975, Charney et al., 1979) were taken seriously by policy makers. At that time, climate change was already broadly recognized within the US defense and intelligence establishments as a threat to national and global security, particularly due to climate’s effect on food production. There was uncertainty about the degree of global warming, and media-hyped speculation about global cooling confused the public. Because science-informed policy decisions needed to be made in the face of this uncertainty, the US Department of Defense funded a study in 1977 by National Defense University (NDU) called “Climate Change to the Year 2000” in which a panel of experts was surveyed. Contrary to the recent mythology of a global cooling scare in the 1970s, the NDU report (published in 1978) concluded that, “Collectively, the respondents tended to anticipate a slight global warming rather than a cooling”.

Despite the rapid global warming since 1977, this subject remains politically contentious. We propose to use our poster presentation to survey the attendees of the Fourth Santa Fe Conference on Global and Regional Climate Change and to determine how expert opinion has changed in the last 40 years.

I had attempted a similar project at the 3rd conference with my poster “Comparison of Climate Forecasts: Expert Opinions vs. Prediction Markets” in which my abstract proposed the following: “As an experiment, we will ask participants to go on the record with estimates of probability that the global temperature anomaly for calendar year 2012 will be equal to or greater than x, where x ranges in increments of 0.05 °C from 0.30 to 1.10 °C (relative to the 1951-1980 base period, and published by NASA GISS).” I included a table for participants to fill in, and even printed extra sheets to tack up on the board with my poster so I could compile them and report them later.

This idea was a spinoff of work I had presented at an unclassified session of the 2006 International Conference on Intelligence Analysis, on my research in support of the US intelligence community, for which a broad spectrum of opinion must be used to generate an actionable consensus from incomplete or conflicting information. That was certainly the case in Santa Fe, where there were individuals (e.g. Don Easterbrook) who were going on record with predictions of global cooling. By the last day of the conference, several individuals had filled in the table with their probabilistic predictions, and I decided to leave my poster up until the end of the day, which was how long posters could be displayed according to the conference program. I wanted to plug it during my oral presentation on prediction markets so that I could get more participation. Unfortunately, when I returned to the display room, my poster had been removed. Hotel employees did not know where it was, and the diverse probability estimates were lost.

This year I would be more careful, as announced in my abstract. But the committee would have no part of it. On Jan 10 I got my rejection letter:

I regret to inform you that we have decided to decline this submission.

Based on our consideration of the abstract and plan, it is our view that designing a survey that accurately elicits expert opinion requires special expertise as the answers can depend on how the questions are asked. No indication of such expertise was presented in the abstract itself or found based on examination of your publication record.

A further concern dealt with the proposed comparison with opinion elicited at a different time from a different community by a different method that might allow one to “determine how expert opinion has changed in the last 40 years.”

Concern was raised also over how one might legitimately transform the results of such a poll into “into probabilistic global warming projections.”

Although we cannot accept this poster, we certainly look forward to your active participation in the Conference.

Of the hundreds of abstracts I've submitted, this is the only conference that's ever rejected one. As a frequent session convener and program committee chair myself, I am accustomed to providing poster space for abstracts that I might question, misunderstand, or disagree with. It has never occurred to me to look at the publication list of a poster presenter. But if I were to do that, I would be more thorough and look at other information, including the coauthors' publication lists and CVs as well. In this case, the committee might have discovered more than a few papers by one of them on the subject, such as Risbey and Kandlikar (2002), "Expert Assessment of Uncertainties in Detection and Attribution of Climate Change," in the Bulletin of the American Meteorological Society, or that Prof. Risbey was a faculty member in Granger Morgan's Engineering and Public Policy department at CMU for five years, a place awash in expert elicitation of climate (I sent my abstract to Prof. Morgan, whom I know from my AGU uncertainty quantification days, for his opinion before submitting it to the conference).

At the very least, I would look at the previous work cited in the abstract. The committee would not have been puzzled by how to transform survey data into probabilistic projections if they had done so. They would have learned that the 1978 NDU study we cited had already established the methodology we were proposing to use. The NDU “Task I” was “To define and estimate the likelihood of changes in climate during the next 25 years…” using ten survey questions described in Chapter One (Methodology). The first survey question was on average global temperature. So the legitimacy of the method we were planning to use was established 40 years ago.

I concluded after the 3rd Santa Fe conference that cynicism was the only attribute that was shared by the minority of attendees who were deniers, contrarians, publicity-seekers, enablers, or provocateurs. I now think that cynicism has something in common with greenhouse gases. Cynicism begets cynicism, to the detriment of society. There are natural-born cynics, and if they turn the rest of us into cynics then we are their amplifiers, just like water vapor is an amplifier of carbon dioxide’s greenhouse effect. We become part of a cynical feedback loop that generates distrust in science and the scientific method. I refuse to let that happen. I might have gotten a little steamed by an unfair or inappropriate rejection, but I’ve cooled off and my induced cynicism has condensed now. I am not going to assume that everyone is a cynic just because of a couple of misguided and misinformed decisions.

As President Obama said in his farewell address, “If you’re tired of arguing with strangers on the Internet, try talking with one of them in real life.” So if you are attending the Santa Fe conference, I would like to meet with you. If you are flying into Albuquerque, where I live, drop me a line. Or meet me for a drink or dinner in Santa Fe. I can show you why Lloyd’s research really does provide additional science that is relevant to the conference. I can try to convince you that prediction markets are indeed superior to expert elicitation in their ability to forecast climate change. Maybe I can even talk you into going on record with your own probabilistic global warming forecast!

Boomerangs versus Javelins: The Impact of Polarization on Climate Change Communication

Filed under: — mike @ 7 June 2016

Guest commentary by Jack Zhou, Nicholas School of the Environment, Duke University

For advocates of climate change action, communication on the issue has often meant "finding the right message" that will spur their audience to action and convince skeptics to change their minds. This is the notion that simply connecting climate change to the right issue domains or symbols will cut through the political gridlock on the issue. The difficulty then lies in finding these magic-bullet messages, figuring out whether to talk about climate change in the context of national security or polar bears or passing down a clean environment to future generations.

On highly polarized issues like climate change, however, communicating across the aisle may be more difficult than simply finding the right message. Here, the worst case scenario is not simply a message failing to land and sending you back to the drawing board. Instead, any message that your audience disagrees with may polarize that audience even further in their skepticism, leaving you in a worse position than you began. As climate change has become an increasingly partisan issue in American politics, this means that convincing Republicans to reject the party line of climate skepticism may be easier said than done.

The Early Anthropocene Hypothesis: An Update

Filed under: — mike @ 15 March 2016

Guest post from Bill Ruddiman, University of Virginia

For over a decade, paleoclimate scientists have argued whether the warmth of the last several thousand years was natural or anthropogenic. This brief comment updates that debate, also discussed earlier at RC: Debate over the Early Anthropogenic Hypothesis (2005) and An Emerging View on Early Land Use (2011). The graph below outlines the evolution of that debate through 4 phases.

Figure: evolution of the debate over the early anthropogenic hypothesis through 4 phases (see text).

In phase 1 (the 1900s), scientists viewed Holocene climate change as driven only by natural causes until the industrial era began. But by the late 1990s, ice core data revealed late Holocene GHG rises unlike the trends in previous interglaciations. Two hypotheses proposed natural causes for the CO2 increase: carbonate compensation (Broecker et al., 1999, 2001) and coral-reef construction (Ridgwell et al., 2003).

In phase 2 (2001-2003), the early anthropogenic hypothesis (EAH) challenged natural explanations for the anomalous late Holocene CO2 (and CH4) rises, attributing them to the spread of early agriculture thousands of years ago.

In phase 3 (2004-2008), several arguments were advanced against the EAH:
* too few people lived millennia ago to have had a significant influence on land clearance, GHG emissions and climate;
* a (proposed) interglacial stage 11 analog for the Holocene suggested that thousands of years of natural warmth still remain in the current interglaciation;
* the weak decrease in ice core δ13CO2 during the last 7000 years did not permit extensive deforestation, which would have released abundant 12C-rich carbon.
Papers by myself, my co-authors at Wisconsin, and others during phase 3 rebutted some of these criticisms, but community opinion remained divided.

Phase 4 (2009-2016) has seen a major shift in viewpoint of published papers: 30 papers favor aspects of the EAH, 6 papers oppose it, and 5 are in the middle. Most of the phase 4 papers that oppose the hypothesis or are ‘in the middle’ are based on modeling studies. Many of the 30 supporting papers are broad-scale compilations of archaeological and paleoecological evidence:
* The average GHG trends from 7 previous interglaciations show CO2 and CH4 decreases, in contrast to the late Holocene increases;
* Interglacial stage 19, the closest Holocene analog, shows decreases in CH4 and CO2, and the CO2 decrease closely matches the 2003 EAH prediction;
* CH4 emissions from Asian rice paddies account for 70% of the observed CH4 rise from 5000 to 1000 years ago;
* historical data show that early per-capita land use was at least 4 times larger than assumed in several phase-3 land use simulations;
* a recent land use simulation based on historical evidence accounts for more than half the CO2 anomaly originally proposed in the EAH;
* pollen evidence shows nearly complete deforestation in north-central Europe before the industrial era began;
* δD and δ18O trends show anomalous late Holocene warmth compared to cooling trends in prior interglaciations, in agreement with A-OGCM simulations of the warming effect of the anthropogenic CO2 and CH4 trends.

_____________
Anyone seeking more detail on this issue should contact pisgahill@gmail.com for pdf copies of the recent 2016 Ruddiman et al. paper in Reviews of Geophysics and an invited paper just submitted to Oxford University Press that summarizes the history of this debate, with full references to the papers shown in the table.

How Likely Is The Observed Recent Warmth?

Filed under: — mike @ 25 January 2016

With the official numbers now in, 2015 is, by a substantial margin, the new record-holder, the warmest year in recorded history for both the globe and the Northern Hemisphere. The title was sadly short-lived for previous record-holder 2014. And 2016 could be yet warmer if the current global warmth persists through the year.

One might well wonder: just how likely is it that we would be seeing these sort of streaks of record-breaking temperatures if not for human-caused warming of the planet?

Precisely that question was posed by several media organizations a year ago, in the wake of the then-record 2014 temperatures. Various press accounts reported odds anywhere from 1-in-27 million to 1-in-650 million that the observed run of global temperature records (9 of the 10 warmest years, and 13 of the 15 warmest years, having occurred since 2000) might have resulted from chance alone, i.e. without any assistance from human-caused global warming.

My colleagues and I suspected the odds quoted were way too slim. The problem is that each year was treated as though it were statistically independent of neighboring years (i.e. that each year is uncorrelated with the year before it or after it), but that's just not true. Temperatures don't vary erratically from one year to the next. Natural variations in temperature wax and wane over a period of several years.
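The effect of that correlation is easy to demonstrate with a toy Monte Carlo experiment (a simplified sketch, not the calculation in our paper: the 135-year record length, the AR(1) coefficient of 0.7, and the "9 of the 10 warmest years in the final 15" criterion are all illustrative assumptions):

```python
# Toy Monte Carlo (a simplified sketch, not our paper's analysis) of how
# year-to-year correlation changes the odds of clustered temperature records.
# Record length, AR(1) coefficient, and the "9 of the 10 warmest years in the
# final 15" criterion are all illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n_years, n_sims, phi = 135, 50_000, 0.7    # ~1880-2014; phi = assumed persistence

def noise_only_series(correlated):
    """Batch of trend-free 'temperature' series, shape (n_sims, n_years)."""
    eps = rng.standard_normal((n_sims, n_years))
    if not correlated:
        return eps                          # independent years
    x = np.empty_like(eps)
    x[:, 0] = eps[:, 0]
    for t in range(1, n_years):             # AR(1): each year remembers the last
        x[:, t] = phi * x[:, t - 1] + np.sqrt(1.0 - phi**2) * eps[:, t]
    return x

for correlated, label in ((False, "independent years"), (True, "correlated years")):
    x = noise_only_series(correlated)
    warmest10 = np.argsort(x, axis=1)[:, -10:]          # indices of 10 warmest years
    in_final15 = (warmest10 >= n_years - 15).sum(axis=1)
    n_hits = int((in_final15 >= 9).sum())
    print(f"{label:17s}: {n_hits} of {n_sims} noise-only series put >=9 of the "
          f"10 warmest years in the final 15")
```

Runs of clustered records that are essentially impossible for independent years become merely unlikely once year-to-year memory is included, which is why odds computed under the independence assumption come out far too slim.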

An Online University Course on the Science of Climate Science Denial

Filed under: — mike @ 22 April 2015

Guest post from John Cook, University of Queensland

For many years, RealClimate has been educating the public about climate science. The value of climate scientists patiently explaining the science and rebutting misinformation directly with the public cannot be overestimated. When I began investigating this issue, my initial searches led me here, which was invaluable in increasing my understanding of our climate and making sense of misinformation. RealClimate has inspired and empowered a host of climate communicators such as myself to step forward and help make climate science more accessible to the general public.

To further the work of educating the public, and empowering people to communicate the realities of climate change, the Skeptical Science team has collaborated with The University of Queensland to develop a MOOC, Making Sense of Climate Science Denial. MOOC stands for Massive (we’ve already had thousands of students sign up from over 130 countries) Open (available for free to everyone) Online (web-based, no software required) Course.

The course examines the science of climate science denial. Why do a small but vocal minority reject the scientific evidence for climate change? What techniques do they use to cast doubt on the science? And we examine the all-important question – based on scientific research, how should we respond to science denial?

Several strands of research in cognitive psychology, educational research and a branch of psychology called “inoculation theory” all point the way to neutralising the influence of science denial. The approach is two-fold: communicate the science but also explain how that science can be distorted.

So our course looks at the most common climate myths you’re likely to encounter online or in the media. We examine myths casting doubt on the reality of global warming. We explore the many human fingerprints on climate change. We look at the messages from past climate change and what climate models tell us about the future. And we look at how climate change is impacting every part of society and the environment. As we examine myths touching on all these parts of climate science, we shine the spotlight on the fallacies and techniques used to distort the science.

Figure: FLICC, the five characteristics of science denial (fake experts, logical fallacies, impossible expectations, cherry picking, conspiracy theories).

As well as our short video lectures debunking climate myths, we also interviewed many of the world's leading scientists. I had the privilege to speak to Ben Santer, Katharine Hayhoe, Richard Alley, Phil Jones, Naomi Oreskes and, let's not forget, to have a long, fascinating conversation with Michael Mann. I was also lucky enough to interview Sir David Attenborough at the Great Barrier Reef. We spoke to both climate scientists and social scientists who study the psychology of climate science denial. Some of the most powerful moments from those interviews came when the scientists described the attacks they'd personally experienced because of their climate research.

Our MOOC starts next Tuesday, April 28. It’s a free online course hosted by the not-for-profit edX (founded by Harvard University & MIT). It runs for 7 weeks, requiring 1 to 2 hours per week. You can enroll at http://edx.org/understanding-climate-denial.

A Scientific Debate

Filed under: — mike @ 13 April 2015

Guest posting from Bill Ruddiman, University of Virginia

Recently I’ve read claims that some scientists are opposed to AGW but won’t speak out because they fear censure from a nearly monolithic community intent on imposing a mainstream view. Yet my last 10 years of personal experience refute this claim. This story began late in 2003 when I introduced a new idea (the ‘early anthropogenic hypothesis’) that went completely against a prevailing climatic paradigm of the time. I claimed that detectable human influences on Earth’s surface and its climate began thousands of years ago because of agriculture. Here I describe how this radically different idea was received by the mainstream scientific community.

Was my initial attempt to present this new idea suppressed? No. I submitted a paper to Climatic Change, then edited by Steve Schneider, a well-known climate scientist and AGW spokesman. From what I could tell, Steve was agnostic about my idea but published it because he found it an interesting challenge to the conventional wisdom. I also gave the Emiliani lecture at the 2003 December American Geophysical Union (AGU) conference to some 800 people. I feel certain that very few of those scientists came to my talk believing what my abstract claimed. They attended because they were interested in a really new idea from someone with a decent career reputation. The talk was covered by many prominent media sources, including the New York Times and The Economist. This experience told me that provocative new ideas draw interest because they are provocative and new, provided that they pass the key ‘sniff test’ by presenting evidence in support of their claims.

Did this radical new idea have difficulty receiving research funding? No. Proposals submitted to the highly competitive National Science Foundation (NSF) with John Kutzbach and Steve Vavrus have been fully funded since 2004 by 3-year grants. Even though the hypothesis of early anthropogenic effects on climate has been controversial (and still is for some), we crafted proposals that were carefully written, tightly reasoned, and focused on testing the new idea. As a result, we succeeded against negative funding odds of 4-1 or 5-1. One program manager told me he planned to put our grant on a short list of ‘transformational’ proposals/grants that NSF had requested. That didn’t mean he accepted our hypothesis. It meant that he felt that our hypothesis had the potential to transform that particular field of paleoclimatic research, if proven correct.

Were we able to get papers published? Yes. As any scientist will tell you, this process is rarely easy. Even reviewers who basically support what you have to say will rarely hand out 'easy-pass' reviews. They add their own perspective, and they often point out useful improvements. A few of the 30-some papers we have published during the last 11 years have come back with extremely negative reviews, seemingly from scientists deeply opposed to anything that even hints at large early anthropogenic effects. While these uber-critical reviews are discouraging, I have learned to put them aside for a few days, give my spirits time to rebound, and then address the criticisms that are fair (that is, evidence-based), explain to the journal editor why other criticisms are unfair, and submit a revised (and inevitably improved) paper. Eventually, our views have always gotten published, although sometimes only after considerable effort.

The decade-long argument over large early anthropogenic effects continues, although recent syntheses of archeological and paleoecological data have been increasingly supportive. In any case, I continue to trust the scientific process to sort this debate out. I suggest that my experience is a good index of the way the system actually operates when new and controversial ideas emerge. I see no evidence that the system is muffling good new ideas.

Climate Oscillations and the Global Warming Faux Pause

Filed under: — mike @ 26 February 2015

No, climate change is not experiencing a hiatus. No, there is not currently a “pause” in global warming.

Despite widespread claims to the contrary in contrarian circles, human-caused warming of the globe proceeds unabated. Indeed, the most recent year (2014) was likely the warmest year on record.

It is true that Earth's surface warmed a bit less than models predicted it to over the past decade-and-a-half or so. This doesn't mean that the models are flawed. Instead, it points to a discrepancy that likely arose from a combination of three main factors (see the discussion in my piece last year in Scientific American). First, there is the likely underestimation of the actual warming that has occurred, due to gaps in the observational data. Second, scientists have failed to include in model simulations some natural factors (low-level but persistent volcanic eruptions and a small dip in solar output) that had a slight cooling influence on Earth's climate. Finally, there is the possibility that internal, natural oscillations in temperature may have masked some surface warming in recent decades, much as an outbreak of Arctic air can mask the seasonal warming of spring during a late season cold snap. One could call it a global warming "speed bump". In fact, I have.

Some have argued that these oscillations contributed substantially to the warming of the globe in recent decades. In an article my colleagues Byron Steinman, Sonya Miller and I have in the latest issue of Science magazine, we show that internal climate variability instead partially offset global warming.

We focused on the Northern Hemisphere and the role played by two climate oscillations known as the Atlantic Multidecadal Oscillation or "AMO" (a term I coined back in 2000, as recounted in my book The Hockey Stick and the Climate Wars) and the so-called Pacific Decadal Oscillation or "PDO" (we use a slightly different term, the Pacific Multidecadal Oscillation or "PMO", to refer to the longer-term features of this apparent oscillation). The oscillation in Northern Hemisphere average temperatures (which we term the Northern Hemisphere Multidecadal Oscillation or "NMO") is found to result from a combination of the AMO and PMO.

In numerous previous studies, these oscillations have been linked to everything from global warming, to drought in the Sahel region of Africa, to increased Atlantic hurricane activity. In our article, we show that the methods used in most if not all of these previous studies have been flawed. They fail to give the correct answer when applied to a situation (a climate model simulation) where the true answer is known.

We propose and test an alternative method for identifying these oscillations, which makes use of the climate simulations used in the most recent IPCC report (the so-called “CMIP5” simulations). These simulations are used to estimate the component of temperature changes due to increasing greenhouse gas concentrations and other human impacts plus the effects of volcanic eruptions and observed changes in solar output. When all those influences are removed, the only thing remaining should be internal oscillations. We show that our method gives the correct answer when tested with climate model simulations.
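In spirit, the procedure looks something like the sketch below (an illustrative toy version with synthetic data, not the actual code or data treatment of our study; the regression-based rescaling of the forced signal is one of several possible methodological choices):

```python
# Minimal sketch (not the paper's full method) of the idea: estimate the forced
# component from a model ensemble mean, remove it from the observed series, and
# treat the residual as internal variability. All data here are synthetic.
import numpy as np

def internal_component(observed, ensemble):
    """Residual of `observed` after regressing out the ensemble-mean forced signal.

    observed: (n_years,) observed hemispheric-mean temperature anomalies
    ensemble: (n_models, n_years) forced simulations of the same quantity
    """
    forced = ensemble.mean(axis=0)    # averaging cancels each model's internal noise
    f = forced - forced.mean()
    o = observed - observed.mean()
    beta = (o @ f) / (f @ f)          # least-squares scaling of forced amplitude
    return o - beta * f               # what's left is the internal component

# Synthetic demo: a known forced ramp plus an oscillation plus weather noise.
rng = np.random.default_rng(0)
years = np.arange(150)
forced_true = 0.008 * years                           # warming ramp (illustrative)
internal_true = 0.1 * np.sin(2 * np.pi * years / 60)  # a "multidecadal oscillation"
obs = forced_true + internal_true + 0.05 * rng.standard_normal(years.size)
ens = forced_true + 0.1 * rng.standard_normal((40, years.size))  # 40-member ensemble

est = internal_component(obs, ens)
print(f"correlation of recovered vs true internal component: "
      f"{np.corrcoef(est, internal_true)[0, 1]:.2f}")
```

Averaging across the ensemble cancels each model's own internal variability, leaving an estimate of the forced response; whatever remains of the observed series after that forced component is removed is the estimated internal oscillation.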

Estimated history of the “AMO” (blue), the “PMO (green) and the “NMO” (black). Uncertainties are indicated by shading. Note how the AMO (blue) has reached a shallow peak recently, while the PMO is plummeting quite dramatically. The latter accounts for the precipitous recent drop in the NMO.

Applying our method to the actual climate observations (see figure above) we find that the NMO is currently trending downward. In other words, the internal oscillatory component is currently offsetting some of the Northern Hemisphere warming that we would otherwise be experiencing. This finding expands upon our previous work coming to a similar conclusion, but in the current study we better pinpoint the source of the downturn. The much-vaunted AMO appears to have made relatively little contribution to large-scale temperature changes over the past couple decades. Its amplitude has been small, and it is currently relatively flat, approaching the crest of a very shallow upward peak. That contrasts with the PMO, which is trending sharply downward. It is that decline in the PMO (which is tied to the predominance of cold La Niña-like conditions in the tropical Pacific over the past decade) that appears responsible for the declining NMO, i.e. the slowdown in warming or “faux pause” as some have termed it.

Our conclusion that natural cooling in the Pacific is a principal contributor to the recent slowdown in large-scale warming is consistent with some other recent studies, including a study I commented on previously showing that stronger-than-normal winds in the tropical Pacific during the past decade have led to increased upwelling of cold deep water in the eastern equatorial Pacific. Other work by Kevin Trenberth and John Fasullo of the National Center for Atmospheric Research (NCAR) shows that there has been increased sub-surface heat burial in the Pacific ocean over this time frame, while yet another study by James Risbey and colleagues demonstrates that model simulations that most closely follow the observed sequence of El Niño and La Niña events over the past decade tend to reproduce the warming slowdown.

It is possible that the downturn in the PMO itself reflects a “dynamical response” of the climate to global warming. Indeed, I have suggested this possibility before. But the state-of-the-art climate model simulations analyzed in our current study suggest that this phenomenon is a manifestation of purely random, internal oscillations in the climate system.

This finding has potential ramifications for the climate changes we will see in the decades ahead. As we note in the last line of our article,

Given the pattern of past historical variation, this trend will likely reverse with internal variability, instead adding to anthropogenic warming in the coming decades.

That is perhaps the most worrying implication of our study, for it implies that the “false pause” may simply have been a cause for false complacency, when it comes to averting dangerous climate change.

El Niño or Bust

Filed under: — mike @ 8 May 2014

Guest commentary from Michelle L’Heureux, NOAA Climate Prediction Center

Much media attention has been directed at the possibility of an El Niño brewing this year. Many outlets have drawn comparison with the 1997-98 super El Niño. So, what are the odds that El Niño will occur? And if it does, how strong will it be?

To track El Niño, meteorologists at the NOAA/NWS Climate Prediction Center (CPC) release weekly and monthly updates on the status of the El Niño-Southern Oscillation (ENSO). The International Research Institute (IRI) for Climate and Society partners with us on the monthly ENSO release and is also a collaborator on a brand new "ENSO blog" which is part of www.climate.gov (co-sponsored by the NOAA Climate Programs Office).

Blogging ENSO is a first for operational ENSO forecasters, and we hope that it gives us another way to both inform and interact with our users on ENSO predictions and impacts. In addition, we will collaborate with other scientists to profile interesting ENSO research and delve into the societal dimensions of ENSO.

As far back as November 2013, the CPC and the IRI have predicted an elevated chance of El Niño (relative to historical chance or climatology) based on a combination of model predictions and general trends over the tropical Pacific Ocean. Once the chance of El Niño reached 50% in March 2014, an El Niño Watch was issued to alert the public that conditions are more favorable for the development of El Niño.
Current forecasts for the Nino-3.4 SST index (as of 5 May 2014) from the NCEP Climate Forecast System version 2 model.

More recently, on May 8th, the CPC/IRI ENSO team increased the chance that El Niño will develop, with a peak probability of ~80% during the late fall/early winter of this year. El Niño onset is currently favored sometime in the early summer (May-June-July). At this point, the team remains non-committal on the possible strength of El Niño, preferring to watch the system for at least another month or more before trying to infer the intensity. But could we get a super strong event? The range of possibilities implied by some models alludes to such an outcome, but at this point the uncertainty is just too high. While subsurface heat content levels are well above average (March was the highest for that month since 1979 and April was the second highest), ENSO prediction relies on many other variables and factors. We also remain in the spring prediction barrier, which is a more uncertain time to be making ENSO predictions.

Could El Niño predictions fizzle? Yes, there is roughly a 2 in 10 chance at this point that this could happen. It happened in 2012, when an El Niño Watch was issued, chances became as high as 75%, and El Niño never formed. Such is the nature of seasonal climate forecasting when there is enough forecast uncertainty that "busts" can and do occur. In fact, more strictly, if the forecast probabilities are "reliable," an event with an 80% chance of occurring should only occur 80% of the time over a long historical record. Therefore, 20% of the time the event must NOT occur (click here for a description of verification techniques).
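That reliability property is easy to state in code (a toy check with made-up forecasts, not a CPC/IRI verification product):

```python
# Toy reliability check (made-up forecasts, not a CPC/IRI verification product).
# "Reliable" forecasts: events given probability p occur a fraction p of the time.
import numpy as np

rng = np.random.default_rng(1)
n_forecasts = 10_000
p_issued = rng.choice([0.2, 0.5, 0.8], size=n_forecasts)   # issued probabilities
event_occurred = rng.random(n_forecasts) < p_issued        # a well-calibrated world

for p in (0.2, 0.5, 0.8):
    sel = p_issued == p
    print(f"forecast {p:.0%}: event occurred {event_occurred[sel].mean():.1%} "
          f"of the time across {sel.sum()} forecasts")

# Note: an 80% forecast that never "busts" over a long record would actually be
# evidence of under-confidence; reliable 80% forecasts must fail ~20% of the time.
```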

While folks might prefer total certainty in our forecasts, we live in an uncertain world. El Niño is most likely to occur this year, so please stay attentive to the various updates linked above and please visit our brand new ENSO blog.