With the official numbers now in, 2015 is, by a substantial margin, the new record-holder: the warmest year in recorded history for both the globe and the Northern Hemisphere. The title was sadly short-lived for the previous record-holder, 2014. And 2016 could be warmer yet if the current global warmth persists through the year.
One might well wonder: just how likely is it that we would be seeing this sort of streak of record-breaking temperatures if not for human-caused warming of the planet?
Precisely that question was posed by several media organizations a year ago, in the wake of the then-record 2014 temperatures. Various press accounts reported odds anywhere from 1-in-27 million to 1-in-650 million that the observed run of global temperature records (9 of the 10 warmest years and 13 of the 15 warmest years each having occurred since 2000) might have resulted from chance alone, i.e. without any assistance from human-caused global warming.
My colleagues and I suspected the odds quoted were way too slim. The problem is that each year was treated as though it were statistically independent of neighboring years (i.e. that each year is uncorrelated with the year before it or after it), but that’s just not true. Temperatures don’t vary erratically from one year to the next. Natural variations in temperature wax and wane over a period of several years.
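To see how such slim odds arise under the independence assumption, here is one simple combinatorial sketch (an illustration of the flawed reasoning, not a reconstruction of any particular press calculation): if yearly temperature rankings were exchangeable, the 15 warmest years of a 135-year record would be a uniformly random 15-year subset, and the chance that 13 or more of them land in a specific 15-year window is hypergeometric.

```python
from math import comb

# Under the (incorrect) independence assumption, the 15 warmest years of a
# 135-year record (1880-2014) form a uniformly random 15-element subset.
# Probability that at least 13 of them fall in the final 15 years (2000-2014):
N, top, window = 135, 15, 15
p_naive = sum(
    comb(window, k) * comb(N - window, top - k) for k in range(13, top + 1)
) / comb(N, top)

print(f"naive odds: about 1 in {1 / p_naive:,.0f}")
```

The exact number depends on how the question is framed, but any calculation along these lines yields astronomically slim odds precisely because it wrongly treats every year as an independent draw.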
For example, we’ve had a couple of very warm years in a row now due in part to El Niño-ish conditions that have persisted since late 2013, and it is likely that the current El Niño event will boost 2016 temperatures as well. That is an example of a natural variation that is internally generated. There are also natural variations in temperature that are externally caused or ‘forced’, e.g. the multi-year cooling impact of large, explosive volcanic eruptions like the 1991 Mt. Pinatubo eruption, or the small-but-measurable changes in solar output that occur on timescales of a decade or longer. Each of these natural sources of temperature variation leads to correlations in temperature from one year to the next that would be present even in the absence of global warming. These correlations must be taken into account to get reliable answers to the questions being posed.
The particular complication at hand is referred to, in the world of statistics, as “serial correlation” or “autocorrelation”. In this case, it means that the effective size of the temperature dataset is considerably smaller than one would estimate based purely on the number of years available. There are N=136 years of annual global temperature data from 1880-2015. However, when the natural correlations between neighboring years are accounted for, the effective size of the sample is a considerably smaller N’~30. That means that warm and cold periods tend to occur in stretches of roughly 4 years at a time. Runs of several cold or warm years are far more likely to happen based on chance alone than one would estimate under the incorrect assumption that natural temperature fluctuations are uncorrelated from one year to the next.
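The shrinkage from N to N’ can be illustrated with the standard rule of thumb for a first-order autoregressive (AR(1)) process, where the effective number of independent samples is N(1−ρ)/(1+ρ) for lag-1 autocorrelation ρ. This is a sketch with an illustrative ρ chosen to reproduce the numbers above; the study itself used a more sophisticated statistical model.

```python
def effective_sample_size(n, rho):
    """AR(1) rule of thumb: n_eff = n * (1 - rho) / (1 + rho)."""
    return n * (1 - rho) / (1 + rho)

n = 136     # annual values, 1880-2015
rho = 0.64  # illustrative lag-1 autocorrelation (an assumption, not from the paper)
print(round(effective_sample_size(n, rho)))  # -> 30 effectively independent samples
```

With ρ around 0.64, 136 years of data carry only about as much information as 30 independent measurements, which is why runs of warm years are much less surprising than the naive calculation suggests.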
One can account for such effects by using a more sophisticated statistical model that faithfully reproduces the characteristics of actual natural climate variability. My co-authors and I used such an approach to more rigorously assess the likelihood of recent runs of record-breaking temperatures. We have now reported our findings in an article just published in the Nature journal Scientific Reports. With the study having come out shortly after the New Year, we are able to update its results to include the new record 2015 temperatures.
Our approach combines information from the state-of-the-art climate model simulations used in the most recent report of the Intergovernmental Panel on Climate Change (IPCC) with historical observations of global and Northern Hemisphere (NH) average temperature. Averaging over the various model simulations provides an estimate of the ‘forced’ component of temperature change–the component that is driven by external natural (i.e. volcanic and solar) and human (emission of greenhouse gases and pollutants) factors.
Using the statistical model, we generate a million alternative versions of the original series, called ‘surrogates’, each of which has the same basic statistical properties as the original series, but differs in its historical details, i.e. the magnitude and sequence of individual annual temperature values. Adding the forced component of natural temperature change (due to volcanic and solar impacts) to each of these surrogates yields an ensemble of a million surrogates for the total natural component of temperature variation.
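The surrogate idea can be sketched with a simple AR(1) generator fitted to a series’ mean, variance, and lag-1 autocorrelation. This toy version stands in for the paper’s more elaborate statistical model; the function name and setup here are illustrative, not the study’s actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1_surrogates(series, n_surrogates):
    """Generate surrogates sharing the mean, variance, and lag-1
    autocorrelation of `series` (a toy stand-in for the paper's model)."""
    x = np.asarray(series, dtype=float)
    mu, var = x.mean(), x.var()
    rho = np.corrcoef(x[:-1], x[1:])[0, 1]
    n = len(x)
    # innovation variance chosen so the AR(1) process has variance `var`
    eps = rng.normal(0.0, np.sqrt(var * (1 - rho**2)), size=(n_surrogates, n))
    out = np.empty((n_surrogates, n))
    out[:, 0] = rng.normal(0.0, np.sqrt(var), size=n_surrogates)
    for t in range(1, n):
        out[:, t] = rho * out[:, t - 1] + eps[:, t]
    return out + mu

# Example: a toy "temperature" series with persistent year-to-year variation
toy = np.cumsum(rng.normal(size=50)) * 0.1
surr = ar1_surrogates(toy, 1000)   # 1000 alternative histories
```

Each surrogate shares the basic statistical properties of the input but differs in its historical details, which is exactly the property the tabulation step relies on.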
These surrogates can be compared with the estimated natural component of the actual NH series as well as the full NH series itself (Figure 2). Tabulating results from the surrogates, which provide a million alternative, realistic scenarios of natural climate variability, we are able to diagnose how often a given run of record temperatures is likely to have arisen naturally. Our just-published study, having been completed before the 2015 numbers were in, analyzed the data available through 2014, assessing the likelihood of 9 of the warmest 10 and 13 of the warmest 15 years having each occurred since 2000.
Updating the analysis to include 2015, we find that the record temperature run is even less likely to have arisen from natural variability. The odds are no greater than 1-in-300,000 that 14 of the 16 warmest years would have occurred since 2000 for the NH. The odds of back-to-back records (something we hadn’t seen in several decades), as witnessed with 2014 and 2015, are roughly 1-in-1500.
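The tabulation step itself is straightforward Monte Carlo counting. Here is a self-contained sketch that generates AR(1) surrogates (with an illustrative autocorrelation, not the paper’s fitted model) and counts how often at least 14 of the 16 warmest “years” land in the final 16 of a 136-year series by natural chance.

```python
import numpy as np

rng = np.random.default_rng(1)
n_years, n_surr, rho = 136, 50_000, 0.64   # rho is an illustrative choice

# Simulate AR(1) noise: x_t = rho * x_{t-1} + eps_t, unit innovation variance.
x = np.zeros((n_surr, n_years))
x[:, 0] = rng.normal(0, 1 / np.sqrt(1 - rho**2), n_surr)  # stationary start
for t in range(1, n_years):
    x[:, t] = rho * x[:, t - 1] + rng.normal(0, 1, n_surr)

# For each surrogate, how many of its 16 warmest years fall in the last 16?
top16 = np.argpartition(x, -16, axis=1)[:, -16:]   # indices of the 16 largest
hits = (top16 >= n_years - 16).sum(axis=1)         # how many are "since 2000"
p = (hits >= 14).mean()
print(f"fraction of surrogates with >=14 of top 16 in the last 16 years: {p:.2e}")
```

Even with realistic year-to-year persistence, such a clustered run of records essentially never happens by chance in a stationary series, which is the qualitative point behind the 1-in-300,000 figure.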
We can also use the surrogates to assess the likelihoods of individual annual temperature records, such as those that occurred during 1998, 2005, 2010, 2014 and now 2015. Here we require not only that particular years are warmer than certain previous years, but that they reach a particular threshold of warmth. This is even less likely to happen in the absence of global warming, for reasons that are obvious from Figure 2: the natural temperature series almost never exceeds a maximum value of 0.4C relative to the long-term average, while the warmest actual year–2015–exceeds 1C. For none of the record-setting years–1998, 2005, 2010, 2014, or 2015–do the odds exceed 1-in-a-million for temperatures having reached the levels they did due to chance alone, for either the NH or global mean temperature.
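The “less than 1-in-a-million” flavor of that result follows from a back-of-envelope Gaussian calculation. Assume, purely for illustration (these numbers are not from the paper), that the natural series has a standard deviation of about 0.2C, consistent with its maximum rarely exceeding 0.4C over roughly 30 effectively independent samples. A 1C anomaly is then a five-sigma event:

```python
from math import erfc, sqrt

sigma = 0.2    # assumed std dev of natural NH anomalies (illustrative)
anomaly = 1.0  # the 2015 anomaly relative to the long-term average
z = anomaly / sigma

# Upper-tail probability of a standard normal at z = 5:
p = 0.5 * erfc(z / sqrt(2))
print(f"chance of a single year this warm by natural chance alone: {p:.1e}")
# -> roughly 3e-07, i.e. well under 1-in-a-million
```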
How do those same runs look once human-caused warming is taken into account? Using data through 2014, we estimate a 76% likelihood that 13 of the warmest 15 years would occur since 2000 for the NH. Updating the analysis to include 2015, we find there is likewise a 76% likelihood that 14 of the 16 warmest years would occur since 2000. The likelihood of back-to-back records in the two most recent years, 2014 and 2015, is just over 8%: still a bit of a fluke, but hardly out of the question.
As for individual record years, we find that the 1998, 2005, 2010, 2014, and 2015 records had likelihoods of 7%, 18%, 23%, 40%, and 7%, respectively. So while the 2014 temperature record had nearly even odds of occurring, the 2015 record had relatively long odds.
There is good reason for that. The 2015 temperature didn’t just beat the previous record, but smashed it, coming in nearly 0.2C warmer than 2014. The 2015 warmth was boosted by an unusually large El Niño event–indeed, by some measures, the largest on record. A similar story holds for 1998, which coincided with what was, prior to 2015, the largest El Niño on record. That event, too, boosted 1998 warmth, which beat the previous record (1995), again by a whopping 0.2C. Each of the two monster El Niño events was, in a statistical sense, somewhat of a fluke. And each of them imparted considerably greater large-scale warmth than would have been expected from global warming alone.
That analysis, however, neglects one intriguing possibility. Could it be that human-caused climate change is actually boosting the magnitude of El Niño events themselves, leading to more monster events like the ’98 and ’15 events? That proposition indeed finds some support in the recent peer-reviewed literature. If the hypothesis turns out to be true, then the record warmth of ’98 and ’15 might not have been flukes after all.
To summarize, we find that the various record temperatures and runs of unusually warm years since 2000 are extremely unlikely to have happened in the absence of human-caused climate change, and reasonably likely to have happened when we account for climate change. We can, in this sense, attribute the record warmth to human-caused climate change at a high level of confidence.
Will the onslaught of record-breaking temperatures finally put the discredited “global warming has stopped” talking point to rest? Alas, probably not.
But the next time you hear someone call into question the threat of human-caused climate change, you might explain to them that the likelihood we would be witnessing the recent record warmth in the absence of human-caused climate change is somewhere between one-in-a-thousand and one-in-a-million. You might ask them: Would you really gamble away the future of our planet with those sorts of odds?