RealClimate – Climate science from climate scientists (http://www.realclimate.org)

Mitigation of Climate Change – Part 3 of the new IPCC report
http://www.realclimate.org/index.php/archives/2014/04/mitigation-of-climate-change-part-3-of-the-new-ipcc-report/
Thu, 17 Apr 2014 08:56:03 +0000

Guest post by Brigitte Knopf
Global emissions continue to rise, primarily because of economic growth and, to a lesser extent, population growth. To achieve climate protection, fossil power generation without CCS has to be phased out almost entirely by the end of the century. The mitigation of climate change constitutes a major technological and institutional challenge. But: it does not cost the world to save the planet.

This is how the new report was summarized by Ottmar Edenhofer, Co-Chair of Working Group III of the IPCC, whose report was adopted on 12 April 2014 in Berlin after intense debates with governments. The report consists of 16 chapters with more than 2000 pages. It was written by 235 authors from 58 countries and reviewed externally by 900 experts. Most prominent in the public eye is the 33-page Summary for Policymakers (SPM), which was approved by all 193 countries. At first glance, the above summary does not sound spectacular – more like a truism that we’ve often heard over the years. But this report indeed has something new to offer.

The 2-degree limit

For the first time, a detailed analysis was performed of how the 2-degree limit can be kept, based on over 1200 future projections (scenarios) by a variety of different energy-economy computer models. The analysis is not just about the 2-degree guardrail in the strict sense but evaluates the entire space between 1.5 degrees Celsius, a limit demanded by small island states, and a 4-degree world. The scenarios show a variety of pathways, characterized by different costs, risks and co-benefits. The result is a table with about 60 entries that translates the requirements for limiting global warming to below 2-degrees into concrete numbers for cumulative emissions and emission reductions required by 2050 and 2100. This is accompanied by a detailed table showing the costs for these future pathways.

The IPCC expresses the costs as consumption losses relative to a hypothetical ‘business-as-usual’ case. The table shows not only the median of all scenarios but also the spread among the models. It turns out that the costs appear moderate in the medium term, until 2030 and 2050, but that towards 2100 a large spread opens up, and under specific circumstances consumption losses of up to 11% could be faced. However, translated into a reduction of the growth rate, these numbers are actually quite low: ambitious climate protection would cost only 0.06 percentage points of growth each year. Instead of a growth rate of about 2% per year, we would see a growth rate of 1.94% per year, so economic growth would merely continue at a slightly slower pace. However, and this is also said in the report, the distributional effects of climate policy between different countries can be very large. There will be countries that would have to bear much higher costs, because they could no longer use or sell their coal and oil resources, or because they have only limited potential to switch to renewable energy.
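The compounding effect of that growth-rate reduction is easy to check with back-of-the-envelope arithmetic. In this sketch the 2% baseline and the 0.06-point penalty are the round numbers quoted above; the 86-year horizon (roughly 2014 to 2100) is an assumption for illustration:

```python
# Compare consumption paths with and without ambitious mitigation.
# Baseline growth of 2%/yr and a 0.06-percentage-point penalty are the
# round numbers quoted above; 86 years is roughly 2014-2100.
baseline_rate = 0.02
mitigated_rate = 0.02 - 0.0006
years = 86

baseline = (1 + baseline_rate) ** years
mitigated = (1 + mitigated_rate) ** years
loss = 1 - mitigated / baseline

print(f"Consumption loss by ~2100: {loss:.1%}")
```

With these inputs the compounded loss comes out just under 5%, which is why a seemingly tiny annual growth penalty is consistent with the multi-percent long-term consumption losses in the scenario table.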

The technological challenge

Furthermore – and this is new and important compared to the last report of 2007 – the costs are shown not only for the case where all technologies are available, but also for how they increase if, for example, nuclear power were dispensed with worldwide, or if solar and wind energy remain more expensive than expected.

The results show that it would still be economically and technically possible to remain below the 2-degree temperature increase, but this will require rapid and global action, and some technologies would be key:

Many models could not achieve atmospheric concentration levels of about 450 ppm CO2eq by 2100, if additional mitigation is considerably delayed or under limited availability of key technologies, such as bioenergy, CCS, and their combination (BECCS).

Probably not everyone likes to hear that CCS is a very important technology for keeping to the 2-degree limit and the report itself cautions that CCS and BECCS are not yet available at a large scale and also involve some risks. But it is important to emphasize that the technological challenges are similar for less ambitious temperature limits.

The institutional challenge

Of course, climate change is not just a technological issue but is described in the report as a major institutional challenge:

Substantial reductions in emissions would require large changes in investment patterns

Over the next two decades, these investment patterns would have to shift towards low-carbon technologies and greater energy efficiency (see Figure 1). In addition, there is a need for dedicated policies to reduce emissions, such as emissions trading systems of the kind that already exist in Europe and a handful of other countries.

Since AR4, there has been an increased focus on policies designed to integrate multiple objectives, increase co‐benefits and reduce adverse side‐effects.

The growing number of national and sub-national policies, for instance at the level of cities, means that in 2012, 67% of global GHG emissions were subject to national legislation or strategies, compared to only 45% in 2007. Nevertheless, and the SPM states this clearly, no trend reversal of emissions is in sight – instead, a global increase of emissions is observed.


Figure 1: Change in annual investment flows from the average baseline level over the next two decades (2010 to 2029) for mitigation scenarios that stabilize concentrations within the range of approximately 430–530 ppm CO2eq by 2100. Source: SPM, Figure SPM.9

 

Trends in emissions

A particularly interesting analysis, showing from which countries these emissions originate, was removed from the SPM at the intervention of some governments, as it shows a regional breakdown of emissions that was not in the interest of every country (see media coverage here or here). These figures are still available in the underlying chapters and the Technical Summary (TS), where government representatives cannot intervene and science can speak freely. One of these figures shows very clearly that over the last 10 years, emissions in upper-middle-income countries – including, for example, China and Brazil – have increased, while emissions in high-income countries – including Germany – have stagnated (see Figure 2). Since income, alongside population growth, is the main driver of emissions, regional emissions growth can only be understood by taking the income development of countries into account.

Historically, before 1970, emissions came mainly from the industrialized countries. With the regional shift of economic growth, emissions growth has now moved to the upper-middle-income countries (see Figure 2), while the industrialized countries have stabilized at a high level. The condensed message of Figure 2 does not look promising: all countries seem to follow the path of the industrialized countries, with no “leap-frogging” over fossil-based development directly to a world of renewables and energy efficiency observed so far.


Figure 2: Trends in GHG emissions by country income groups. Left panel: Total annual anthropogenic GHG emissions from 1970 to 2010 (GtCO2eq/yr). Middle panel: Trends in annual per capita mean and median GHG emissions from 1970 to 2010 (tCO2eq/cap/yr). Right panel: Distribution of annual per capita GHG emissions in 2010 of countries within each income group (tCO2/cap/yr). Source: TS, Figure TS.4

 

But the fact that emissions today are rising especially in countries like China is only one side of the coin. Part of the growth in CO2 emissions in the low- and middle-income countries is due to the production of consumption goods intended for export to the high-income countries (see Figure 3). Put in plain language: part of the growth of Chinese emissions is due to the fact that the smartphones used in Europe or the US are produced in China.


Figure 3: Total annual CO2 emissions (GtCO2/yr) from fossil fuel combustion for country income groups attributed on the basis of territory (solid line) and final consumption (dotted line). The shaded areas are the net CO2 trade balance (difference) between each of the four country income groups and the rest of the world. Source: TS, Figure TS.5

 

The philosophy of climate change

Besides all the technological details, this report contains a further innovation: the chapter on “Social, economic and ethical concepts and methods“, which could be called the philosophy of climate change. It emphasizes that

Issues of equity, justice, and fairness arise with respect to mitigation and adaptation. […] Many areas of climate policy‐making involve value judgements and ethical considerations.

This implies that many of these issues cannot be answered by science alone, such as the question of which temperature level avoids dangerous anthropogenic interference with the climate system, or which technologies are perceived as risky. Science can provide information about the costs, risks and co-benefits of climate change, but in the end it remains a matter of social learning and debate to find the pathway society wants to take.

Conclusion

The report contains many more details about renewable energies, sectoral strategies – for example in the electricity and transport sectors – and co-benefits of avoided climate change, such as improvements in air quality. The aim of Working Group III of the IPCC, and the Co-Chair emphasized this several times, was for scientists to act as mapmakers who help policymakers navigate the difficult terrain of the highly political issue of climate change, without being policy-prescriptive about which pathway should be taken or which is the “correct” one. This requirement has been fulfilled, and the map is now available. It remains to be seen where the policymakers will head in the future.

 

The report:

Climate Change 2014: Mitigation of Climate Change – IPCC Working Group III Contribution to AR5

 

Brigitte Knopf is head of the research group Energy Strategies Europe and Germany at the Potsdam Institute for Climate Impact Research (PIK), one of the authors of the IPCC Working Group III report, and is on Twitter as @BrigitteKnopf.

This article was translated from the German original at RC’s sister blog KlimaLounge.

 

RealClimate coverage of the IPCC 5th Assessment Report:

Summary of Part 1, Physical Science Basis

Summary of Part 2, Impacts, Adaptation, Vulnerability

Summary of Part 3, Mitigation

Sea-level rise in the AR5

Attribution of climate change to human causes

Radiative forcing of climate change

Shindell: On constraining the Transient Climate Response
http://www.realclimate.org/index.php/archives/2014/04/shindell-on-constraining-the-transient-climate-response/
Tue, 08 Apr 2014 12:25:58 +0000

Guest commentary from Drew Shindell

There has been a lot of discussion of my recent paper in Nature Climate Change (Shindell, 2014). That study addressed a puzzle: recent studies using the observed changes in Earth’s surface temperature suggested climate sensitivity is likely towards the lower end of the estimated range. However, studies evaluating model performance on key observed processes, together with paleoclimate evidence, suggest that the higher end of sensitivity is more likely, partially conflicting with the studies based on the recent transient observed warming. The new study shows that climate sensitivity to historical changes in the abundance of aerosol particles in the atmosphere is larger than the sensitivity to CO2, primarily because the aerosols are largely located near industrialized areas in the Northern Hemisphere middle and high latitudes, where they trigger more rapid land responses and strong snow and ice feedbacks. Studies based on observed warming have therefore underestimated climate sensitivity, as they did not account for the greater response to aerosol forcing, and multiple lines of evidence are now consistent in showing that climate sensitivity is in fact very unlikely to be at the low end of the range in recent estimates.

In particular, a criticism of the paper written by Nic Lewis has gotten some attention. Lewis makes a couple of potentially interesting points, the chief of which concern the magnitude and uncertainty of the aerosol forcing I used and the time period over which the calculation is done; I address these issues here. There are also a number of less substantive points in his piece that I will not bother with.

Lewis states that “The extensive adjustments made by Shindell to the data he uses are a source of concern. One of those adjustments is to add +0.3 W/m² to the figures used for model aerosol forcing to bring the estimated model aerosol forcing into line with the AR5 best estimate of -0.9 W/m².” Indeed the estimate of aerosol forcing used in the calculation of transient climate response (TCR) in the paper does not come directly from climate models, but instead incorporates an adjustment to those models so that the forcing better matches the assessed estimates from the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC). An adjustment is necessary because, as climate models are continually evaluated against observations, evidence has emerged that their aerosol-cloud interactions are too strong (i.e. the models’ ‘aerosol indirect effect’ is larger than inferred from observations). There have been numerous papers on this topic, and the issue was thoroughly assessed in IPCC AR5 chapter 7. The assessed best estimate was that the historical negative aerosol forcing (radiation and cloud effects, but not black carbon on snow/ice) was too strong by about 0.3 W/m² in the models that included that effect, a conclusion very much in line with a prior publication on climate sensitivity by Otto et al. (2013). Given the numerous scientific studies on this topic, there is ample support for the conclusion that models overestimate the magnitude of aerosol forcing, though the uncertainty in aerosol forcing (which is incorporated into the analysis in the paper) is large, especially in comparison with CO2 forcing, which can be better constrained by observations.

The second substantive point Lewis raised relates to the time period over which the TCR is evaluated. The IPCC emphasizes forcing estimates relative to 1750 since most of the important anthropogenic impacts are thought to have been small at that time (biomass burning may be an exception, but appears to have a relatively small net forcing). Surface temperature observations become sparser going back further in time, however, and the most widely used datasets only go back to 1880 or 1850. Radiative forcing, especially that due to aerosols, is highly uncertain for the period 1750-1850 as there is little modeling and even less data to constrain those models. The AR5 gives a value for 1850 aerosol forcing (relative to 1750) (Annex II, Table AII.1.2) of -0.178 W/m² for direct+indirect (radiation+clouds). There is also a BC snow forcing of 0.014 W/m², for a total of -0.164 W/m². While these estimates are small, they are nonetheless very poorly constrained.

Hence there are two logical choices for an analysis of TCR. One could assume that there was minimal global mean surface temperature change between 1750 and 1850, as some datasets suggest, and compare the 1850-2000 temperature change with the full 1750-2000 forcing estimate, as in my paper and Otto et al. In this case, aerosol forcing over 1750-2000 is used.

Alternatively, one could assume we can estimate forcing during this early period realistically enough to remove it from the longer 1750-2000 estimates, and so compare forcing and response over 1850-2000. In this case, this must be done for all forcings, not just for the aerosols. The well-mixed greenhouse gas forcing in 1850 is 0.213 W/m²; including solar and stratospheric water vapour, that becomes 0.215 W/m². Land use (LU) and ozone forcings almost exactly cancel one another. So to adjust from 1750-2000 to 1850-2000 forcings, one must remove 0.215 W/m² and also remove the -0.164 W/m² aerosol forcing, multiplying the latter by its impact relative to that of well-mixed greenhouse gases (~1.5), which gives about -0.25 W/m².

If this is done consistently, the denominator of the climate sensitivity calculation containing total forcing barely changes, and hence the TCR results are essentially the same (a change of only 0.03°C). Lewis’ claim that my TCR results are mistaken because they did not account for 1750-1850 aerosol forcing is incorrect, because he fails to use consistent time periods for all forcing agents. The results are in fact quite robust to either analysis option, provided it is applied consistently.
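The bookkeeping argument can be made concrete with a toy energy-budget calculation of the general form TCR = F_2x·ΔT/(F_GHG + E·F_other). The 0.215 W/m² and -0.164 W/m² adjustments are the values quoted above; the warming and total-forcing inputs are illustrative round numbers, not the actual values used in Shindell (2014):

```python
# Toy energy-budget TCR estimate with an enhanced aerosol response E,
# used here only to show that the 1750 vs. 1850 start date barely matters
# when BOTH forcing terms are adjusted consistently.
F_2X = 3.7  # W/m^2, forcing from a doubling of CO2

def tcr(dT, f_ghg, f_other, E):
    """TCR = F_2x * dT / (F_ghg + E * F_other), where f_other is the
    aerosol+ozone+land-use forcing scaled by its efficacy E."""
    return F_2X * dT / (f_ghg + E * f_other)

dT = 0.8   # K, illustrative observed warming
E = 1.5    # illustrative enhancement factor

# Option 1: forcings over 1750-2000 (illustrative totals).
t1750 = tcr(dT, f_ghg=2.8, f_other=-0.60, E=E)

# Option 2: forcings over 1850-2000 -- remove the 1750-1850 pieces from
# BOTH terms: 0.215 W/m^2 (GHG + solar + strat. water) and -0.164 W/m^2
# (aerosols), as argued above.
t1850 = tcr(dT, f_ghg=2.8 - 0.215, f_other=-0.60 + 0.164, E=E)

print(round(t1750, 2), round(t1850, 2), round(abs(t1750 - t1850), 2))
```

With these inputs the two accounting choices differ by only a few hundredths of a degree, mirroring the ~0.03°C robustness quoted in the post.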

Lewis also discusses the uncertainty in aerosol forcing and in the degree to which the response to aerosols is enhanced relative to the response to CO2. Much of this discussion follows a common pattern of combing through the peer-reviewed paper for all the caveats and discussion points, and then repeating them back as if they undermine the paper’s conclusions, rather than recognizing that they are uncertainties that were already taken into account. It is important to realize that the results presented in the paper include both the uncertainty in the aerosol forcing and the uncertainty in the enhancement of the response to aerosol forcing, as explicitly stated. Hence any claim that the uncertainty in the presented results is underestimated, on the grounds that the (included) uncertainty in these two components is large, is groundless.

In fact, this is an important issue to keep in mind as Lewis also argues that the climate models do not provide good enough information to determine the value of the enhanced aerosol response (the parameter I call E in the paper, where E is the ratio of the global mean temperature response to aerosol forcing versus the response to the same global mean magnitude of CO2 forcing, so that E=1.5 would be a 50% stronger response to aerosols). While the models indeed are imperfect and have uncertainties, they provide the best available method we have to determine the value of E as this cannot be isolated from observations directly. Furthermore, basic physical understanding supports the modeled value of E being substantially greater than 1, as deep oceans clearly take longer to respond than the land surface, so the Northern Hemisphere, with most of the world’s land, will respond more rapidly than the Southern Hemisphere with more ocean. Quantifying the value of E accurately is difficult, and the variation across the models is substantial, primarily reflecting our incomplete knowledge of aerosol forcing. This leads to a range of E quoted in the paper of 1.18 to 2.43. I used this range, assuming a lognormal distribution, along with the mean value of 1.53, in the calculation for the TCR.

Lewis then argues that the large uncertainty ranges in E and in aerosol forcing make the TCR estimates “worthless”. While “worthless” is a little strong, it is important to fully assess uncertainties when trying to constrain any property of the real world. It is worth noting that Lewis co-authored a recent report claiming that TCR could in fact be constrained to be low. That report relies on studies that include the large aerosol forcing uncertainty, so criticizing my paper on that ground would be inconsistent. However, Lewis’ study assumed that all forcings induce the same response in global mean temperature as CO2. This is equivalent to assuming that E is exactly 1.0 with NO uncertainty whatsoever. That is a reasonable first guess in the absence of evidence to the contrary, but as my paper showed, there is evidence indicating that assumption is biased.

But while Lewis argues that the uncertainty in E is large and climate models do not give the value as accurately as we’d like, that does not justify ignoring that uncertainty entirely. Instead, we need to characterize that uncertainty as best we can and propagate that through the calculation (as can be seen in the figure below). The real question is not whether climate models provide us perfect information (they do not), but rather whether they provide better information than some naïve prior assumption. In this case, it is clear that they do.



Figure shows representative probability distribution functions for TCR using the numbers from Shindell (2014) in a Monte Carlo calculation (Gaussian for Fghg and dTobs, lognormal fits for the skewed distributions for Faerosol+ozone+LU and E). The green line is if you assume exactly no difference between the effects of aerosols and GHGs; Red is if you estimate that difference using climate models; Dashed red is the small difference made by using a different start date (1850 instead of 1750).
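The construction of such probability distribution functions can be sketched in a few lines of Monte Carlo. The distribution shapes (Gaussian for the GHG forcing and observed warming, lognormal for the skewed aerosol+ozone+LU forcing and for E) follow the figure caption; the specific distribution parameters below are illustrative stand-ins, not the fits used for the actual figure:

```python
import math
import random

random.seed(42)
F_2X = 3.7  # W/m^2, forcing from a doubling of CO2

def sample_tcr():
    """One Monte Carlo draw of TCR = F_2x * dT / (F_ghg + E * F_other)."""
    while True:
        f_ghg = random.gauss(2.8, 0.3)    # Gaussian GHG forcing
        dT = random.gauss(0.8, 0.1)       # Gaussian observed warming
        E = random.lognormvariate(math.log(1.53), 0.18)  # skewed efficacy
        f_other = -random.lognormvariate(math.log(0.60), 0.30)  # skewed aerosol+ozone+LU
        denom = f_ghg + E * f_other
        if denom > 0.3:  # guard against rare unphysical near-zero denominators
            return F_2X * dT / denom

samples = sorted(sample_tcr() for _ in range(100_000))
median = samples[len(samples) // 2]
lo = samples[int(0.05 * len(samples))]
hi = samples[int(0.95 * len(samples))]
print(f"TCR median ~ {median:.2f} K (5-95%: {lo:.2f}-{hi:.2f} K)")
```

Setting E to exactly 1 with no spread reproduces the qualitative behavior of the green curve: the distribution shifts towards lower TCR, which is the whole point of the comparison.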

This highlights the critical distinction in our reasoning: I fully support the basic methods used in prior work such as Otto et al., and have simply quantified an additional physical factor within the existing methodology. I am, however, confused that Lewis on one hand now appears to object to the basic method used in prior work, in which the authors first adjusted aerosol forcing, then included its uncertainty, and finally quantified estimates of TCR. Yet on the other hand, he not only co-authored the Otto et al. paper but released a report praising that study just three days before the publication of my paper.

For completeness, I should acknowledge that Lewis correctly identified a typo in the last row of the first column of Table S2, which has been corrected in the posted version, where the computer codes used in the calculations are also available. The climate model output itself is already publicly available at the CMIP5 website (also linked on that page).

Finally, I note that the conclusions of the paper send a sobering message. It would be nice if sensitivity was indeed quite low and society could get away with smaller emission cuts to stabilize climate. Unfortunately, several lines of independent evidence now agree that this is not the case.

References

  1. D.T. Shindell, "Inhomogeneous forcing and transient climate sensitivity", Nature Climate change, vol. 4, pp. 274-277, 2014. http://dx.doi.org/10.1038/nclimate2136
  2. A. Otto, F.E.L. Otto, O. Boucher, J. Church, G. Hegerl, P.M. Forster, N.P. Gillett, J. Gregory, G.C. Johnson, R. Knutti, N. Lewis, U. Lohmann, J. Marotzke, G. Myhre, D. Shindell, B. Stevens, and M.R. Allen, "Energy budget constraints on climate response", Nature Geosci, vol. 6, pp. 415-416, 2013. http://dx.doi.org/10.1038/ngeo1836
Unforced variations: Apr 2014
http://www.realclimate.org/index.php/archives/2014/04/unforced-varaitions-apr-2014/
Sun, 06 Apr 2014 15:02:40 +0000

More open thread. Unusually, we are keeping the UV Mar 2014 thread open for more Diogenetic conversation, so that this thread can host more varied fare.

Impacts of Climate Change – Part 2 of the new IPCC Report has been approved
http://www.realclimate.org/index.php/archives/2014/04/impacts-of-climate-change-part-2-of-the-new-ipcc-report-has-been-approved/
Fri, 04 Apr 2014 08:41:36 +0000

The second part of the new IPCC Report has been approved – as usual after lengthy debates – by government delegations in Yokohama (Japan) and is now public. Perhaps the biggest news is this: the situation is no less serious than it was at the time of the previous report in 2007. Nonetheless there is progress in many areas, such as a better understanding of observed impacts worldwide and of the specific situation of many developing countries. There is also a new assessment of “smart” options for adaptation to climate change. The report clearly shows that adaptation is an option only if efforts to mitigate greenhouse gas emissions are strengthened substantially. Without mitigation, the impacts of climate change will be devastating.

Guest post by Wolfgang Cramer
On all continents and across the oceans

Impacts of anthropogenic climatic change are observed worldwide and have been linked to observed climate using rigorous methods. Such impacts have occurred in many ecosystems on land and in the ocean, in glaciers and rivers, and they concern food production and the livelihoods of people in developing countries. Many changes occur in combination with other environmental problems (such as urbanization, air pollution, biodiversity loss), but the role of climate change in them emerges more clearly than before.


Fig. 1 Observed impacts of climate change during the period since publication of the IPCC Fourth Assessment Report 2007

 

During the presentation of this map for approval in Yokohama, many delegates asked why there are not many more impacts shown on it. This is because the authors listed only those cases where solid scientific analysis allowed attribution. An important implication is that the absence of icons from the map may well be due to missing data (such as in parts of Africa) and certainly does not imply an absence of impacts in reality. Compared to the earlier report in 2007, a new element of these documented findings is that impacts on crop yields are now clearly identified in many regions, including Europe. Improved irrigation and other technological advances have so far helped avoid shrinking yields in many cases – but the yield increase normally expected from technological improvements is leveling off rapidly.

 

A future of increasing risks

More than previous IPCC reports, the new report deals with future risks. Among other things, it seeks to identify those situations where adaptation could become unfeasible and damages therefore become inevitable. A general finding is that “high” scenarios of climate change (those where global mean temperature reaches four degrees C or more above preindustrial conditions – a situation that is not at all unlikely according to part one of the report) will likely result in catastrophic impacts on most aspects of human life on the planet.


Fig. 2 Risks for various systems with high (blue) or low (red) efforts in climate change mitigation

 

These risks concern entire ecosystems, notably those of the Arctic and the warm-water corals around the world (the latter a crucial resource for fisheries in many developing countries), the global loss of biodiversity, and also the working conditions of many people in agriculture (the report offers many details from various regions). Limiting global warming to 1.5-2.0 degrees C through aggressive emission reductions would not avoid all of these damages, but the risks would be significantly lower. A similar chart was shown in earlier reports, but the assessment of risks is now, based on the additional scientific knowledge available, more alarming than before, a point expressed most prominently by the deep red color in the first bar.

 

Food security increasingly at risk

In the short term, warming may improve agricultural yields in some cooler regions, but significant reductions are highly likely to dominate in later decades of the present century, particularly for wheat, rice and maize. The illustration is an example of the assessment of numerous studies in the scientific literature, showing that, from 2030 onwards, significant losses are to be expected. This should be seen in the context of already existing malnutrition in many regions, a growing problem also in the absence of climate change, due to growing populations, increasing economic disparities and the continuing shift of diet towards animal protein.


Fig. 3 Studies indicating increased crop yields (blue) or reduced crop yields (brown), accounting for various scenarios of climate change and technical adaptation

 

The situation for global fisheries is comparably bleak. While some regions, such as the North Atlantic, might allow larger catches, there is a loss of marine productivity to be expected in nearly all tropical waters, caused by warming and acidification. This affects poor countries in South-East Asia and the Pacific in particular. Many of these countries will also be affected disproportionately by the consequences of sea-level rise for coastal mega-cities.


Fig. 4 Change in maximum fish catch potential 2051-2060 compared to 2001-2010 for the climate change scenario SRES A1B

 

Urban areas in developing countries particularly affected

Nearly all developing countries experience significant growth in their mega-cities – but it is here that higher temperatures and limited potential for technical adaptation have the largest effect on people. Improved urban planning, focusing on the resilience of residential areas and transport systems of the poor, can deliver important contributions to adaptation. This would also have to include better preparation for the regionally rising risks from typhoons, heat waves and floods.

 

Conflicts in a warmer climate

It has been pointed out that no direct evidence is available to connect the occurrence of violent conflict to observed climate change. But recent research has shown that it is likely that dry and hot periods may have been contributing factors. Studies also show that the use of violence increases with high temperatures in some countries. The IPCC therefore concludes that enhanced global warming may significantly increase risks of future violent conflict.

 

Climate change and the economy

Studies estimate the impact of future climate change at around a few percent of global income, but these numbers are hugely uncertain. More importantly, any economic losses will be most tangible for countries, regions and social groups that are already disadvantaged compared to others. It is therefore to be expected that the economic impacts of climate change will push large additional numbers of people into poverty and the risk of malnutrition, due to various factors including rising food prices.

 

Options for adaptation to the impacts of climate change

The report underlines that there is no globally applicable “one-size-fits-all” concept for adaptation; instead, context-specific solutions must be sought. Smart solutions can provide opportunities to enhance the quality of life and local economic development in many regions, which would in turn reduce vulnerabilities to climate change. It is important that such measures account for cultural diversity and the interests of indigenous people. It is also becoming increasingly clear that policies that reduce emissions of greenhouse gases (e.g., through more sustainable agricultural techniques or avoided deforestation) need not conflict with adaptation to climate change. Both can significantly improve the livelihoods of people in developing countries, as well as their resilience to climate change.

It is beyond doubt that unabated climate change will exhaust the potential for adaptation in many regions – particularly for the coastal regions in developing countries where sea-level rise and ocean acidification cause major risks.

The summary of the report can be found here; the entire report with all underlying chapters is also online. In addition, there is a nicely crafted background video.

Wolfgang Cramer is scientific director of the Institut Méditerranéen de Biodiversité et d’Ecologie marine et continentale (IMBE) in Aix-en-Provence and one of the authors of the IPCC Working Group 2 report.

This article was translated from the German original at RC’s sister blog KlimaLounge.

 

Weblink

Here is our summary of part 1 of the IPCC report.

]]>
http://www.realclimate.org/index.php/archives/2014/04/impacts-of-climate-change-part-2-of-the-new-ipcc-report-has-been-approved/feed/langswitch_lang/en/ 173
IPCC WG2 report now out http://www.realclimate.org/index.php/archives/2014/03/ipcc-wg2-report-now-out/ http://www.realclimate.org/index.php/archives/2014/03/ipcc-wg2-report-now-out/#comments Mon, 31 Mar 2014 01:29:20 +0000 http://www.realclimate.org/?p=17113

Instead of speculations based on partial drafts and attempts to spin the coverage ahead of time, you can now download the final report of the IPCC WG2: “Climate Change 2014: Impacts, Adaptation, and Vulnerability” directly. The Summary for Policy Makers is here, while the whole report is also downloadable by chapter. Notably, there are FAQs for the whole report and for each chapter that give a relatively easy way in to the details. Note too that these are the un-copyedited final versions; minor edits, corrections and coherent figures will be forthcoming in the final published versions. (For reference, the WG1 report was released in Sept 2013, but only in final published form in Jan 2014.) Feel free to link to interesting takes on the report in the comments.

]]>
http://www.realclimate.org/index.php/archives/2014/03/ipcc-wg2-report-now-out/feed/langswitch_lang/en/ 130
Unforced variations: Mar 2014. Part II http://www.realclimate.org/index.php/archives/2014/03/unforced-variations-mar-2014-part-ii/ http://www.realclimate.org/index.php/archives/2014/03/unforced-variations-mar-2014-part-ii/#comments Fri, 28 Mar 2014 23:34:32 +0000 http://www.realclimate.org/?p=17103

This is the mid-month open thread for all discussions, except those related to Diogenes’ comments. People wanting to discuss with commenter Diogenes should stick to the previous UV thread. All such discussion on this thread will be moved over. Thanks.

]]>
http://www.realclimate.org/index.php/archives/2014/03/unforced-variations-mar-2014-part-ii/feed/langswitch_lang/en/ 90
The most common fallacy in discussing extreme weather events + Update http://www.realclimate.org/index.php/archives/2014/03/the-most-common-fallacy-in-discussing-extreme-weather-events/ http://www.realclimate.org/index.php/archives/2014/03/the-most-common-fallacy-in-discussing-extreme-weather-events/#comments Tue, 25 Mar 2014 16:40:58 +0000 http://www.realclimate.org/?p=17093

Does global warming make extreme weather events worse? Here is the #1 flawed reasoning you will have seen about this question: it is the classic confusion between absence of evidence and evidence for absence of an effect of global warming on extreme weather events. Sounds complicated? It isn’t. I’ll first explain it in simple terms and then give some real-life examples.

The two most fundamental properties of extreme events are that they are rare (by definition) and highly random. These two aspects (together with limitations in the data we have) make it very hard to demonstrate any significant changes. And they make it very easy to find all sorts of statistics that do not show an effect of global warming – even if it exists and is quite large.

Would you have been fooled by this?

Imagine you’re in a sleazy, smoky pub and a stranger offers you a game of dice, for serious money. You’ve been warned and have reason to suspect they’re using a loaded dice that rolls a six twice as often as normal. But the stranger says: “Look here, I’ll show you: this is a perfectly normal dice!” And he rolls it a dozen times. There are two sixes in those twelve trials – just as you’d expect on average from a normal dice. Are you convinced all is normal?

You shouldn’t be, because this experiment is simply inconclusive. It shows no evidence for the dice being loaded, but neither does it provide real evidence against your prior suspicion that the dice is loaded. There is a good chance of this outcome even if the dice is massively loaded (i.e. with a 1 in 3 chance to roll a six). On average you’d expect 4 sixes then, but 2 is not uncommon either. With a normal dice, the chance to get exactly two sixes in this experiment is 30%; with the loaded dice it is 13%[i]. From twelve tries you simply don’t have enough data to tell.
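For readers who want to check these numbers, the binomial calculation behind them is short. A minimal sketch in Python:

```python
from math import comb

def prob_m_sixes(k, m, p_six):
    """Binomial probability of rolling exactly m sixes in k independent rolls,
    when each roll shows a six with probability p_six."""
    return comb(k, m) * p_six**m * (1 - p_six)**(k - m)

p_fair = prob_m_sixes(12, 2, 1/6)    # normal dice: six with probability 1/6
p_loaded = prob_m_sixes(12, 2, 1/3)  # loaded dice: six with probability 1/3

print(f"fair: {p_fair:.0%}, loaded: {p_loaded:.0%}")  # fair: 30%, loaded: 13%
```

With twelve rolls, both hypotheses make two sixes a quite plausible outcome, which is exactly the point: the experiment cannot distinguish them.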

Hurricanes

In 2005, leading hurricane expert Kerry Emanuel (MIT) published an analysis showing that the power of Atlantic hurricanes has strongly increased over the past decades, in step with temperature. His paper in the journal Nature happened to come out on the 4th of August – just weeks before hurricane Katrina struck. Critics were quick to point out that the power of hurricanes that made landfall in the US had not increased. While at first sight that might appear to be the more relevant statistic, it actually is a case like rolling the dice only twelve times: as Emanuel’s calculations showed, the number of landfalling storms is simply far too small to get a meaningful result, as those data represent “less than a tenth of a percent of the data for global hurricanes over their whole lifetimes”. Emanuel wrote at the time (and later confirmed in a study): “While we can already detect trends in data for global hurricane activity considering the whole life of each storm, we estimate that it would take at least another 50 years to detect any long-term trend in U.S. landfalling hurricane statistics, so powerful is the role of chance in these numbers.” Like with the dice this is not because the effect is small, but because it is masked by a lot of ‘noise’ in the data, spoiling the signal-to-noise ratio.

Heat records

The number of record-breaking hot months (e.g. ‘hottest July in New York’) around the world is now five times as big as it would be in an unchanging climate. This has been shown by simply counting the heat records in 150,000 series of monthly temperature data from around the globe, starting in the year 1880. Five times. For each such record that occurs just by chance, four have been added thanks to global warming.

You may be surprised (like I was at first) that the change is so big after less than 1 °C global warming – but if you do the maths, you find it is exactly as expected. In 2011, in the Proceedings of the National Academy of Sciences, we described a statistical method for calculating the expected number of monthly heat records given the observed gradual changes in climate. It turns out to be five times the number expected in a stationary climate.

Given that this change is so large, that it is just what is expected and that it can be confirmed by simple counting, you’d expect this to be uncontroversial. Not so. Our paper was attacked with astounding vitriol by Roger Pielke Jr., with repeated false allegations about our method (more on this here).

Barriopedro

European summer temperatures for 1500–2010. Vertical lines show the temperature deviations from average of individual summers, the five coldest and the five warmest are highlighted. The grey histogram shows the distribution for the 1500–2002 period with a Gaussian fit shown in black. That 2010, 2003, 2002, 2006 and 2007 are the warmest summers on record is clearly not just random but a systematic result of a warming climate. But some invariably will rush to the media to proclaim that the 2010 heat wave was a natural phenomenon not linked to global warming. (Graph from Barriopedro et al., Science 2011.)

 

Heat records can teach us another subtle point. Say in your part of the world the number of new heat records has been constant during the past fifty years. So, has global warming not acted to increase their number? Wrong! In a stationary climate, the number of new heat records declines over time. (After 50 years of data, the chance that this year is the hottest is 1/50. After 100 years, this is reduced to 1/100.) So if the number has not changed, two opposing effects must have kept it constant: the natural decline, and some warming. In fact, the frequency of daily heat records has declined in most places during the past decades. But due to global warming, they have declined much less than the number of cold records, so that we now observe many more hot records than cold records. This shows how global warming can increase some aspect of extreme events even while that aspect declines over time. A curve with no trend does not demonstrate that something is unaffected by global warming.
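The 1/n decline of records in a stationary climate – and the extra records added by a trend – are easy to verify with a toy Monte Carlo experiment. A minimal sketch in plain Python (the trend value is purely illustrative, not fitted to any real data):

```python
import random

def mean_record_count(n_years, trend_per_year=0.0, trials=2000, seed=42):
    """Average number of running record highs in a Gaussian series of
    n_years values with unit standard deviation, optionally with a trend."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        highest = float("-inf")
        for t in range(n_years):
            x = trend_per_year * t + rng.gauss(0.0, 1.0)
            if x > highest:          # a new record high
                highest = x
                total += 1
    return total / trials

# Stationary climate: expected records in n years is 1 + 1/2 + ... + 1/n (~5.2 for n=100).
stationary = mean_record_count(100)
# Illustrative warming trend of 0.02 standard deviations per year.
warming = mean_record_count(100, trend_per_year=0.02)
print(stationary, warming)
```

The stationary count reproduces the harmonic-sum prediction, while even a modest trend yields clearly more records over the same century.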

Drought

Drought is another area where it is very easy to over-interpret statistics with no significant change, as in this recent New York Times opinion piece on the serious drought in California. The argument here goes that man-made climate change has not played “any appreciable role in the current California drought”, because there is no trend in average precipitation. But that again is a rather weak argument, because drought is far more complex than just being driven by average precipitation. It has a lot to do with water stored in soils, which gets lost faster in a warmer climate due to higher evaporation rates. California has just had its warmest winter on record. And the Palmer Drought Severity Index, a standard measure for drought, does show a significant trend towards more serious drought conditions in California.

The cost of extreme weather events

If an increase in extreme weather events due to global warming is hard to prove by statistics amongst all the noise, how much harder is it to demonstrate an increase in damage cost due to global warming? Very much harder! A number of confounding socio-economic factors cloud this issue, and they are very hard to quantify and disentangle. Some factors act to increase the damage, like larger property values in harm’s way. Some act to decrease it, like more solid buildings (whether from better building codes or simply as a result of increased wealth) and better early warnings. Thus it is not surprising that the literature on this subject overall gives inconclusive results. Some studies find significant damage trends after adjusting for GDP, some don’t, tempting some pundits to play cite-what-I-like. The fact that the increase in damage cost is about as large as the increase in GDP (as recently argued at FiveThirtyEight) is certainly no strong evidence against an effect of global warming on damage cost. Like the stranger’s dozen rolls of dice in the pub, one simply cannot tell from these data.

The emphasis on questionable dollar-cost estimates distracts from the real issue of global warming’s impact on us. The European heat wave of 2003 may not have destroyed any buildings – but it is well documented that it caused about 70,000 fatalities. This is the type of event for which the probability has increased by a factor of five due to global warming – and is likely to rise to a factor of twelve over the next thirty years. Poor countries, whose inhabitants hardly contribute to global greenhouse gas emissions, are struggling to recover from “natural” disasters, like Pakistan from the 2010 floods or the Philippines and Vietnam from typhoon Haiyan last year. The families who lost their belongings and loved ones in such events hardly register in the global dollar-cost tally.

It’s physics, stupid!

While statistical studies on extremes are plagued by signal-to-noise issues and only give unequivocal results in a few cases with good data (like for temperature extremes), we have another, more useful source of information: physics. For example, basic physics means that rising temperatures will drive sea levels up, as is in fact observed. Higher sea level to start from will clearly make a storm surge (like that of the storms Sandy and Haiyan) run up higher. By adding 1+1 we therefore know that sea-level rise is increasing the damage from storm surges – probably decades before this can be statistically proven with observational data.

There are many more physical linkages like this – reviewed in our recent paper A decade of weather extremes. A warmer atmosphere can hold more moisture, for example, which raises the risk of extreme rainfall events and the flooding they cause. Warmer sea surface temperatures drive up evaporation rates and enhance the moisture supply to tropical storms. And the latent heat of water vapor is a prime source of energy for the atmosphere. Jerry Meehl from NCAR therefore compares the effect of adding greenhouse gases to putting the weather on steroids.

Yesterday the World Meteorological Organisation published its Annual Statement on the Climate, finding that “2013 once again demonstrated the dramatic impact of droughts, heat waves, floods and tropical cyclones on people and property in all parts of the planet” and that “many of the extreme events of 2013 were consistent with what we would expect as a result of human-induced climate change.”

With good physical reasons to expect the dice are loaded, we should not fool ourselves with reassuring-looking but uninformative statistics. Some statistics show significant changes – but many are simply too noisy to show anything. It would be foolish to just play on until the loading of the dice finally becomes evident even in highly noisy statistics. By then we will have paid a high price for our complacency.

 

Postscript (29 March):

The Huffington Post has the story of the letters that Roger Pielke sent to two leading climate scientists, perceived by them as threatening, after they criticised his article: FiveThirtyEight Apologizes On Behalf Of Controversial Climate Science Writer. According to the Huffington Post, Pielke wrote to Kevin Trenberth and his bosses:

Once again, I am formally asking you for a public correction and apology. If that is not forthcoming I will be pursuing this further. More generally, in the future how about we agree to disagree over scientific topics like gentlemen?

Pielke using the word “gentlemen” struck me as particularly ironic.

How gentlemanly is it that on his blog he falsely accused us of cherry-picking the last 100 years of data rather than using the full available 130 years in our PNAS paper Increase of extreme events in a warming world, even though we clearly say in the paper that our conclusion is based on the full data series?

How gentlemanly is it that he falsely claims “Rahmstorf confirms my critique (see the thread), namely, they used 1910-2009 trends as the basis for calculating 1880-2009 exceedence probabilities,” when I have done nothing of the sort?

How gentlemanly is it that to this day, in a second update to his original article, he claims on his website: “The RC11 methodology does not make any use of data prior to 1910 insofar as the results are concerned (despite suggestions to the contrary in the paper).” This is a very serious allegation for a scientist, namely that we mislead or deceive in our paper (some colleagues have interpreted this as an allegation of scientific fraud). This allegation is completely unsubstantiated by Pielke, and of course it is wrong.

We did not respond with a threatening letter – not our style. Rather, we published a simple statistics tutorial together with our data and computer code, hoping that in this way Pielke could understand and replicate our results. But until this day we have not received any apology for his false allegations.

Our paper showed that the climatic warming observed in Moscow particularly since 1980 greatly increased the chances of breaking the previous July temperature record (set in 1938) there. We concluded:

For July temperature in Moscow, we estimate that the local warming trend has increased the number of records expected in the past decade fivefold, which implies an approximate 80% probability that the 2010 July heat record would not have occurred without climate warming.

Pielke apparently did not understand why the temperatures before 1910 hardly affect this conclusion (in fact increasing the probability from 78% to 80%), and that the linear trend from 1880 or 1910 is not a useful predictor for this probability of breaking a record. This is why we decomposed the temperature data into a slow, non-linear trend line (shown here) and a stochastic component – a standard procedure that even makes it onto the cover picture of a data analysis textbook, as well as being described in a climate time series analysis textbook. (Pielke ridicules this method as “unconventional”.)
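For illustration only, here is a sketch of that kind of trend/noise decomposition using a simple Gaussian-kernel smoother on synthetic data (the paper’s actual procedure differs in detail; the bandwidth and series here are my assumptions, chosen just to show the idea):

```python
import math

def decompose(years, values, bandwidth=15.0):
    """Split an annual series into a slow non-linear trend
    (Gaussian-kernel weighted average) and a stochastic residual."""
    trend = []
    for y0 in years:
        weights = [math.exp(-0.5 * ((y - y0) / bandwidth) ** 2) for y in years]
        trend.append(sum(w * v for w, v in zip(weights, values)) / sum(weights))
    residuals = [v - t for v, t in zip(values, trend)]
    return trend, residuals

# Synthetic example: a gradual warming plus year-to-year wiggles.
years = list(range(1880, 2011))
values = [0.01 * (y - 1880) + 0.2 * math.sin(0.7 * y) for y in years]
trend, residuals = decompose(years, values)
```

The smooth trend captures the slow change, and record probabilities can then be assessed against the residual variability rather than against a linear fit.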

He gentlemanly writes about our paper:

That some climate scientists are playing games in their research, perhaps to get media attention in the larger battle over climate politics, is no longer a surprise. But when they use such games to try to discredit serious research, then the climate science community has a much, much deeper problem.

His praise of “serious research” by the way refers to a paper that claimed “a primarily natural cause for the Russian heat wave” and “that it is very unlikely that warming attributable to increasing greenhouse gas concentrations contributed substantially to the magnitude of this heat wave.” (See also the graph above.)

 

Update (1 April):

Top hurricane expert Kerry Emanuel has now published a very good response to Pielke at FiveThirtyEight, making a number of the same points as I do above. He uses a better analogy than my dice example though, writing:

Suppose observations showed conclusively that the bear population in a particular forest had recently doubled. What would we think of someone who, knowing this, would nevertheless take no extra precautions in walking in the woods unless and until he saw a significant upward trend in the rate at which his neighbors were being mauled by bears?

The doubling of the bear population refers to the increase in hurricane power in the Atlantic which he showed in his Nature article of 2005 – an updated graph of his data is shown below, from our Nature Climate Change paper A decade of weather extremes.

Emanuel_Atlantic_PDI

 

Related posts:

Extremely hot

On record-breaking extremes

The Moscow warming hole

 


[i] For the math-minded: if a dice has a probability of 1/n to roll a six (a normal dice has n = 6) and you roll it k times, the probability p of finding m sixes is p = k!/[(k−m)! m!] × (n−1)^(k−m)/n^k.

]]>
http://www.realclimate.org/index.php/archives/2014/03/the-most-common-fallacy-in-discussing-extreme-weather-events/feed/langswitch_lang/en/ 59
How Many Cans? http://www.realclimate.org/index.php/archives/2014/03/how-many-cans/ http://www.realclimate.org/index.php/archives/2014/03/how-many-cans/#comments Sat, 22 Mar 2014 17:06:28 +0000 http://www.realclimate.org/?p=17081

XKCD, the brilliant and hilarious on-line comic, attempts to answer the question

How much CO2 is contained in the world’s stock of bottled fizzy drinks? How much soda would be needed to bring atmospheric CO2 back to preindustrial levels?

The answer is, enough to cover the Earth with 10 layers of soda cans. However, the comic misses a factor of about two, which would arise from the ocean. The oceans have been taking up carbon throughout the industrial era, as have some parts of the land surface biosphere. The ocean contains about half of the carbon we’ve ever released from fossil fuels. We’ve also cut down a lot of trees, which has been more-or-less compensated for by uptake into other parts of the land biosphere. So as a fraction of our total carbon footprint (fuels + trees) the oceans contain about a third.

At any rate, the oceans are acting as a CO2 buffer, meaning that they absorb CO2 and thereby limit the change in the atmospheric concentration. If we suddenly pulled atmospheric CO2 back down to 280 ppm (by putting it all in cans of soda, perhaps), the oceans would work in the opposite direction, buffering our present-day higher concentration by giving up CO2. The land biosphere is kind of a loose cannon in the carbon cycle – it is hard to predict what it will do.

Ten layers of soda cans covering the whole earth sounds like a lot. But most of a soda can is soda, rather than CO2. Here’s another statistic: If the CO2 in the atmosphere were to freeze out as dry ice depositing on the ground, the dry ice layer would only be about 7 millimeters thick. I guess cans of soda pop might not be the most efficient or economical means of CO2 sequestration. For a better option, look to saline aquifers, which are porous geological formations containing salty water that no one would want to drink or irrigate with anyway. CO2 at high pressure forms a liquid, then ultimately reacts with igneous rocks to form CaCO3.
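That millimeter-scale figure is a back-of-envelope calculation that is easy to redo. The sketch below assumes standard surface pressure, a CO2 mixing ratio of 400 ppm, and two illustrative densities for the deposited layer (these inputs are my assumptions, not numbers from the post):

```python
# Thickness of a global layer holding all atmospheric CO2, frozen out as solid.
P_SURFACE = 101325.0          # Pa, mean sea-level pressure
G = 9.81                      # m/s^2, gravitational acceleration
CO2_PPM = 400e-6              # volume mixing ratio (~2014 value)
M_CO2, M_AIR = 44.01, 28.97   # molar masses, g/mol

column_air = P_SURFACE / G                          # kg of air above each m^2 (~10300)
column_co2 = column_air * CO2_PPM * M_CO2 / M_AIR   # kg of CO2 per m^2 (~6.3)

# Fully compacted dry ice vs. a looser, frost-like deposit.
for name, density in [("solid dry ice", 1560.0), ("frost-like deposit", 900.0)]:
    thickness_mm = column_co2 / density * 1000.0
    print(f"{name}: {thickness_mm:.1f} mm")   # ~4.0 mm and ~7.0 mm
```

The post’s ~7 mm is consistent with the lower deposit density; fully compacted dry ice would give closer to 4 mm. Either way, the atmospheric CO2 burden is a remarkably thin skin on the planet.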

Further Reading

Tans, Pieter: An accounting of the observed increase in oceanic and atmospheric CO2 and an outlook for the future. Oceanography 22(4), 26-35, 2009

Carbon dioxide capture and storage IPCC Report, 2005

]]>
http://www.realclimate.org/index.php/archives/2014/03/how-many-cans/feed/langswitch_lang/en/ 23
Can we make better graphs of global temperature history? http://www.realclimate.org/index.php/archives/2014/03/can-we-make-better-graphs-of-global-temperature-history/ http://www.realclimate.org/index.php/archives/2014/03/can-we-make-better-graphs-of-global-temperature-history/#comments Thu, 13 Mar 2014 13:57:56 +0000 http://www.realclimate.org/?p=17010

I’m writing this post to see if our audience can help out with a challenge: Can we collectively produce some coherent, properly referenced, open-source, scalable graphics of global temperature history that will be accessible and clear enough that we can effectively out-compete the myriad inaccurate and misleading pictures that continually do the rounds on social media?

Bad graphs

One of the most common fallacies in climate is the notion that, because the climate was hotter than now in the Eocene or Cretaceous or Devonian periods, we should have no concern for current global warming. Often this is combined with an implication that mainstream scientists are somehow unaware of these warmer periods (despite many of us having written multiple papers on previous warm climates). This is fallacious on multiple grounds, not least because everyone (including IPCC) has been discussing these periods for ages. Additionally, we know that sea levels during those peak warm periods were some 80 meters higher than today, and that impacts of the current global warming are going to be felt by societies and existing ecosystems that are adapted for Holocene climates – not climates 100 million years ago.

In making this point the most common graph that gets used is one originally put online by “Monte Hieb” on this website. Over the years, the graphic has changed slightly


Monte Hieb temperature/CO2 schematics

(versions courtesy of the wayback machine), but the essential points have remained the same. The ‘temperature’ record is a hand-drawn schematic derived from the work of Chris Scotese, and the CO2 graph is from a model that uses tectonic and chemical weathering histories to estimate CO2 levels (Berner 1994; Berner and Kothavala, 2001). In neither case is there an abundance of measured data.

The original Scotese renderings are also available (again, earlier versions via the wayback machine):


Scotese reconstructions

Scotese is an expert in reconstructions of continental positions through time and in creating his ‘temperature reconstruction’ he is basically following an old-fashioned idea (best exemplified by Frakes et al’s 1992 textbook) that the planet has two long-term stable equilibria (‘warm’ or ‘cool’) which it has oscillated between over geologic history. This kind of heuristic reconstruction comes from the qualitative geological record which gives indications of glaciations and hothouses, but is not really adequate for quantitative reconstructions of global mean temperatures. Over the last few decades, much better geochemical proxy compilations with better dating have appeared (for instance, Royer et al (2004)) and the idea that there are only two long-term climate states has long fallen by the wayside.

However, since this graphic has long been a favorite of the climate dismissives, many different versions do the rounds, mostly forwarded by people who have no idea of the provenance of the image or the lack of underlying data, or the updates that have occurred. Indeed, the 2004 version is the most common, having been given a boost by Monckton in 2008 and many others. Most recently, Patrick Moore declared that this was his favorite graph.

Better graphs

While more realistic graphs of temperature and CO2 histories will not prevent the basic fallacy we started discussing from being propagated, I think people should be encouraged to use actual data to make their points so that at least rebuttals of any logical fallacies wouldn’t have to waste time arguing about the underlying data. Plus it is so much better to have figures that don’t need a week to decipher (see some more principles at Betterfigures.org).

Some better examples of long term climate change graphics do exist. This one from Veizer et al (2000) for instance (as rendered by Robert Rohde):



Phanerozoic Climate Change

IPCC AR4 made a collation for the Cenozoic (65 Mya to present):



IPCC AR4 Fig 6.1

and some editors at Wikipedia have made an attempt to produce a complete record for the Phanerozoic:



Wikipedia multi-period collation

But these collations are imperfect in many ways. On the last figure the time axis is a rather confusing mix of linear segments and logarithmic scaling, there is no calibration during overlap periods, and the scaling and baselining of the individual, differently sourced data is a little ad hoc. Wikipedia has figures for other time periods that have not been updated in years and treatment of uncertainties is haphazard (many originally from GlobalWarmingArt).

I think this could all be done better. However, creating good graphics takes time and some skill, especially when the sources of data are so disparate. So this might be usefully done using some crowd-sourcing – where we collectively gather the data that we can find, process it so that we have clean data, discuss ways to fit it together, and try out different plotting styles. The goal would be to come up with a set of coherent up-to-date (and updatable) figures that could become a new standard for representing the temperature history of the planet. Thus…

The world temperature history challenge

The challenge comes in three parts:

  1. Finding suitable data
  2. Combining different data sets appropriately
  3. Graphically rendering the data

Each part requires work which could be spread widely across the participants. I have made a start on collating links to suitable data sets, and this can both be expanded upon and consolidated.

Period | Reference | Data download
0-600 Mya | Veizer et al (2000), Royer et al (2004) (updated Royer (2014)) | Veizer d18O, Royer04 Temp, Royer14 CO2
0-65 Mya | Zachos et al (2008), Hansen et al (2010) | Zachos/Hansen
0-5.3 Mya | Lisiecki and Raymo (2005) | LR04 Stack
0-800 kya | EPICA Dome C | Temperature Reconstruction
0-125 kya | NGRIP/Antarctic analog? | NGRIP 50yr
0-12 kya | Marcott et al (2013) | MEA12 stack (xls)
0-2 kya | Mann et al (2008), Ljungqvist (2010) | MEA08 EIV, Ljungqvist10
1880-2013 CE | GISTEMP | GISTEMP LOTI
1850-2013 CE | HadCRUT4 | HadCRUT4 Global annual average, Cowtan&Way (infilled)
1850-2013 CE | Berkeley Earth | Land+Ocean annual mean

Combining this data is certainly a challenge, and there are multiple approaches that could be used, ranging from the very simple to the very complex. More subtly, the uncertainties need to be properly combined as well. Issues range from temporal and spatial coverage, to time-dependent corrections in d18O for long-term geologic processes or ice volume, to dating uncertainty, and so on.
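As an illustration of the very simple end of that range, here is a sketch of rebaselining one record against another over their period of overlap (toy data only; a real compilation would also weight by coverage and propagate uncertainties):

```python
def rebaseline(ref, other):
    """Shift `other` so that its mean over the years it shares with `ref`
    matches `ref`'s mean over those same years.
    Both series are {year: anomaly} dicts with arbitrary zero points."""
    overlap = set(ref) & set(other)
    if not overlap:
        raise ValueError("no overlapping years to calibrate against")
    offset = (sum(ref[y] for y in overlap)
              - sum(other[y] for y in overlap)) / len(overlap)
    return {y: v + offset for y, v in other.items()}

# Toy example: a proxy record whose arbitrary zero point sits 0.5 degrees
# below the instrumental record over their 1880-1899 overlap.
instrumental = {y: 0.0 for y in range(1880, 1900)}
proxy = {y: -0.5 for y in range(1800, 1900)}
aligned = rebaseline(instrumental, proxy)
print(aligned[1850])  # 0.0 after the 0.5-degree shift
```

Matching means over an overlap window is the bare minimum; a coherent multi-archive figure also needs choices about resolution, smoothing and a common reference period.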

Finally, rendering the graphics calls for additional skills – not least so that the different sources of data are clear, that the views over different timescales are coherent, and that the graphics are in the Wiki-standard SVG format (this site can be used for conversion from pdf or postscript).

Suggestions for other data sets to consider, issues of calibration and uncertainty and trial efforts are all welcome in the comments. If we make some collective progress, I’ll put up a new post describing the finished product(s). Who knows, you folks might even write a paper…

This post was inspired by a twitter conversation with Sou from Bundunga, and some of the initial data links came via Robert Rohde (of Global Warming Art and now Berkeley Earth) and Dana Royer.

References

  1. R.A. Berner, "GEOCARB II: A revised model of atmospheric CO2 over Phanerozoic time", American Journal of Science, vol. 294, pp. 56-91, 1994. http://dx.doi.org/10.2475/ajs.294.1.56
  2. R.A. Berner, and Z. Kothavala, "GEOCARB III: A revised model of atmospheric CO2 over Phanerozoic time", American Journal of Science, vol. 301, pp. 182-204, 2001. http://dx.doi.org/10.2475/ajs.301.2.182
  3. J. Veizer, Y. Godderis, and L.M. François, "Evidence for decoupling of atmospheric CO2 and global climate during the Phanerozoic eon", Nature, vol. 408, pp. 698-701, 2000. http://dx.doi.org/10.1038/35047044
  4. D. Royer, "Atmospheric CO2 and O2 During the Phanerozoic: Tools, Patterns, and Impacts", Treatise on Geochemistry, pp. 251-267, 2014. http://dx.doi.org/10.1016/B978-0-08-095975-7.01311-5
  5. L.E. Lisiecki, and M.E. Raymo, "A Pliocene-Pleistocene stack of 57 globally distributed benthic δ18O records", Paleoceanography, vol. 20, 2005. http://dx.doi.org/10.1029/2004PA001071
  6. S.A. Marcott, J.D. Shakun, P.U. Clark, and A.C. Mix, "A Reconstruction of Regional and Global Temperature for the Past 11,300 Years", Science, vol. 339, pp. 1198-1201, 2013. http://dx.doi.org/10.1126/science.1228026
  7. M.E. Mann, Z. Zhang, M.K. Hughes, R.S. Bradley, S.K. Miller, S. Rutherford, and F. Ni, "Proxy-based reconstructions of hemispheric and global surface temperature variations over the past two millennia", Proceedings of the National Academy of Sciences, vol. 105, pp. 13252-13257, 2008. http://dx.doi.org/10.1073/pnas.0805721105
  8. F.C. Ljungqvist, "A new reconstruction of temperature variability in the extra-tropical Northern Hemisphere during the last two millennia", Geografiska Annaler: Series A, Physical Geography, vol. 92, pp. 339-351, 2010. http://dx.doi.org/10.1111/j.1468-0459.2010.00399.x
]]>
http://www.realclimate.org/index.php/archives/2014/03/can-we-make-better-graphs-of-global-temperature-history/feed/langswitch_lang/en/ 121
The Nenana Ice Classic and climate http://www.realclimate.org/index.php/archives/2014/03/the-nenana-ice-classic-and-climate/ http://www.realclimate.org/index.php/archives/2014/03/the-nenana-ice-classic-and-climate/#comments Fri, 07 Mar 2014 15:51:05 +0000 http://www.realclimate.org/?p=16813

I am always interested in non-traditional data sets that can shed some light on climate changes. Ones that I’ve discussed previously are the frequency of closing of the Thames Barrier and the number of vineyards in England. With the exceptional warmth in Alaska last month (which of course was coupled with colder temperatures elsewhere), I was reminded of another one, the Nenana Ice Classic.

For those that don’t know what the ‘Classic’ is: it is a lottery, running since 1917, in which participants guess the date on which the river ice at Nenana breaks up in the spring. Nenana lies on the Tanana river outside of Fairbanks, Alaska, and the river can be relied on to freeze over every year. The locals put up a tripod on the ice, and when the ice breaks up in the spring, the tripod gets swept away. The closest estimate to the exact time this happens wins the lottery, which can have a quite substantial pot.

Due to the cold spring in Alaska last year, the ice break-up date was the latest since 1917, consistent with the state-wide spring temperature anomaly being one of the coldest on record (unsurprisingly, the Nenana break-up date is quite closely correlated with spring Alaskan temperatures). This year is shaping up to be quite warm (though current temperatures in Nenana (as of March 7) are still quite brisk!).

Since there is now an almost century-long record of these break up dates, it makes sense to look at them as potential indicators of climate change (and interannual variability). The paper by Sagarin and Micheli (2001) was perhaps the first such study, and it has been alluded to many times since (for instance, in the Wall Street Journal and Physics Today in 2008).

The figure below shows the break-up date in terms of days after a nominal March 21 or, more precisely, time from the vernal equinox (the small correction ensures the data don’t get confused by non-climatic calendar issues such as leap years). The long-term linear trend is negative, with a slope of roughly 6 days per century, indicating that on average the break-up dates have been coming earlier in the season. This is clear despite a lot of year-to-year variability:



Figure: Break-up dates at Nenana in Julian days (either from a nominal March 21 (JD-80), or specifically tied to the Vernal Equinox). Linear trend in the VE-corrected data is ~6 days/century (1917-2013, ±4 days/century, 95% range).
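The calendar correction and trend fit described above can be sketched in a few lines. Note the break-up dates below are made up for illustration, not the real Nenana record, and the sketch uses a fixed nominal March 21 (day-of-year 80) rather than the exact vernal equinox:

```python
# Sketch: express each break-up date as days after a nominal March 21
# (day-of-year 80 in a non-leap year), then fit a linear trend.
from datetime import date
import numpy as np

def days_after_march21(d: date) -> int:
    """Day-of-year offset from a nominal March 21 (Julian day 80)."""
    return d.timetuple().tm_yday - 80

# Hypothetical break-up dates (the real record runs 1917-2013).
breakups = {1920: date(1920, 5, 11), 1960: date(1960, 5, 8),
            2000: date(2000, 5, 4), 2012: date(2012, 4, 23)}

years = np.array(sorted(breakups))
offsets = np.array([days_after_march21(breakups[y]) for y in years])

# Least-squares slope in days/year; multiplying by 100 gives days/century.
slope, intercept = np.polyfit(years, offsets, 1)
print(f"trend: {slope * 100:+.1f} days/century")
```

A negative slope means break-up has, on average, been coming earlier; the real analysis ties the offset to the equinox itself so that leap-year bookkeeping doesn’t masquerade as a climate signal.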

In the 2008 WSJ article Martin Jeffries, a local geophysicist, said:

The Nenana Ice Classic is a pretty good proxy for climate change in the 20th century.

And indeed it is. The break-up dates are strongly correlated with regional spring temperatures, which have warmed over the century, tracking the Nenana trend. But as with the cool January 2014 weather in parts of the lower 48, or the warm weather in Europe or Alaska, the expected very large variability in winter weather can be relied on to produce outliers on a regular basis.

Given that year-to-year variability, it is predictable that whenever the annual result is above trend, it often gets cherry-picked to suggest that climate change is not happening (despite the persistent long-term trend). There are therefore no prizes for guessing which years’ results got a lot of attention from the ‘climate dismissives’*. This is the same phenomenon that happens every winter whenever there is some cold weather or snow somewhere. Indeed, it is so predictable** that it even gets its own xkcd cartoon:



(Climate data sourced from Climate Central).
The fact remains that winters have been getting warmer on average. While scientists are very interested in potential influences on the variability (whether from volcanoes, solar effects, greenhouse gases, or Arctic sea-ice loss), attributing changes in variability is much harder and more uncertain than attributing trends in the mean, which, as should be clear, are much more robust. As yet there is no truly convincing evidence that any change in variance has been detected, though there are a lot of ideas out there, and some very interesting discussions (see for instance Francis and Vavrus (2012) and Barnes (2013)).

For fun, I calculated some of the odds (Monte-Carlo simulations using the observed mean, a distribution of trends based on the linear fit, and the standard deviation of the residuals). This suggests that a date as late as May 20 (as in 2013) is very unexpected even without any climate trend (<0.7%), and even more so with one (<0.2%), but that the odds of a date before April 29 have more than doubled (from 10% to 22%) with the trend. The most favored date is May 3rd (with no trend it would have been May 6th), but the odds of the break-up happening in that single 24-hour period are only around 1 in 14.
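This kind of Monte-Carlo estimate can be sketched as follows. The mean, trend, and residual spread below are illustrative round numbers, not the values actually fitted to the Nenana record, and for simplicity a single fixed trend is used rather than sampling a distribution of trends:

```python
# Sketch: compare odds of early/late break-up with and without a trend.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (NOT the fitted Nenana values):
MEAN_OFFSET = 46.0   # mean break-up, days after March 21 (~May 6)
TREND = -0.06        # days/year, i.e. ~6 days/century earlier
RESID_SD = 6.0       # standard deviation of residuals, days
YEARS_ELAPSED = 96   # 1917 -> 2013

def simulate(n: int, with_trend: bool) -> np.ndarray:
    """Draw n simulated 2013 break-up offsets (days after March 21)."""
    shift = TREND * YEARS_ELAPSED if with_trend else 0.0
    return rng.normal(MEAN_OFFSET + shift, RESID_SD, size=n)

n = 200_000
LATE, EARLY = 60, 39  # May 20 and April 29, as days after March 21

no_trend = simulate(n, with_trend=False)
with_tr = simulate(n, with_trend=True)

p_late_no = (no_trend >= LATE).mean()
p_late_tr = (with_tr >= LATE).mean()
p_early_no = (no_trend <= EARLY).mean()
p_early_tr = (with_tr <= EARLY).mean()

print(f"P(on/after May 20): {p_late_no:.2%} -> {p_late_tr:.2%}")
print(f"P(on/before Apr 29): {p_early_no:.2%} -> {p_early_tr:.2%}")
```

Even with these made-up parameters the qualitative result matches the text: the trend makes a very late break-up rarer still, while substantially raising the odds of an early one.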

So, the Nenana Ice Classic – unlike the other two examples I mentioned in the opening paragraph – does appear to be a useful climate metric. That isn’t to say that every year is going to follow the long-term trend (clearly it doesn’t), but you’d probably want to factor the trend in to (ever so slightly) improve your odds of winning.

* Yup: 2001, 2008, 2011, and 2013.
** It is so predictable, I am thinking about opening a derivative market on whether this year’s Nenana result will get mentioned.

References

  1. R. Sagarin, and F. Micheli, "Climate Change in Nontraditional Data Sets", Science, vol. 294, p. 811, 2001. http://dx.doi.org/10.1126/science.1064218
  2. J.A. Francis, and S.J. Vavrus, "Evidence linking Arctic amplification to extreme weather in mid-latitudes", Geophysical Research Letters, vol. 39, pp. n/a-n/a, 2012. http://dx.doi.org/10.1029/2012GL051000
  3. E.A. Barnes, "Revisiting the evidence linking Arctic amplification to extreme weather in midlatitudes", Geophysical Research Letters, pp. n/a-n/a, 2013. http://dx.doi.org/10.1002/grl.50880
]]>