

Comparing models to the satellite datasets

How should one make graphics that appropriately compare models and observations? There are basically two key points (explored in more depth here): comparisons should be ‘like with like’, and the different sources of uncertainty should be made clear, whether they stem from ‘weather’ (internal variability) or from structural uncertainty in the observations or the models. Unfortunately, many graphics in circulation fail to do this properly, and some prominent ones are the satellite temperature comparisons made by John Christy. This post explains exactly why those graphs are misleading and how more honest presentations of the comparison allow for a more informed discussion of why and how these records are changing and how they differ from models.
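The ‘like with like’ point can be illustrated with a toy calculation: before comparing, both model output and observations should be referenced to the same multi-year baseline, and the model spread should be shown as an envelope rather than as a single run. A minimal sketch with synthetic data (the series, the baseline years, and the min/max envelope are illustrative assumptions, not the post’s actual analysis, which would use satellite-weighted model output):

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1979, 2016)

# Hypothetical stand-ins: an ensemble of model temperature series and one
# satellite record (synthetic trends plus noise, purely for illustration).
models = 0.015 * (years - 1979) + 0.1 * rng.standard_normal((10, years.size))
obs = 0.012 * (years - 1979) + 0.1 * rng.standard_normal(years.size)

def to_anomaly(series, years, base=(1979, 1998)):
    """Reference a series to a common multi-year baseline ('like with like')."""
    mask = (years >= base[0]) & (years <= base[1])
    return series - series[..., mask].mean(axis=-1, keepdims=True)

m_anom = to_anomaly(models, years)
o_anom = to_anomaly(obs, years)

# Show the ensemble envelope, not a single model run, so that 'weather'
# spread in the models is visible in the comparison.
env_lo, env_hi = m_anom.min(axis=0), m_anom.max(axis=0)
inside = np.mean((o_anom >= env_lo) & (o_anom <= env_hi))
print(f"fraction of years obs falls within model envelope: {inside:.2f}")
```

The key design choice is that the baseline subtraction is applied identically to both datasets; aligning them at a single year instead (a known flaw of the criticized graphs) exaggerates any later divergence.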

Blizzard Jonas and the slowdown of the Gulf Stream System

Filed under: — stefan @ 24 January 2016

Blizzard Jonas on the US east coast has just shattered snowfall records. Both weather forecasters and climate experts have linked the high snowfall amounts to the exceptionally warm sea surface temperatures off the east coast. In this post I will examine a related question: why are sea surface temperatures so high there, as shown in the snapshot from Climate Reanalyzer below?

 

[Figure: GFS 0.25° Northern Hemisphere sea surface temperature anomaly, 24 January 2016 – Climate Reanalyzer]

I will argue that this warmth (as well as the cold blob in the subpolar Atlantic) is partly due to a slowdown of the Atlantic Meridional Overturning Circulation (AMOC), sometimes referred to as the Gulf Stream System, in response to global warming. There are two points to this argument:


Marvel et al (2015) Part 1: Reconciling estimates of climate sensitivity

This post deals with the substantive results of the new Marvel et al (2015) study. There is a separate post on the media/blog response.

The recent paper by Kate Marvel and others (including me) in Nature Climate Change looks at the different forcings and their climate responses over the historical period in more detail than any previous modeling study. The point of the paper was to apply those results to improve calculations of climate sensitivity from the historical record and see if they can be reconciled with other estimates. But there are some broader issues as well – how scientific anomalies are dealt with and how simulation can be used to improve inferences about the real world. It also shines a spotlight on a particular feature of the IPCC process…
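The basic idea behind reconciling the estimates can be sketched with the standard energy-budget formula ECS ≈ F₂ₓ·ΔT / (ΔF − ΔQ): if some historical forcings have an ‘efficacy’ below one, the effective forcing is smaller than assumed and the inferred sensitivity rises. The numbers below are round illustrative values, not results from the paper:

```python
# Illustrative energy-budget calculation (all numbers are rough round
# values chosen for illustration, not taken from Marvel et al).
F2X = 3.7    # W/m^2, forcing from doubled CO2
dT = 0.85    # K, historical surface warming
dF = 2.3     # W/m^2, net historical forcing
dQ = 0.65    # W/m^2, ocean heat uptake (energy imbalance)

def ecs_energy_budget(dT, dF, dQ, efficacy=1.0):
    """Effective climate sensitivity; 'efficacy' rescales the forcing to
    account for responses that differ from the response to CO2."""
    return F2X * dT / (efficacy * dF - dQ)

naive = ecs_energy_budget(dT, dF, dQ)          # assumes all forcings act like CO2
adjusted = ecs_energy_budget(dT, dF, dQ, 0.8)  # lower aggregate efficacy raises the estimate
print(f"naive: {naive:.1f} K, efficacy-adjusted: {adjusted:.1f} K")
```

With these made-up inputs the efficacy adjustment moves the estimate upward by several tenths of a degree, which is the qualitative direction of the reconciliation discussed in the paper.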


References

  1. K. Marvel, G.A. Schmidt, R.L. Miller, and L.S. Nazarenko, "Implications for climate sensitivity from the response to individual forcings", Nature Climate Change, vol. 6, pp. 386-389, 2015. http://dx.doi.org/10.1038/nclimate2888

And the winner is…

Filed under: — group @ 17 November 2015

Remember the forecast of a temporary global cooling that made headlines around the world in 2008? We didn’t think it was reliable and offered a bet. The forecast period is now over: we were right, and the forecast was not skillful.

Back around 2007/8, two high-profile papers claimed to produce, for the first time, skillful predictions of decadal climate change, based on new techniques of ocean state initialization in climate models. Both papers made forecasts of the future evolution of global mean and regional temperatures. The first paper, Smith et al. (2007), predicted “that internal variability will partially offset the anthropogenic global warming signal for the next few years. However, climate will continue to warm, with at least half of the years after 2009 predicted to exceed the warmest year currently on record.” The second, Keenlyside et al. (2008), forecast in contrast that “global surface temperature may not increase over the next decade, as natural climate variations in the North Atlantic and tropical Pacific temporarily offset the projected anthropogenic warming.”

This month marks the end of the forecast period for Keenlyside et al, so their forecasts can now be cleanly compared to what actually happened. This is particularly interesting to RealClimate, since we offered a bet to the authors on whether the results would be accurate, based on our assessment of their methodology. They ignored our offer, but now that the time period of the bet has passed, it’s worth checking how it would have gone.
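One simple way to verify a decadal forecast ‘cleanly’ is to score it against observations relative to a trivial reference forecast; a skill score below zero means the prediction did worse than the reference. A sketch with made-up numbers (the observed values, the flat ‘no warming’ forecast, and the trend baseline are placeholders, not the actual HadCRUT or Keenlyside et al. values):

```python
import numpy as np

# Synthetic stand-ins for annual global mean temperature anomalies (K)
# over a ten-year forecast window; a real verification would use the
# observational datasets and the published hindcast values.
obs      = np.array([0.54, 0.63, 0.62, 0.54, 0.57, 0.61, 0.67, 0.74, 0.87, 0.99])
forecast = np.array([0.45, 0.45, 0.46, 0.44, 0.45, 0.46, 0.45, 0.44, 0.46, 0.45])
baseline = obs[0] + 0.017 * np.arange(10)  # simple warming-trend reference forecast

def rmse(a, b):
    """Root-mean-square error between two series."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

# A forecast is 'skillful' only if it beats the simple reference.
skill = 1.0 - rmse(obs, forecast) / rmse(obs, baseline)
print(f"skill score vs trend baseline: {skill:.2f}")
```

With these synthetic numbers the flat forecast scores below zero, i.e. worse than simply extrapolating the existing warming trend, which is the sense in which the post judges the prediction unskillful.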


References

  1. D.M. Smith, S. Cusack, A.W. Colman, C.K. Folland, G.R. Harris, and J.M. Murphy, "Improved Surface Temperature Prediction for the Coming Decade from a Global Climate Model", Science, vol. 317, pp. 796-799, 2007. http://dx.doi.org/10.1126/science.1139540
  2. N.S. Keenlyside, M. Latif, J. Jungclaus, L. Kornblueh, and E. Roeckner, "Advancing decadal-scale climate prediction in the North Atlantic sector", Nature, vol. 453, pp. 84-88, 2008. http://dx.doi.org/10.1038/nature06921

Debate in the noise

Last week there was an international media debate on climate data that appeared rather surreal to me. It was claimed that the global temperature data had so far shown a “hiatus” in global warming from 1998 to 2012, which had now suddenly vanished after a data correction. So what happened?

One of the data centers that compile the data on global surface temperatures – NOAA – reported in the journal Science on an update to their dataset. Some artifacts due to changed measurement methods (especially for sea surface temperatures) were corrected, and data from weather stations not previously included were added. All the data centers work continually to improve their databases, and they therefore occasionally present version updates of their global series (the NASA data are currently at version 3, the British Hadley Centre data at version 4). There is nothing unusual about this, and the corrections are in the range of a few hundredths of a degree – see Figure 1. This really is just about fine details.
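Why does a correction of a few hundredths of a degree matter for the “hiatus” discussion at all? Over a short window like 1998–2012, even a small adjustment confined to the later years can visibly change the fitted trend. A toy illustration with synthetic data (the series and the 0.03 K step are invented for illustration, not the actual NOAA adjustment):

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1998, 2013)

# Synthetic temperature anomalies: a small underlying trend plus noise,
# and a 'new version' with a few-hundredths-of-a-degree correction
# applied to recent years (illustrative only).
v3 = 0.005 * (years - 1998) + 0.08 * rng.standard_normal(years.size)
v4 = v3 + np.where(years >= 2008, 0.03, 0.0)

# Least-squares linear trends, converted from K/year to K/decade.
trend_v3 = np.polyfit(years, v3, 1)[0] * 10
trend_v4 = np.polyfit(years, v4, 1)[0] * 10
print(f"1998-2012 trend: v3 {trend_v3:.3f} K/decade, v4 {trend_v4:.3f} K/decade")
```

Because the correction raises only the late part of the series, the fitted slope necessarily increases, even though no individual value moves by more than 0.03 K; over a 100-year window the same adjustment would barely register.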

