
Comparing models to the satellite datasets

How should one make graphics that appropriately compare models and observations? There are basically two key points (explored in more depth here) – comparisons should be ‘like with like’, and different sources of uncertainty should be clear, whether they relate to ‘weather’ or to structural uncertainty in either the observations or the models. Unfortunately, many graphics in circulation fail to do this properly, and some prominent ones are plots of the satellite temperature records made by John Christy. This post explains exactly why those graphs are misleading and how more honest presentations of the comparison allow for more informed discussions of why and how these records are changing and differ from models.

Blizzard Jonas and the slowdown of the Gulf Stream System

Filed under: — stefan @ 24 January 2016

Blizzard Jonas on the US east coast has just shattered snowfall records. Both weather forecasters and climate experts have linked the high snowfall amounts to the exceptionally warm sea surface temperatures off the east coast. In this post I will examine a related question: why are sea surface temperatures so high there, as shown in the snapshot from Climate Reanalyzer below?

[Figure: sea surface temperature snapshot from Climate Reanalyzer]

I will argue that this warmth (as well as the cold blob in the subpolar Atlantic) is partly due to a slowdown of the Atlantic Meridional Overturning Circulation (AMOC), sometimes referred to as the Gulf Stream System, in response to global warming. There are two points to this argument:


Marvel et al (2015) Part 1: Reconciling estimates of climate sensitivity

This post is related to the substantive results of the new Marvel et al (2015) study. There is a separate post on the media/blog response.

The recent paper by Kate Marvel and others (including me) in Nature Climate Change looks at the different forcings and their climate responses over the historical period in more detail than any previous modeling study. The point of the paper was to apply those results to improve calculations of climate sensitivity from the historical record and see if they can be reconciled with other estimates. But there are some broader issues as well – how scientific anomalies are dealt with and how simulation can be used to improve inferences about the real world. It also shines a spotlight on a particular feature of the IPCC process…
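For context, estimates of climate sensitivity from the historical record are typically made with the simple "energy budget" formula, ECS ≈ F_2x·ΔT/(ΔF − ΔQ). The sketch below is not the method of Marvel et al. – it is the textbook calculation that their forcing-response results feed into, and every number in it is a rough illustrative value, not a figure from the paper.

```python
# Illustrative energy-budget estimate of equilibrium climate
# sensitivity (ECS). All values are rough, assumed numbers chosen
# for illustration only, not results from Marvel et al. (2015).

F_2x = 3.7   # W/m^2, radiative forcing from a doubling of CO2
dT = 0.85    # K, observed warming over the historical period
dF = 2.3     # W/m^2, estimated net historical forcing
dQ = 0.65    # W/m^2, current planetary heat uptake (mostly ocean)

# ECS = F_2x * dT / (dF - dQ): scale observed warming per unit of
# realized forcing up to the forcing of a CO2 doubling.
ecs = F_2x * dT / (dF - dQ)
print(f"Energy-budget ECS estimate: {ecs:.1f} K per CO2 doubling")
```

Part of the point of the paper is that the effective forcing ΔF differs between forcing agents, so plugging a single aggregate ΔF into this formula can bias the answer low.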



  1. K. Marvel, G.A. Schmidt, R.L. Miller, and L.S. Nazarenko, "Implications for climate sensitivity from the response to individual forcings", Nature Climate Change, vol. 6, pp. 386-389, 2015.

And the winner is…

Filed under: — group @ 17 November 2015

Remember the forecast of a temporary global cooling which made headlines around the world in 2008? We didn’t think it was reliable and offered a bet. The forecast period is now over: we were right, the forecast was not skilful.

Back around 2007/8, two high-profile papers claimed to produce, for the first time, skilful predictions of decadal climate change, based on new techniques of ocean state initialization in climate models. Both papers made forecasts of the future evolution of global mean and regional temperatures. The first paper, Smith et al. (2007), predicted “that internal variability will partially offset the anthropogenic global warming signal for the next few years. However, climate will continue to warm, with at least half of the years after 2009 predicted to exceed the warmest year currently on record.” The second, Keenlyside et al., (2008), forecast in contrast that “global surface temperature may not increase over the next decade, as natural climate variations in the North Atlantic and tropical Pacific temporarily offset the projected anthropogenic warming.”

This month marks the end of the forecast period for Keenlyside et al, so their forecasts can now be cleanly compared to what actually happened. This is particularly interesting to RealClimate, since we offered the authors a bet on whether the results would be accurate, based on our assessment of their methodology. They ignored our offer, but now that the time period of the bet has passed, it is worth checking how it would have gone.



  1. D.M. Smith, S. Cusack, A.W. Colman, C.K. Folland, G.R. Harris, and J.M. Murphy, "Improved Surface Temperature Prediction for the Coming Decade from a Global Climate Model", Science, vol. 317, pp. 796-799, 2007.
  2. N.S. Keenlyside, M. Latif, J. Jungclaus, L. Kornblueh, and E. Roeckner, "Advancing decadal-scale climate prediction in the North Atlantic sector", Nature, vol. 453, pp. 84-88, 2008.

Debate in the noise

Last week there was an international media debate on climate data which seemed rather surreal to me. It was claimed that the global temperature data had so far shown a “hiatus” of global warming from 1998-2012, which was now suddenly gone after a data correction. So what happened?

One of the data centers that compile the data on global surface temperatures – NOAA – reported in the journal Science on an update of their data. Some artifacts due to changed measurement methods (especially for sea surface temperatures) were corrected, and data from weather stations not previously included were added. All data centers are continually working to improve their database, and they therefore occasionally present version updates of their global series (NASA data are currently at version 3, the British Hadley Centre data at version 4). There is nothing unusual about this, and the corrections are in the range of a few hundredths of a degree – see Figure 1. This really is just about fine details.

Global warming and unforced variability: Clarifications on recent Duke study

Filed under: — group @ 13 May 2015

Guest Commentary from Patrick Brown and Wenhong Li, Duke University

We recently published a study in Scientific Reports titled Comparing the model-simulated global warming signal to observations using empirical estimates of unforced noise. Our study seems to have generated a lot of interest and we have received many inquiries regarding its findings. We were pleased with some of the coverage of our study (e.g., here) but we were disappointed that some outlets published particularly misleading articles (e.g., here, here, and here). Since there appears to be some confusion regarding our study’s findings, we would like to clarify some points (see also MM4A’s discussion).



  1. P.T. Brown, W. Li, E.C. Cordero, and S.A. Mauget, "Comparing the model-simulated global warming signal to observations using empirical estimates of unforced noise", Scientific Reports, vol. 5, 2015.

Reflections on Ringberg

As previewed last weekend, I spent most of last week at a workshop on Climate Sensitivity hosted by the Max Planck Institute at Schloss Ringberg. It was undoubtedly one of the better workshops I’ve attended – it was focussed, deep and with much new information to digest (some feel for the discussion can be seen from the #ringberg15 tweets). I’ll give a brief overview of my impressions below.


Climate Sensitivity Week

Some of you will be aware that there is a workshop on Climate Sensitivity this week at Schloss Ringberg in southern Germany. The topics to be covered include how sensitivity is defined (and whether it is even meaningful; spoiler: yes, it is), what it means, how it can be constrained, what the different flavours signify, etc. There is an impressive list of attendees with a very diverse range of views on just about everything, and so I am looking forward to very stimulating discussions.


Unforced variations: Nov 2014

Filed under: — group @ 2 November 2014

This month’s open thread. In honour of today’s New York Marathon, we are expecting the fastest of you to read and digest the final IPCC Synthesis report in sub-3 hours. For those who didn’t keep up with the IPCC training regime, the Summary for Policy Makers provides a more accessible target.

Also in the news, follow #ArcticCircle2014 for some great info on the Arctic Circle meeting in Iceland.

Ocean heat storage: a particularly lousy policy target + Update

Filed under: — stefan @ 20 October 2014

The New York Times, 12 December 2027: After 12 years of debate and negotiation, kicked off in Paris in 2015, world leaders have finally agreed to ditch the goal of limiting global warming to below 2 °C. Instead, they have agreed to the new goal of limiting global ocean heat content to 10²⁴ Joules. The decision was widely welcomed by the science and policy communities as a great step forward. “In the past, the 2 °C goal has allowed some governments to pretend that they are taking serious action to mitigate global warming, when in reality they have achieved almost nothing. I’m sure that this can’t happen again with the new 10²⁴ Joules goal”, said David Victor, a professor of international relations who originally proposed this change back in 2014. And an unnamed senior EU negotiator commented: “Perhaps I shouldn’t say this, but some heads of state had trouble understanding the implications of the 2 °C target; sometimes they even accidentally talked of limiting global warming to 2%. I’m glad that we now have those 10²⁴ Joules which are much easier to grasp for policy makers and the public.”

This fictitious newspaper item is of course absurd and will never become reality, because ocean heat content is unsuited as a climate policy target. Here are three main reasons why.
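One way to see how abstract such a target is: a back-of-envelope conversion (using assumed round numbers for the ocean's mass and the specific heat of seawater, not figures from the post) shows that a colossal-sounding 10²⁴ Joules corresponds to well under 0.2 °C of warming when averaged over the whole ocean.

```python
# Back-of-envelope sketch: what would a 10^24 J heat-content target
# mean for mean ocean temperature? All inputs are rough, assumed
# round numbers for illustration only.

E = 1e24              # J, the (fictitious) heat-content target
ocean_mass = 1.4e21   # kg, approximate mass of the global ocean
c_p = 3900            # J/(kg K), approximate specific heat of seawater

# Temperature change = energy / (mass * specific heat)
dT = E / (ocean_mass * c_p)
print(f"Mean ocean warming for 10^24 J: {dT:.2f} K")
```

In reality the added heat is concentrated in the upper ocean rather than spread evenly, which is part of why a single heat-content number communicates so little to policy makers.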