RealClimate

Climate science from climate scientists...


A reflection on reflection

13 Apr 2026 by Gavin

Confirmation bias and a profound lack of curiosity mark the latest ABC (Anything But Carbon) contrapalooza in DC this week and a decade-old albedo error trips them up.

I occasionally dip into the contrarian-sphere to see if there is anything new that might be of actual interest. I am usually disappointed, and last week’s escapade was no different. The quality of the talks was pretty abysmal – bad slides, monotone reading of notes, and abundant errors, misunderstandings, fallacies and cherry picks – but if there was a theme, it was that everything is so complicated and uncertain that no-one can know anything. This is a notable contrast to previous outings where everything was definitely due to the sun or ‘natural’ variability (anything but carbon remains the organizing principle).

Multiple speakers (including Willie Soon, John Clauser) purported to be very irate that the CERES Earth’s energy imbalance (EEI) record is calibrated to the changes in the in situ heat content data (dominated by the ocean heat content changes). Quite why they were so exercised was a little mysterious because their sources of information on this topic were the papers that clearly explained why and how this was being done (i.e. Loeb et al. (2009) or Loeb et al. (2018)). [Basically, the satellite data for the EEI does not have a good enough absolute calibration to be an independent estimate, and so the CERES EBAF product is adjusted to match the (much better characterized) in situ heat gain (Jul 2005-Jun 2015) in a way that does not affect the trends]. Also the EEI based on in situ data is apparently wrong because the AI told them so. Ok then.
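The anchoring procedure described in the Loeb et al. papers can be illustrated with a toy calculation (the numbers below are synthetic stand-ins, not real CERES or in situ values): adding a single constant offset to the whole satellite record fixes the absolute level to the in situ estimate while leaving the trend completely untouched.

```python
import numpy as np

# Toy sketch of the EBAF anchoring idea (synthetic numbers, not real data):
# the satellite net flux has a trend but a poorly known absolute level, so one
# constant offset pins the record-mean to an assumed in situ EEI estimate.
rng = np.random.default_rng(0)
months = np.arange(240)                       # 20 years of monthly values
raw_net = 4.0 + 0.002 * months + rng.normal(0, 0.3, months.size)  # biased level

in_situ_mean = 0.7                            # assumed in situ EEI (W/m^2)
offset = in_situ_mean - raw_net.mean()        # one constant for the whole record
anchored = raw_net + offset

trend_raw = np.polyfit(months, raw_net, 1)[0]
trend_anchored = np.polyfit(months, anchored, 1)[0]
# A constant shift cannot change a least-squares slope:
assert np.isclose(trend_raw, trend_anchored)
```

Since the adjustment is a single additive constant, any trend or variability statistic computed from the anchored record is identical to that of the raw record, which is the point the calibration papers make.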

In both Soon’s and Clauser’s talk, a particular figure made an appearance – Fig. 11a from Stephens et al. (2015).

Figure 1. From Stephens et al. (2015) showing the seasonal cycle of global albedo in the CMIP5 models (colored lines) and the CERES EBAF 2.7r data (black) (averaged over Mar 2000 to Mar 2013).

Unsurprisingly, this was used to claim that the CMIP5 models (and, by implication, all models) were terribly wrong, can’t be trusted etc. etc. Oddly, neither of them chose to show the comparison with the later CMIP6 models (Jian et al., 2020):

Figure 2. Comparison of the CMIP6 Multi-model ensemble mean (MEM) planetary albedo (PA) (yellow bars) with the CERES EBAF v4.1 data (blue bars) (averaged over 2001-14). Hemispheric comparisons in red (NH) and green (SH). From Fig. 4c from Jian et al. (2020).

Or even the earlier CMIP3 models from Bender et al (2006):

Figure 3. Comparison of the ERBE and early CERES data (Mar 2000 – Dec 2003) to the CMIP3 multi-model ensemble mean (from Fig 1a in Bender et al. (2006)).

Well, it’s not so odd, since these comparisons are much more favorable to the models. But let’s look closer…

The CERES observations in the three plots do not agree at all! The 2015 figure has maximum albedo in March and October, while the other two have maxima in June and December – a 2 or 3 month phase shift. Something is wrong here. Fortunately, the CERES project has a very accessible website for downloading data, and it’s trivial to get the incoming solar flux and reflected solar flux for every month. The albedo is just the ratio, and we can average the months to create a climatology. The differences in the averaging periods make no visible difference, and the differences between EBAF versions are likely to be minor (though that is harder to check). However, the bottom line is that the CERES data in the 2015 figure is wrong, while the 2006 and 2020 papers are correct.
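The replication described above can be sketched in a few lines (with synthetic stand-ins for the two monthly CERES EBAF fields; the real incoming and reflected solar fluxes are downloadable from the CERES site): take the ratio month by month, then average each calendar month across all years.

```python
import numpy as np

# Sketch of the albedo-climatology check (synthetic fluxes, illustrative only).
# Real inputs would be the CERES EBAF monthly global-mean incoming and
# reflected solar flux; albedo is their ratio, and the climatology is the
# average of each calendar month across all years in the record.
n_years = 13
months = np.tile(np.arange(12), n_years)      # calendar-month index per sample
incoming = 340.0 + 5.0 * np.cos(2 * np.pi * months / 12)            # W/m^2, toy
reflected = 0.29 * incoming + 1.0 * np.cos(2 * np.pi * (months - 5) / 12)

albedo = reflected / incoming                 # monthly global albedo
climatology = np.array([albedo[months == m].mean() for m in range(12)])
print(climatology.round(4))                   # 12 values, one per calendar month
```

With the real fields in place of the toy ones, the resulting 12-value curve is directly comparable to the black lines in the figures above.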

Figure 4. The 2015 figure with an overlay of my replication of CERES global albedo (red).

We can speculate about what led to this (possibly the first month with data, March 2000, being treated as January?), but there are two immediate consequences. First, the CMIP5 models (like the CMIP6 and CMIP3 models) turn out not to be so bad: the phasing is ok, though the annual mean albedo is a little variable across models. Second, it’s likely that the other panels in Fig. 11, Figs. 5a-c, and the discussion of them in section 6 of Stephens et al. (2015) are also affected. Despite citing Bender et al. (2006) and Kato (2009) (see his figure 1a), both of which have the phasing correct, Stephens et al. did not address the offset. The paper has since been cited over 240 times, and it seems odd that no-one else noticed this issue [Aside: if you know of a reference that does make this point, please let me know in the comments].
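The March-as-January speculation is easy to test in a toy setting (this is an assumption about the cause, not a demonstration of it): bin a record that actually starts in March as though its first value were January, and every calendar-month average lands two slots early.

```python
import numpy as np

# Toy test of the suspected mislabeling (an assumption, not a proven cause):
# a seasonal cycle peaking in June, sampled monthly starting in March 2000,
# then binned two ways -- correctly, and as if the record began in January.
true_cycle = 0.29 + 0.01 * np.cos(2 * np.pi * (np.arange(12) - 5) / 12)  # peak: June

n_years = 13
record = np.tile(true_cycle, n_years)[2:]     # monthly record starting in March

idx = np.arange(record.size)
# Correct binning: month index accounts for the March start.
correct = np.array([record[(idx + 2) % 12 == m].mean() for m in range(12)])
# Mislabeled binning: pretend the first value is January.
wrong = np.array([record[idx % 12 == m].mean() for m in range(12)])

print(np.argmax(correct), np.argmax(wrong))   # peak month shifts by two
```

The correct binning recovers the June peak; the mislabeled one moves it two months earlier, which is the size and direction of the offset seen in the 2015 figure.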

Why now?

Interest in the EEI is obviously growing, both because of the increasing length of the CERES timeseries and because the EEI itself is growing. Even the WMO is elevating this metric in importance. So one might expect the contrarian-sphere to try to undermine it – that’s just what they do.

Figure 5. CERES EBAF v4.2 data showing an increase in the EEI over 25 years. Note that the absolute EEI is calibrated to the estimated in situ heat content rise but the trends are independent of that.

But here is the difference between doing real science and what was on show at the DC contrapalooza. Scientists are curious about what is actually going on. Given a discrepancy, they want to understand what’s happening. The changes in albedo over the CERES record are indeed interesting and a little challenging to explain (the CERESMIP project is looking into this in more detail), but the scientists’ goal is to dig deeper until it becomes clear. For Soon and Clauser, discrepancies are just weapons – they don’t care that something doesn’t look right; in fact they want it to look wrong, regardless of whether it’s an error in an old paper, an ambiguous statement they can read uncharitably, or a genuine issue. Thus the chances of them checking into this themselves are zero – despite their frequent claims that they want to ‘follow the data’.

Do I expect everyone to check every figure in every paper they cite before using it in a presentation? No. But this example underlines how important open science is. When something like this comes up, people should be able to check quickly that the label and the contents match. It also highlights the danger of leaving issues uncorrected in the literature. I don’t know whether this issue has already been brought to the attention of the journal or the authors, but even papers from a decade ago get cited and used (see here for another example). We owe it to everyone (yes, even the contrarians!) to make sure that the literature is as free of error as we can make it.

References

  1. N.G. Loeb, B.A. Wielicki, D.R. Doelling, G.L. Smith, D.F. Keyes, S. Kato, N. Manalo-Smith, and T. Wong, "Toward Optimal Closure of the Earth's Top-of-Atmosphere Radiation Budget", Journal of Climate, vol. 22, pp. 748-766, 2009. http://dx.doi.org/10.1175/2008JCLI2637.1
  2. N.G. Loeb, D.R. Doelling, H. Wang, W. Su, C. Nguyen, J.G. Corbett, L. Liang, C. Mitrescu, F.G. Rose, and S. Kato, "Clouds and the Earth’s Radiant Energy System (CERES) Energy Balanced and Filled (EBAF) Top-of-Atmosphere (TOA) Edition-4.0 Data Product", Journal of Climate, vol. 31, pp. 895-918, 2018. http://dx.doi.org/10.1175/JCLI-D-17-0208.1
  3. G.L. Stephens, D. O'Brien, P.J. Webster, P. Pilewski, S. Kato, and J. Li, "The albedo of Earth", Reviews of Geophysics, vol. 53, pp. 141-163, 2015. http://dx.doi.org/10.1002/2014RG000449
  4. B. Jian, J. Li, Y. Zhao, Y. He, J. Wang, and J. Huang, "Evaluation of the CMIP6 planetary albedo climatology using satellite observations", Climate Dynamics, vol. 54, pp. 5145-5161, 2020. http://dx.doi.org/10.1007/s00382-020-05277-4
  5. F.A. Bender, H. Rodhe, R.J. Charlson, A.M.L. Ekman, and N. Loeb, "22 views of the global albedo—comparison between 20 GCMs and two satellites", Tellus A: Dynamic Meteorology and Oceanography, vol. 58, pp. 320, 2006. http://dx.doi.org/10.1111/j.1600-0870.2006.00181.x
  6. S. Kato, "Interannual Variability of the Global Radiation Budget", Journal of Climate, vol. 22, pp. 4893-4907, 2009. http://dx.doi.org/10.1175/2009JCLI2795.1

Filed Under: Climate modelling, Climate Science, Featured Story, Instrumental Record, Scientific practice, skeptics Tagged With: albedo, CMIP5, CMIP6, Earth's Energy Imbalance, Heartland Institute, John Clauser, Willie Soon
