
RealClimate

Climate science from climate scientists...


Instrumental Record

A new sea level curve

14 Jan 2015 by Stefan

The “zoo” of global sea level curves calculated from tide gauge data has grown – tomorrow a new reconstruction by our US colleagues led by Carling Hay of Harvard University will appear in Nature (Hay et al. 2015). That is a good opportunity for an overview of the available data curves. The differences are really in the details; the “big picture” of sea-level rise does not change. In all curves, the current rates of rise are the highest since records began.

The following graph shows the new sea level curve in comparison with six previously published ones.


Fig. 1. Sea level curves calculated by different research groups using various methods. The curves show sea level relative to the satellite era (since 1992). Graph: Klaus Bittermann.

All curves show the well-known modern sea level rise, but the exact extent and time evolution of the rise differ somewhat. Up to about 1970, the new reconstruction by Hay et al. runs at the top of the existing uncertainty range. For the period since 1880, however, it shows the same total increase as the current favorite by Church & White; starting from 1900 it shows about 25 mm less. This difference is at the margins of significance: the uncertainty ranges overlap. [Read more…] about A new sea level curve
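As the figure caption notes, the curves are made comparable by expressing each relative to a common reference period (here the satellite era). A minimal sketch of that re-referencing with NumPy, using synthetic curves rather than any of the actual reconstructions:

```python
import numpy as np

def rebaseline(years, level_mm, ref_start=1993, ref_end=2013):
    """Shift a sea level curve so its mean over the reference period is zero."""
    years = np.asarray(years)
    level_mm = np.asarray(level_mm, dtype=float)
    ref = (years >= ref_start) & (years <= ref_end)
    return level_mm - level_mm[ref].mean()

# Two hypothetical curves with different arbitrary datums become directly
# comparable once both are referenced to the same period:
years = np.arange(1880, 2014)
curve_a = 0.0015 * (years - 1880) ** 2           # synthetic accelerating rise
curve_b = 0.0015 * (years - 1880) ** 2 + 150.0   # same signal, offset datum
a, b = rebaseline(years, curve_a), rebaseline(years, curve_b)
```

After re-referencing, the constant datum offset between the two curves is gone, which is exactly why reconstructions with different reference frames can be plotted on one axis.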

References

  1. C.C. Hay, E. Morrow, R.E. Kopp, and J.X. Mitrovica, "Probabilistic reanalysis of twentieth-century sea-level rise", Nature, vol. 517, pp. 481-484, 2015. http://dx.doi.org/10.1038/nature14093

Filed Under: Climate Science, Instrumental Record, Oceans

Absolute temperatures and relative anomalies

23 Dec 2014 by Gavin

Most of the images showing the transient changes in global mean temperature (GMT) over the 20th century, and projections out to the 21st, show temperature anomalies. An anomaly is the change in temperature relative to a baseline, which is usually the pre-industrial period or a more recent climatology (1951-1980, 1980-1999, etc.). With very few exceptions, the changes are almost never shown in terms of absolute temperatures. So why is that?

[Read more…] about Absolute temperatures and relative anomalies
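For illustration, here is a minimal sketch (on synthetic data, not any actual station record) of how absolute monthly temperatures are converted to anomalies relative to a 1951-1980 climatology; the choice of baseline shifts the zero line but not the shape of the curve:

```python
import numpy as np

def to_anomalies(years, months, temps_c, base=(1951, 1980)):
    """Subtract the baseline-period mean for each calendar month, removing
    the seasonal cycle along with the absolute offset."""
    years, months = np.asarray(years), np.asarray(months)
    temps_c = np.asarray(temps_c, dtype=float)
    anom = np.empty_like(temps_c)
    in_base = (years >= base[0]) & (years <= base[1])
    for m in range(1, 13):
        sel = months == m
        anom[sel] = temps_c[sel] - temps_c[sel & in_base].mean()
    return anom

# Synthetic absolute temperatures: ~14 degC mean, seasonal cycle, slow warming.
yrs = np.repeat(np.arange(1900, 2001), 12)
mos = np.tile(np.arange(1, 13), 101)
t = 14.0 + 3.0 * np.sin(2 * np.pi * (mos - 1) / 12) + 0.007 * (yrs - 1900)
anom = to_anomalies(yrs, mos, t)
```

The per-month climatology removes the large seasonal swing, which is one practical reason anomalies are preferred: the residual climate signal is far smaller than the absolute values and their seasonal cycle.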

Filed Under: Climate modelling, Climate Science, Instrumental Record

The most popular deceptive climate graph

8 Dec 2014 by Stefan

The “World Climate Widget” from Anthony Watts’ blog is probably the most popular deceptive image among climate “skeptics”. We’ll put it under the microscope and show what it would look like when done properly.

So-called “climate skeptics” deploy an arsenal of misleading graphics with which the human influence on the climate can be downplayed (here are two other examples deconstructed at RealClimate). The image below is especially widespread. It is displayed on many “climate skeptic” websites and is regularly updated.


The “World Climate Widget” of US “climate skeptic” Anthony Watts, with our explanations added. The original can be found on Watts’ blog.

What would a more honest display of temperature, CO2 and sunspots look like? [Read more…] about The most popular deceptive climate graph

Filed Under: Climate Science, Communicating Climate, Instrumental Record, skeptics, Sun-earth connections

Recent global warming trends: significant or paused or what?

4 Dec 2014 by Stefan

As the World Meteorological Organisation (WMO) has just announced that “The year 2014 is on track to be the warmest, or one of the warmest years on record”, it is timely to have a look at recent global temperature changes.

I’m going to use Kevin Cowtan’s nice interactive temperature plotting and trend calculation tool to provide some illustrations. I will be using the HadCRUT4 hybrid data, which have the most sophisticated method to fill data gaps in the Arctic with the help of satellites, but the same basic points can be illustrated with other data just as well.

Let’s start by looking at the full record, which begins in 1979, when the satellite data come online (and not long after global warming really took off).

Fig. 1. Global temperature 1979 to present – monthly values (crosses), 12-month running mean (red line) and linear trend line with uncertainty (blue). [Read more…] about Recent global warming trends: significant or paused or what?
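The blue trend line and its uncertainty range in Fig. 1 come from an ordinary least-squares fit. A minimal sketch of such a fit on synthetic monthly anomalies (a proper analysis, like Cowtan's tool, additionally accounts for autocorrelation in the residuals, which this sketch omits):

```python
import numpy as np

def ols_trend(t_years, y):
    """OLS slope and its standard error (no autocorrelation correction)."""
    t_years = np.asarray(t_years, dtype=float)
    y = np.asarray(y, dtype=float)
    n = t_years.size
    X = np.column_stack([np.ones(n), t_years])       # intercept + time
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (n - 2)                     # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)                # parameter covariance
    return beta[1], np.sqrt(cov[1, 1])               # slope, slope std. error

# Synthetic monthly series: 0.017 degC/yr warming plus weather noise.
rng = np.random.default_rng(0)
t = 1979 + np.arange(12 * 36) / 12.0
y = 0.017 * (t - 1979) + rng.normal(0, 0.1, t.size)
slope, se = ols_trend(t, y)
```

Quoting the slope together with roughly ±2 standard errors is what gives the shaded uncertainty band around a trend line; ignoring autocorrelation, as here, understates that uncertainty for real monthly temperature data.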

Filed Under: Climate Science, Communicating Climate, Instrumental Record, Reporting on climate, skeptics

Ocean heat storage: a particularly lousy policy target + Update

20 Oct 2014 by Stefan

The New York Times, 12 December 2027: After 12 years of debate and negotiation, kicked off in Paris in 2015, world leaders have finally agreed to ditch the goal of limiting global warming to below 2 °C. Instead, they have agreed to the new goal of limiting global ocean heat content to 10²⁴ Joules. The decision was widely welcomed by the science and policy communities as a great step forward. “In the past, the 2 °C goal has allowed some governments to pretend that they are taking serious action to mitigate global warming, when in reality they have achieved almost nothing. I’m sure that this can’t happen again with the new 10²⁴ Joules goal”, said David Victor, a professor of international relations who originally proposed this change back in 2014. And an unnamed senior EU negotiator commented: “Perhaps I shouldn’t say this, but some heads of state had trouble understanding the implications of the 2 °C target; sometimes they even accidentally talked of limiting global warming to 2%. I’m glad that we now have those 10²⁴ Joules which are much easier to grasp for policy makers and the public.”

This fictitious newspaper item is of course absurd and will never become reality, because ocean heat content is unsuited as a climate policy target. Here are three main reasons why. [Read more…] about Ocean heat storage: a particularly lousy policy target + Update

Filed Under: Climate Science, Instrumental Record, IPCC, Oceans

Climate response estimates from Lewis & Curry

6 Oct 2014 by group

Guest commentary from Richard Millar (U. Oxford)

The recent Lewis and Curry study of climate sensitivity estimated from the transient surface temperature record is being lauded as something of a game-changer – but how much of a game-changer is it really?

[Read more…] about Climate response estimates from Lewis & Curry

References

  1. N. Lewis, and J.A. Curry, "The implications for climate sensitivity of AR5 forcing and heat uptake estimates", Climate Dynamics, vol. 45, pp. 1009-1023, 2014. http://dx.doi.org/10.1007/s00382-014-2342-y

Filed Under: Climate modelling, Climate Science, Greenhouse gases, Instrumental Record, IPCC

Limiting global warming to 2 °C – why Victor and Kennel are wrong + update

1 Oct 2014 by Stefan

In a comment in Nature titled Ditch the 2 °C warming goal, political scientist David Victor and retired astrophysicist Charles Kennel advocate just that. But their arguments don’t hold water.

It is clear that the opinion article by Victor & Kennel is meant to be provocative. But even when making allowances for that, the arguments which they present are ill-informed and simply not supported by the facts. The case for limiting global warming to at most 2°C above preindustrial temperatures remains very strong.

Let’s start with an argument that they apparently consider especially important, given that they devote a whole section and a graph to it. They claim:

The scientific basis for the 2 °C goal is tenuous. The planet’s average temperature has barely risen in the past 16 years. [Read more…] about Limiting global warming to 2 °C – why Victor and Kennel are wrong + update

Filed Under: Climate impacts, Climate Science, Instrumental Record, IPCC

IPCC attribution statements redux: A response to Judith Curry

27 Aug 2014 by Gavin

I have written a number of times about the procedure used to attribute recent climate change (here in 2010, in 2012 (about the AR4 statement), and again in 2013 after AR5 was released). For people who want a summary of what the attribution problem is, how we think about the human contributions and why the IPCC reaches the conclusions it does, read those posts instead of this one.

The bottom line is that multiple studies indicate with very strong confidence that human activity is the dominant component in the warming of the last 50 to 60 years, and that our best estimates are that pretty much all of the rise is anthropogenic.



The probability density function for the fraction of warming attributable to human activity (derived from Fig. 10.5 in IPCC AR5). The bulk of the probability is far to the right of the “50%” line, and the peak is around 110%.

If you are still here, I should be clear that this post is focused on a specific claim Judith Curry has recently blogged about supporting a “50-50” attribution (i.e. that trends since the middle of the 20th Century are 50% human-caused, and 50% natural, a position that would center her pdf at 0.5 in the figure above). She has also expressed puzzlement about why other scientists don’t agree with her. Reading over her arguments in detail, I find very little to recommend them, and perhaps the reasoning for this will be interesting for readers. So, here follows a line-by-line commentary on her recent post. Please excuse the length.
[Read more…] about IPCC attribution statements redux: A response to Judith Curry

Filed Under: Climate modelling, Climate Science, Instrumental Record, IPCC

Rossby waves and surface weather extremes

10 Jul 2014 by Stefan

A new study by Screen and Simmonds demonstrates the statistical connection between high-amplitude planetary waves in the atmosphere and extreme weather events on the ground.

Guest post by Dim Coumou

There has been an ongoing debate, both in and outside the scientific community, whether rapid climate change in the Arctic might affect circulation patterns in the mid-latitudes, and thereby possibly the frequency or intensity of extreme weather events. The Arctic has been warming much faster than the rest of the globe (about twice the rate), associated with a rapid decline in sea-ice extent. If parts of the world warm faster than others then of course gradients in the horizontal temperature distribution will change – in this case the equator-to-pole gradient – which then could affect large scale wind patterns.

Several dynamical mechanisms for this have been proposed recently. Francis and Vavrus (GRL 2012) argued that a reduction of the north-south temperature gradient would cause weaker zonal winds (winds blowing west to east) and therefore a slower eastward propagation of Rossby waves. A change in Rossby wave propagation has not yet been detected (Barnes 2013) but this does not mean that it will not change in the future. Slowly-traveling waves (or quasi-stationary waves) would lead to more persistent and therefore more extreme weather. Petoukhov et al (2013) actually showed that several recent high-impact extremes, both heat waves and flooding events, were associated with high-amplitude quasi-stationary waves. [Read more…] about Rossby waves and surface weather extremes

References

  1. J.A. Francis, and S.J. Vavrus, "Evidence linking Arctic amplification to extreme weather in mid‐latitudes", Geophysical Research Letters, vol. 39, 2012. http://dx.doi.org/10.1029/2012GL051000
  2. E.A. Barnes, "Revisiting the evidence linking Arctic amplification to extreme weather in midlatitudes", Geophysical Research Letters, vol. 40, pp. 4734-4739, 2013. http://dx.doi.org/10.1002/grl.50880
  3. V. Petoukhov, S. Rahmstorf, S. Petri, and H.J. Schellnhuber, "Quasiresonant amplification of planetary waves and recent Northern Hemisphere weather extremes", Proceedings of the National Academy of Sciences, vol. 110, pp. 5336-5341, 2013. http://dx.doi.org/10.1073/pnas.1222000110

Filed Under: Arctic and Antarctic, Climate Science, Instrumental Record, statistics

Release of the International Surface Temperature Initiative’s (ISTI’s) Global Land Surface Databank, an expanded set of fundamental surface temperature records

6 Jul 2014 by rasmus

Guest post by Jared Rennie, Cooperative Institute for Climate and Satellites, North Carolina on behalf of the databank working group of the International Surface Temperature Initiative

In the 21st Century, when multi-billion dollar decisions are being made to mitigate and adapt to climate change, society rightly expects openness and transparency in climate science to enable a greater understanding of how climate has changed and how it will continue to change. Arguably the very foundation of our understanding is the observational record. Today a new set of fundamental holdings of land surface air temperature records stretching back deep into the 19th Century has been released as a result of several years of effort by a multinational group of scientists.

The International Surface Temperature Initiative (ISTI) was launched by an international and multi-disciplinary group of scientists in 2010 to improve understanding of the Earth’s climate from the global to the local scale. The Databank Working Group, under the leadership of NOAA’s National Climatic Data Center (NCDC), has produced an innovative data holding that largely builds on existing data sources but also incorporates many previously unavailable sources of surface air temperature. This data holding provides users a way to better track the origin of the data from its collection through its integration. By providing the data at the various stages that lead to the integrated product, by including data-origin tracking flags with information on each observation, and by providing the software used to process all observations, the processes involved in creating the observed fundamental climate record are made as open and transparent as humanly possible.

Databank Architecture

[Figure 1: the six data Stages of the databank, from the original observation to the final quality-controlled and bias-corrected product]

The databank includes six data Stages, starting from the original observation and ending with the final quality-controlled and bias-corrected product (Figure 1). The databank begins with the Stage Zero holdings, which contain scanned images of the original observations. These images are hosted on the databank server when third-party hosting is not possible. Stage One contains digitized data in its native format, as provided by the contributor. No effort is required on the contributor’s part to convert the data into any other format, which reduces the possibility of errors occurring during translation. We collated over 50 sources, ranging from single station records to holdings of several tens of thousands of stations.

Once data are submitted as Stage One, all data are converted into a common Stage Two format. In addition, data provenance flags are added to every observation to provide a history of that particular observation. Stage Two files are maintained in ASCII format, and the code to convert all the sources is provided. After collection and conversion to a common format, the data are then merged into a single, comprehensive Stage Three dataset. The algorithm that performs the merging is described below. Development of the merged dataset is followed by quality control and homogeneity adjustments (Stages Four and Five, respectively). These last two stages are not the responsibility of the Databank Working Group; see the discussion of the broader context below.

Merge Algorithm Description

The following is an overview of the process in which individual Stage Two sources are combined to form a comprehensive Stage Three dataset. A more detailed description can be found in a manuscript accepted and published by Geoscience Data Journal (Rennie et al., 2014).

The algorithm attempts to mimic the decisions an expert analyst would make manually. Given the fractured nature of historical data stewardship, many sources will inevitably contain records for the same station, so a process is needed for identifying and removing duplicate stations, merging some sources to produce a longer station record, and, in other cases, determining when a station should be brought in as a new, distinct record.

The merge process is accomplished in an iterative fashion, starting from the highest priority data source (target) and running progressively through the other sources (candidates). A source hierarchy has been established which prioritizes datasets that have better data provenance, extensive metadata, and long, consistent periods of record. In addition it prioritizes holdings derived from daily data to allow consistency between daily holdings and monthly holdings. Every candidate station read in is compared to all target stations, and one of three possible decisions is made. First, when a station match is found, the candidate station is merged with the target station. Second, if the candidate station is determined to be unique it is added to the target dataset as a new station. Third, the available information is insufficient, conflicting, or ambiguous, and the candidate station is withheld.

Stations are first compared through their metadata to identify matching stations. Four tests are applied: geographic distance, height distance, station name similarity, and when the data record began. Non-missing metrics are then combined to create a metadata metric and it is determined whether to move on to data comparisons, or to withhold the candidate station. If a data comparison is deemed necessary, overlapping data between the target and candidate station is tested for goodness-of-fit using the Index of Agreement (IA). At least five years of overlap are required for a comparison to be made. A lookup table is used to provide two data metrics, the probability of station match (H1) and the probability of station uniqueness (H2). These are then combined with the metadata metric to create posterior metrics of station match and uniqueness. These are used to determine if the station is merged, added as unique, or withheld.
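The data-comparison step can be sketched with Willmott's Index of Agreement, which the merge uses as its goodness-of-fit measure. The decision thresholds below are purely illustrative; the actual lookup tables and the combination with the metadata metric are described in Rennie et al. (2014):

```python
import numpy as np

def index_of_agreement(obs, cand):
    """Willmott's Index of Agreement over the overlap period (1 = perfect)."""
    o = np.asarray(obs, dtype=float)
    p = np.asarray(cand, dtype=float)
    denom = np.sum((np.abs(p - o.mean()) + np.abs(o - o.mean())) ** 2)
    return 1.0 - np.sum((p - o) ** 2) / denom

MIN_OVERLAP_MONTHS = 12 * 5  # at least five years of overlap are required

def overlap_decision(target, candidate):
    """Hypothetical sketch of the three-way decision: merge, unique, withhold.
    The real algorithm converts IA into match/uniqueness probabilities and
    combines them with the metadata metric rather than thresholding IA."""
    if len(target) < MIN_OVERLAP_MONTHS:
        return "withheld"        # too little evidence to compare
    ia = index_of_agreement(target, candidate)
    if ia > 0.8:                 # illustrative threshold only
        return "merge"
    if ia < 0.2:                 # clearly a different record
        return "unique"
    return "withheld"            # ambiguous or conflicting evidence
```

A candidate whose overlapping values track the target closely scores near 1 and is merged; a short overlap is withheld regardless of fit, mirroring the five-year rule described above.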

Stage Three Dataset Description

[Figure 2: map of the more than 32,000 stations in the integrated Stage Three data holding]

The integrated data holding recommended and endorsed by ISTI contains over 32,000 global stations (Figure 2), over four times as many as GHCN-M version 3. Although station coverage varies spatially and temporally, there are adequate stations with decadal and century-long periods of record at local, regional, and global scales. Since 1850, there are consistently more stations in the recommended merge than in GHCN-M (Figure 3). In GHCN-M version 3 there was a significant drop in station numbers around 1990, reflecting its dependence on the decadal World Weather Records collection as a source; this is ameliorated by many of the new sources, which can be updated much more rapidly and will enable better real-time monitoring.

[Figure 3: number of stations over time in the recommended merge compared with GHCN-M version 3]

Many thresholds are used in the merge and can be set by the user before running the merge program. Changing these thresholds can significantly alter the overall result of the program. Changes will also occur when the source priority hierarchy is altered. In order to characterize the uncertainty associated with the merge parameters, seven different variants of the Stage Three product were developed alongside the recommended merge. This uncertainty reflects the importance of data rescue. While a major effort has been undertaken through this initiative, more can be done to include areas that are lacking on both spatial and temporal scales, or lacking maximum and minimum temperature data.

Data Access

Version 1.0.0 of the Global Land Surface Databank has been released and data are provided from a primary ftp site hosted by the Global Observing Systems Information Center (GOSIC) and World Data Center A at NOAA NCDC. The Stage Three dataset has multiple formats, including a format approved by ISTI, a format similar to GHCN-M, and netCDF files adhering to the Climate and Forecast (CF) convention. The data holding is version controlled and will be updated frequently in response to newly discovered data sources and user comments.

All processing code is provided, for openness and transparency. Users are encouraged to experiment with the techniques used in these algorithms. The programs are designed to be modular, so that individuals have the option to develop and implement other methods that may be more robust than described here. We will remain open to releases of new versions should such techniques be constructed and verified.

ISTI’s online directory provides further details on the merging process and other aspects associated with the full development of the databank as well as all of the data and processing code.

We are always looking to increase the completeness and provenance of the holdings. Data submissions are always welcome and strongly encouraged. If you have a lead on a new data source, please contact data.submission@surfacetemperatures.org with any information which may be useful.

The broader context

It is important to stress that the databank is a release of fundamental data holdings – holdings which contain myriad non-climatic artefacts arising from instrument changes, siting changes, time of observation changes etc. To gain maximum value from these improved holdings it is imperative that as a global community we now analyze them in multiple distinct ways to ascertain better estimates of the true evolution of surface temperatures locally, regionally, and globally. Interested analysts are strongly encouraged to develop innovative approaches to the problem.

To help ascertain what works and what doesn’t, the benchmarking working group is developing, and will soon release, a set of analogs to the databank. These will share the space and time sampling of the holdings but contain a set of data issues, known only to the originators, that require removal. When analysts apply their methods to the analogs, we can infer something meaningful about those methods’ performance. Further details are available in a discussion paper under peer review [Willett et al., submitted].

More Information

www.surfacetemperatures.org
ftp://ftp.ncdc.noaa.gov/pub/data/globaldatabank

References
Rennie, J.J. and coauthors, 2014, The International Surface Temperature Initiative Global Land Surface Databank: Monthly Temperature Data Version 1 Release Description and Methods. Accepted, Geoscience Data Journal.

Willett, K. M. et al., submitted, Concepts for benchmarking of homogenisation algorithm performance on the global scale. http://www.geosci-instrum-method-data-syst-discuss.net/4/235/2014/gid-4-235-2014.html

Filed Under: Climate Science, Instrumental Record



Copyright © 2026 · RealClimate is a commentary site on climate science by working climate scientists for the interested public and journalists.