I have already used the ERA-Interim data for a study on the influence of the Saharan Air Layer (SAL) on hurricane development. I originally started the project with GFS analyses/forecasts, but found that ERA-Interim provides a much better set of initial conditions. A comparison of WRF initialized with ERA-Interim and dropsonde data versus WRF initialized with GFS and dropsonde data showed that the ERA-Interim initialization produced a more realistic hurricane than the GFS initialization did. There were a few hiccups when we first started using ERA-Interim, but the extra work was worth the effort in producing a realistic storm.
This interests me greatly. My agency models salmon run sizes and migration behaviour for fisheries management purposes. Our tools have been changing over time (changes in fishing gear, hydroacoustic technology, stock identification methods, etc.). We are struggling greatly to learn about the fish over time (comparisons across years) precisely because our assessment tools have been changing over time. Despite this confounding influence we dare not throw up our hands in frustration because, well, we need to try to inform fisheries decisions somehow. We have a further confounding influence: unlike physical principles, the fish themselves could have evolved over the period. Our solution to date has been to keep restricting our analyses to recent time periods, so we suffer chronically from low sample sizes in these efforts. Perhaps investigation of these reanalysis approaches can be informative for us. Thanks for posting about it.
One objective of this (his?) new WCRP is to “help improve and promote sound data stewardship, including data archiving, management, and access. This includes making sure that climate-related data variables are reaching data archives, and that standards are set for archiving new types of data. Help make data accessible and available e.g., through the internet. Promote shared efforts for data quality control.”
The Climate establishment knows all is not well and this looks like their new project. A confession?
Thanks for a generally accurate description of reanalysis, its advantages, and pitfalls. One important benefit of reanalysis that you didn’t mention is that it can provide very useful feedback about the quality of the input observations themselves. For example, the 20th Century Reanalysis includes many surface pressure observations that have never been used before. The time series of residuals at station locations are now available for statistical analysis. Shifts and sudden changes in the series may indicate possible problems with the data. These can be checked against station information to account for changes in measurement method, errors in station height, etc. In a similar way, errors in the radiosonde temperature record have been identified using residuals from the ERA-40 reanalysis (Haimberger 2007, http://journals.ametsoc.org/doi/abs/10.1175/JCLI4050.1).
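The residual screening described above can be sketched in a few lines. Below is a minimal, hypothetical example: given a time series of observation-minus-reanalysis residuals at one station, it scans every candidate break point and picks the split that maximizes a two-sample t-like statistic. Real homogenization work (e.g., Haimberger 2007) is far more sophisticated; the `detect_shift` helper and the synthetic 1.5 hPa jump are illustrative assumptions, not part of any actual reanalysis pipeline.

```python
import numpy as np

def detect_shift(residuals):
    """Find the index that best splits a residual series into two
    segments with different means (a crude break-point detector).
    Returns (break_index, estimated_mean_shift)."""
    n = len(residuals)
    best_idx, best_score = 2, -np.inf
    for k in range(2, n - 2):  # require at least 2 points per segment
        left, right = residuals[:k], residuals[k:]
        # mean shift scaled by the pooled standard error of the two segments
        shift = right.mean() - left.mean()
        spread = np.sqrt(left.var(ddof=1) / k + right.var(ddof=1) / (n - k))
        score = abs(shift) / spread
        if score > best_score:
            best_score, best_idx = score, k
    return best_idx, residuals[best_idx:].mean() - residuals[:best_idx].mean()

# Synthetic example: surface-pressure residuals that jump by 1.5 hPa at
# index 120, mimicking an undocumented change in measurement method.
rng = np.random.default_rng(0)
series = rng.normal(0.0, 0.5, 200)
series[120:] += 1.5
idx, shift = detect_shift(series)
print(idx, round(shift, 2))
```

A flagged break like this would then be checked against station metadata (instrument changes, errors in station height, relocations) before any correction is applied.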
Reanalyses were originally developed in NWP centers in the early 80s, to help them improve their forecasting systems. Since then we have seen a very positive feedback cycle: improved observations, improved reanalyses, improved models, etc. Science at its best.