# Naturally trendy?

From time to time, there is discussion about whether the recent warming trend is due just to chance. We have heard arguments that a so-called ‘random walk’ can produce similar hikes in temperature (is there any reason why the global mean temperature should behave like the displacement of a molecule in Brownian motion?). The latest in this category of discussions was provided by Cohn and Lins (2005), who in essence pitch statistics against physics. They observe that tests for trends are sensitive to the expectations, i.e. the choice of the *null hypothesis*.
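As a rough illustration of the random-walk argument (my own sketch, with arbitrary thresholds, not taken from Cohn and Lins), a few lines of Python show how an unforced random walk routinely produces trend-like excursions purely by chance:

```python
import random

random.seed(42)

def random_walk(n):
    """Cumulative sum of unit-variance Gaussian steps."""
    x, series = 0.0, []
    for _ in range(n):
        x += random.gauss(0.0, 1.0)
        series.append(x)
    return series

def linear_trend(series):
    """Ordinary least-squares slope of a series against time index."""
    n = len(series)
    t_mean = (n - 1) / 2.0
    y_mean = sum(series) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

# A random walk drifts with no forcing at all: its spread grows like
# sqrt(n), so apparent 'trends' over a century-long record arise by
# chance alone. Count how many of 200 unforced walks show a large slope.
trends = [linear_trend(random_walk(100)) for _ in range(200)]
large = sum(1 for s in trends if abs(s) > 0.02)
print(f"{large} of 200 unforced walks show |slope| > 0.02 per step")
```

The point is not that the climate is such a walk – the post argues it is not – but that a purely statistical null model can generate trends without any physical cause.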

Cohn and Lins argue that *long-term persistence* (LTP) makes standard hypothesis testing difficult. While it is true that statistical tests depend on the underlying assumptions, it is not a given that statistical models such as AR (autoregressive), ARMA, ARIMA, or FARIMA provide an adequate representation of the null distribution. All of these models represent some type of temporal structure, be it as simple as serial correlation, persistence, or more complex recurring patterns. Thus, the choice of model determines what kind of temporal pattern one expects to be present in the process analysed. Although these models tend to be referred to as ‘stochastic models’ (a random number generator usually provides the underlying input for their behaviour), I think this is a misnomer, and that the labels ‘pseudo-stochastic’ or ‘semi-stochastic’ are more appropriate. It is important to keep in mind that these models are not necessarily representative of nature – they are just convenient models which to some degree mimic the empirical data.

In fact, I would argue that all of these models are far inferior to the general circulation models (GCMs) for the study of our climate, and that the most appropriate null distributions are derived from long control simulations performed with such GCMs. The GCMs embody much more physically based information, and provide a physically consistent representation of the radiative balance, energy distribution and dynamical processes in our climate system. No GCM suggests a global mean temperature hike like the one observed unless an enhanced greenhouse effect is taken into account. The question of whether the recent global warming is natural or not belongs to the ‘detection and attribution’ topic in climate research.
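To see why the choice of null model matters for trend testing, here is a small Monte Carlo sketch (my own illustration with arbitrary parameter values, not from the paper): the stronger the persistence in an assumed AR(1) null model, the larger the trend estimates that arise by chance, and hence the harder it is to call an observed trend significant:

```python
import random

random.seed(0)

def ar1(n, phi):
    """AR(1) process: x[t] = phi * x[t-1] + unit-variance white noise."""
    x, series = 0.0, []
    for _ in range(n):
        x = phi * x + random.gauss(0.0, 1.0)
        series.append(x)
    return series

def ols_slope(series):
    """Ordinary least-squares slope of a series against time index."""
    n = len(series)
    t_mean = (n - 1) / 2.0
    y_mean = sum(series) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

def spurious_trend_rate(phi, n=100, trials=500, threshold=0.01):
    """Fraction of unforced AR(1) series whose fitted |slope| exceeds a threshold."""
    slopes = [abs(ols_slope(ar1(n, phi))) for _ in range(trials)]
    return sum(s > threshold for s in slopes) / trials

# Stronger persistence (larger phi) fattens the null distribution of trend
# estimates, so the same observed slope looks less 'significant'.
for phi in (0.0, 0.5, 0.9):
    print(f"phi={phi}: spurious-trend rate = {spurious_trend_rate(phi):.2f}")
```

This is exactly why the post argues the null distribution should come from physics (GCM control runs) rather than from whichever statistical model one happens to pick.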

One difficulty with the notion that the global mean temperature behaves like a random walk is that it would then imply a more unstable system, with hikes similar to the one we now observe occurring throughout our history. However, the indications are that the historical climate has been fairly stable. An even more serious problem with Cohn and Lins’ paper, as well as with the random walk notion, is that a hike in the global surface temperature would have physical implications – be they energetic (Stefan-Boltzmann, heat budget) or dynamic (vertical stability, circulation). In fact, one may wonder whether an underlying assumption of stochastic behaviour is representative, since, after all, the laws of physics seem to rule our universe. On the very smallest scales, processes obey quantum physics and events are stochastic. Nevertheless, the probability of their position or occurrence is determined by a set of rules (e.g. the Schrödinger equation). Thus, on a macroscopic scale, nature follows a set of physical laws, as a consequence of the way the probabilities are determined. After all, changes in the global mean temperature of a planet must be consistent with its energy budget.
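The energy-budget constraint can be made concrete with a back-of-the-envelope Stefan-Boltzmann calculation (the 255 K effective emission temperature and the 0.6 K warming below are illustrative round numbers, not figures from this post):

```python
# Back-of-the-envelope: what a temperature hike implies for the energy budget.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def outgoing_flux(t_kelvin):
    """Blackbody flux emitted at temperature t (Stefan-Boltzmann law)."""
    return SIGMA * t_kelvin ** 4

# Assumed effective emission temperature of the Earth (~255 K) and an
# illustrative 0.6 K warming. A warmer planet must radiate more, so a
# sustained temperature rise requires a compensating change in the budget.
t0, dt = 255.0, 0.6
extra = outgoing_flux(t0 + dt) - outgoing_flux(t0)
print(f"Extra outgoing flux for a {dt} K rise: {extra:.2f} W m^-2")
```

In other words, a temperature hike cannot simply 'happen' the way a random-walk step does: roughly an extra couple of watts per square metre have to come from, or go, somewhere.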
