An exercise about meaningful numbers: examples from celestial “attribution studies”

We can divide the above time series into sequences of the same length as the record L&S2011 used to fit their model, and then apply a similar fit to each of these sequences (source code):

The synthetic time series split into 100-yr segments (grey), mimicking the sampling of L&S2011. The red curves show the best-fit of their model to these random series.

The red curves, representing the best fits, are all over the place and differ from one sequence to the next, even though they are all part of the same original time series. But the point here is that we could get similar results for other frequencies, and the amplitude of the fits to the shorter sequences would typically be 4 times greater than a similar fit to the original 10,000-year-long series would give (source code). This is because a whole band of frequencies is present in random, noisy and chaotic data, which brings us back to our initial point: any number or curve can be split into a multitude of different components, most of which will not have any physical meaning.
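
As a rough illustration of this kind of experiment, the sketch below generates a long red-noise series, splits it into 100-year segments, and fits a model of the L&S2011 type to each segment and to the full series. The model form (two sinusoids with 60- and 20-year periods plus a linear trend) and all parameter values here are assumptions made for the sake of the example; the linked source code may differ in detail.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(42)

# Synthetic 10,000-year "climate" series: simple AR(1) red noise,
# i.e. random data with no built-in 20- or 60-year cycles.
n = 10_000
t = np.arange(n, dtype=float)
y = np.zeros(n)
for i in range(1, n):
    y[i] = 0.95 * y[i - 1] + rng.normal(scale=0.1)

# Assumed L&S-style model: 60-yr and 20-yr sinusoids plus a linear trend.
def ls_model(t, a60, p60, a20, p20, b, c):
    return (a60 * np.cos(2 * np.pi * (t - p60) / 60.0)
            + a20 * np.cos(2 * np.pi * (t - p20) / 20.0)
            + b * t + c)

def fit_amplitudes(tt, yy):
    p0 = [0.1, 0.0, 0.1, 0.0, 0.0, 0.0]
    popt, _ = curve_fit(ls_model, tt, yy, p0=p0, maxfev=20_000)
    return abs(popt[0]), abs(popt[2])   # |60-yr| and |20-yr| amplitudes

# Fit each 100-year segment separately, then fit the full series once.
seg_amps = np.array([fit_amplitudes(t[i:i + 100] - t[i], y[i:i + 100])
                     for i in range(0, n, 100)])
full_amps = fit_amplitudes(t, y)

print("mean 60-yr amplitude over 100-yr segments:", seg_amps[:, 0].mean())
print("60-yr amplitude from the full 10,000-yr fit:", full_amps[0])
```

In runs of this kind the segment fits scatter widely, and their fitted amplitudes come out considerably larger than the amplitude obtained from the single fit to the full series, which is the behaviour described above.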

Loehle & Scafetta also assume that the net forcing of the Earth’s climate is a linear combination of solar and anthropogenic forcings. The fact that our climate system involves a series of non-linear processes, as well as non-linear feedback mechanisms that may be influenced by both, suggests that this position may be a bit simplistic – it ignores internal, natural variability, for instance.
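
To spell out the assumption in question: in a linear-combination model the temperature is regressed on the two forcing histories alone, so anything the climate system does on its own must be absorbed by the fitted coefficients or the residual. A minimal sketch, using made-up stand-in series rather than real forcing data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in "forcing" histories (purely illustrative, not real data)
years = np.arange(1850, 2011)
solar = np.sin(2 * np.pi * (years - 1850) / 60.0)   # stand-in solar-related cycle
anthro = (years - 1850) / 160.0                     # stand-in anthropogenic ramp
temp = 0.3 * solar + 0.6 * anthro + rng.normal(scale=0.1, size=years.size)

# The linear-combination assumption: T = a*solar + b*anthro + c, nothing else.
X = np.column_stack([solar, anthro, np.ones_like(solar)])
a, b, c = np.linalg.lstsq(X, temp, rcond=None)[0]
print(a, b, c)

# Internal variability (ENSO-like fluctuations, non-linear feedbacks, etc.)
# has no term of its own here, so it gets forced into a, b, c or the residual.
```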

Some of the basis of L&S2011’s analysis can be traced back to a 2010 paper that Scafetta wrote on the influence of the giant planets on Earth’s climate. I’m not kidding – despite the April 1st joke and the resemblance to astrology – Scafetta claims that there is a 60-year cycle in the climate variations that is caused by the alignment of the gas giants Jupiter and Saturn. This conclusion came only three years after he argued, in 2007, that up to 50% of the warming since 1900 could be explained by 11- and 22-year cycles (a claim which Gavin and I contested in our 2009 paper in JGR).

The Scafetta (2010; S2010) paper presents a spectral analysis of two curves that are fundamentally different in character – to quote Scafetta himself:

Spectral decomposition of the Hadley climate data showed spectra similar to the astronomical record, with a spectral coherence test being highly significant with a 96% confidence level.

The problem is immediately visible in the figure below:

Comparison between spectral analysis and the curves in Scafetta (2010)

For any spectral analysis, it should in principle be possible to carry out the reverse operation and recover the original curve. S2010 presented a figure showing the alleged spectra for the terrestrial global mean temperature and the rate of motion of the solar system’s centre of mass. It is conspicuous when the spectra of two very different-looking curves appear to produce similar peaks.
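
A minimal sketch of this invertibility point, using the discrete Fourier transform as the example (S2010 used MEM rather than an FFT, so this only illustrates the general principle): the complex spectrum keeps both amplitudes and phases, so the original curve can be recovered exactly, and two genuinely different curves cannot share the same full spectrum – at most they can show similar-looking amplitude peaks.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=512)             # any curve at all

# Forward transform: the complex spectrum keeps both amplitude and phase...
X = np.fft.rfft(x)

# ...so the inverse transform recovers the original curve (to rounding error).
x_back = np.fft.irfft(X, n=len(x))
print(np.allclose(x, x_back))        # True

# Matching amplitude peaks alone say nothing about the phases, which is why two
# very different-looking curves can still appear to share "similar" spectral peaks.
```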

So how was the spectral analysis in S2010 carried out? He used the Maximum Entropy Method (MEM), keeping 1000 poles. The tool he used was developed by Michael Ghil and others, and the method is described in one of the chapters of a textbook by Anderson and Willebrand (1996).
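
For orientation, an MEM estimate with p poles is essentially the spectrum of an order-p autoregressive (AR) model fitted to the data. The sketch below is not the Ghil et al. toolbox: it uses a Yule-Walker AR fit from statsmodels and a synthetic series with a built-in 60-unit cycle, simply to show the type of calculation involved.

```python
import numpy as np
from statsmodels.regression.linear_model import yule_walker

def mem_spectrum(x, poles, nfreq=2000):
    """All-pole ('maximum entropy'-style) spectrum from an AR(poles) fit."""
    rho, sigma = yule_walker(x, order=poles, method="mle")
    freqs = np.linspace(1e-4, 0.5, nfreq)            # cycles per time step
    k = np.arange(1, poles + 1)
    # Power ~ sigma^2 / |1 - sum_k rho_k exp(-2*pi*i*f*k)|^2
    denom = np.abs(1.0 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ rho) ** 2
    return freqs, sigma**2 / denom

# A 60-unit cycle buried in noise: a modest pole count already yields a peak
# near a period of 60; the number of poles controls how much (possibly
# spurious) fine structure the estimated spectrum contains.
rng = np.random.default_rng(0)
t = np.arange(600)
x = np.cos(2 * np.pi * t / 60.0) + rng.normal(scale=0.5, size=t.size)
freqs, power = mem_spectrum(x, poles=20)
print("peak at period ~", 1.0 / freqs[np.argmax(power)], "time units")
```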
