This month’s open thread for climate science discussions.
Michael Doliner says:
2 Mar 2011 at 10:53 AM
Heat (energy) from the Sun may be stored in the photosynthesis process or in other chemical processes.
So, say we calculate all the stored energy released by burning of fossil fuels.
How does the energy released/day compare with the energy received each day from the Sun?
ccpo #97, re: Snapple’s Russian video with Prof. Kirpotin,
Briefly, it goes something like this (caveat — my Russian’s rusty — corrections welcome):
They first talk about the “Rusalka” (Mermaid) experiment/instrument on the International Space Station, which maps CO2 and methane in the atmosphere to get a better idea of the sources. It is said to be unique in the world.
Venus graphics: The narrator says that, if the buildup of greenhouse gases continues, Earth is predicted to turn into a second Venus. (Ouch.)
Cut to Kirpotin, who talks about permafrost melt in Northwest Siberia. He describes a local albedo feedback: the ground used to be covered by snowy white lichens, which give way to browner, darker, more light-absorbing patches as the ground thaws. Above some critical point, he says, the processes of darkening and thawing become self-reinforcing. While peat bogs in the south play a positive role as carbon sinks, the northwest Siberian bogs have become methane emitters, increasing atmospheric concentrations.
The narrator says Siberia has warmed 3 °C in just 40 years. Economic effects are mentioned: the narrator mentions stormier weather, and Kirpotin explains how the winter roads on which the gas industry depends are lasting shorter and shorter. It ends with the narrator wondering aloud whether the main culprit is man or nature (ouch), and saying that scientists disagree, but stressing that climate change is real and GHGs are increasing.
Something like that, anyway. Pretty good reporting, I think, except for the Venus scenario and the bit of faux balance at the end.
If you want background, Google Scholar for Kirpotin + permafrost, thermokarst, methane, or similar should do it.
Did we have nuclear winter in the 50’s?
Fujii JASTP 2011
CCPO–I have posted the translation below–more or less. I don’t know every word, but I get the idea.
Here is a post that explains the video and has a Russian explanation at the bottom of my post. This satellite is called Rusalka–Mermaid. It’s a Russian fairy-tale character and an acronym–see my post for details and documentation.
This post also has Kirpotin in a different video, with English subtitles, discussing the permafrost. If I ever get to Tomsk, I will be sure to look him up!
Here is a translation of the gist of the video about Rusalka–Google translation from Vesti, which had the same video (more or less).
BEGIN QUOTE (or see this on my post with Google translation into English):
An experiment with the fairy-tale name "Mermaid" has begun on the International Space Station. Its purpose: to measure greenhouse gases in the Earth's atmosphere. Methane, carbon dioxide and water vapor have a real impact on global warming.
"Mermaid" is an acronym. The device's full name is Hand-held Spectrum Analyzer of Atmospheric Components. An exact copy is now in orbit; with its help, cosmonauts will photograph the Earth's surface. The Institute for Space Research says "Mermaid" has no analogs anywhere in the world.
"The problem is not to measure the total CO2 content; that is already measured. The task is to identify the sources and origins of these gases," explains Alexander Trokhimovskii, technical manager of "Mermaid."
A similar device today hovers above Venus. There, greenhouse gases are present in excess, a real cosmic "bathhouse": a thick veil of clouds and temperatures of hundreds of degrees. According to forecasts, if the accumulation of greenhouse gases in Earth's atmosphere does not stop, our planet could become a second Venus.
"Until recently the icy surface of the swamp was covered in very light, almost white lichen. We assumed that as a result of climate warming the dark brown surfaces, and the lakes, which also absorb the sun's rays, would keep increasing. And there is a certain critical threshold of dark brown surface at which the thawing of the permafrost begins to spur itself on. The darker it becomes, the faster the melting," says Sergei Kirpotin, Sc.D., vice-rector of Tomsk State University, describing the mechanisms feeding the greenhouse effect. "You can see that on this surface, originally white and occupied by lichens that reflect the sun's rays like snow, young lakes are beginning to emerge rapidly."
Biologists at Tomsk State University have long been alarmed. Global warming is not a fictional story. Examples are underfoot: the swamps. In Western Siberia they cover almost half the territory, including the largest swamp in the world, the Vasyugan.
Here it is, the heart of Siberia: the swamps. Without special protection from mosquitoes, and without swamp waders, there is nothing to do here. Before we set out we were warned: walk only on the grass in the center of the trail; step off it and the bog will pull you under.
Marshes play a major role in regulating the climate. Scientists call them "refrigerators" or "coolers." Plants absorb carbon dioxide from the atmosphere and turn into peat, and the greenhouse gas is "preserved." In the south of Western Siberia the bogs still "live" by this scenario, but in the north, in the permafrost zone, the situation has recently changed for the worse.
"So I jump, and gas begins to separate out: methane, a very strong greenhouse gas," says Sergei Kirpotin, experimenting in the middle of the swamp. "As a result of the thawing of permafrost peatlands now under way in this area, large amounts of methane are beginning to escape into the atmosphere. Unfortunately, this gas is about 30 times more powerful a greenhouse gas than carbon dioxide. Today two processes sit on the scales: the positive role of the wetlands in the south, as air coolers, and their negative role in the north of the West Siberian Plain."
Local environmentalists mount expeditions to the swamps several times a year. According to their observations, the melting is proceeding so rapidly that it is impossible to stop. Siberia is warming faster than anywhere else on the planet: over forty years the temperature has risen by three degrees. Violent storms that never used to occur here now appear in the weather reports; many link them, too, with global warming. Those three degrees also damage the economy.
"Today, in some cases, the winter roads only begin to operate from late December, or even after the New Year. Can you imagine what comes next, over time? This is a huge loss for the oil and gas industry, a significant part of whose activity depends precisely on such winter roads," says the rector of Tomsk State University, tallying the region's economic losses from global warming.
Who is the main culprit of climate change on Earth, the main producer of greenhouse gases: man with his plants and machinery, or the planet itself? It is known that climate change is a natural process for the Earth; history has seen both glacial periods and tropical heat. Should we worry? Scientists' opinions diverge. But one thing is certain: the content of greenhouse gases in the atmosphere is increasing, and the planet's climate is changing.
I think they have to say that wishy-washy bit at the end because global warming is politically sensitive because of the powerful gas industry. I have read a lot of Kirpotin, and he isn’t wishy-washy about what is happening.
He was the one Russian scientist who said that Climategate was a provocation. He didn’t say who might have been responsible.
Here is a really good video of Sergei Kirpotin explaining his ideas and concerns with English subtitles.
This is Russian Greenpeace. There are more of him on the right, I think. Search “Sergei Kirpotin” or “Сергей Кирпотин”.
> So, say we calculate all the stored energy released by burning of fossil fuels.
> How does the energy released/day compare with the energy received each day from the Sun?
Minuscule — four orders of magnitude less. The problem isn’t the heat energy from the burning of fossil fuels. It’s the added CO2, which means the world has to get warmer to radiate away as much solar energy as it receives. A doubling of CO2 gives a nearly 4 W/m2 forcing. The greenhouse heating from fossil-fuel emissions is two orders of magnitude greater than the direct heat of their combustion.
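For anyone who wants to check the orders of magnitude, here is a back-of-envelope sketch. The global fossil-energy figure (~5×10^20 J/yr) and the albedo are round numbers I am assuming, not values from the thread:

```python
import math

# Assumed round numbers: ~5e20 J/yr primary fossil energy consumption,
# solar constant 1361 W/m^2, Earth radius 6.371e6 m, Bond albedo 0.3.
fossil_heat_W = 5e20 / (365.25 * 86400)        # direct combustion heat, ~1.6e13 W
cross_section_m2 = math.pi * 6.371e6 ** 2      # disc of Earth intercepting sunlight
solar_absorbed_W = 1361 * cross_section_m2 * (1 - 0.3)

ratio = solar_absorbed_W / fossil_heat_W
print(f"absorbed sunlight / combustion heat ~ {ratio:.0f}")   # ~4 orders of magnitude

# Standard logarithmic fit for CO2 forcing: dF = 5.35 * ln(C/C0)
dF_2xCO2 = 5.35 * math.log(2)
print(f"2xCO2 forcing ~ {dF_2xCO2:.1f} W/m^2")
```

The same arithmetic also shows why the added-CO2 forcing, applied over the whole planet, dwarfs the combustion heat itself.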
The upcoming results of the BEST Project will be interesting to see.
Whether or not the BEST Project confirms the results of previous GST time-series analyses, the BEST team promises to be more open than the providers of past analyses. Their clear message implies openness with the data, methodology, code, and related documents, without the need for FOIs or diversion by institutional and/or bureaucratic data and code keepers. We shall see soon enough. If the promise of being more open than past analyses is kept, that by itself is a step forward for GST time-series science, independent of the results.
John Whitman: How can you be more open than publishing the entire dataset and codebase? How can you make data turn around and say the opposite of what it represents?
I think these are far more relevant questions than yours.
All this self-serving nonsense about “openness” makes me sick. Hypocrites, begone!
It’s all just so tiresome. When BEST fails to produce anything of merit, the deniers will dream up some other project, all in the name of delay, delay, delay. Oh, sorry. In the name of “openness and sweetness and light and keeping the bad, bad scientist all honest and nice and open and nice and being absolutely certain and maybe we’re still just not quite certain and maybe not quite open enough and maybe we should try again because maybe we got the wrong answer the last dozen times and maybe we can drag this out just a little bit longer because open open maybe open open honest barf squiggle”.
[Response: Yes, it’s a completely skewed and false representation of the issue. You can take any science and make all kinds of noise about how important data are “hidden”, and furthermore, when that ruse is used up, you can formulate all manner of accusations about “data quality” issues, and then of course there’s all the supposed analytical issues; and round and round you go…Jim ]
Well, one important thing to bear in mind. It only takes a very short time (months) to analyse the global surface temperature record. If BEST takes longer than this to publish something preliminary, then they are delaying (much like that surface stations malarkey).
8 Mar 2011 at 2:19 AM
“I think these are far more relevant questions than yours.”
– – – – – –
Thanks for your response.
We will see very soon, by direct comparison, whether the openness of the BEST Project compares favorably with previous approaches to GST time-series analyses/products. We can each do our own evaluation of openness independently. If the BEST Project team does establish a new benchmark, then all previous teams producing GST time-series analyses/products will benefit. We live in interesting times, because these possibilities could advance climate science beyond its current state.
Kent Clizbe, that "ex-CIA operative" who emailed Dr. Mann's co-workers and encouraged them to denounce him, has written an article on his blog smearing Dr. Christopher Field. [Clizbe is off-message because the real CIA has a Center for Climate Change and National Security which is reportedly led by Larry Kobayashi.]
Now I see that Dr. Field is testifying in Congress.
Perhaps it is no coincidence that Dr. Field was vilified in very crude language (“his snout in the trough,” and “sucking off…the government teats”) right before his testimony. This is the kind of kompromat I am used to seeing in the Russian/Soviet media when a scientist is attacked. It reminds me of how they attacked Sakharov. It reminds me of how the Soviets accused the Pentagon scientists of making AIDS.
If I ran the CIA, I would make a statement that Clizbe’s views should not be associated with the CIA.
Kent Clizbe (2-22-11) states on his blog post:
“An initial review of Field’s background and snout in the trough seems identical to any number of “climate scientists” (Field’s scientific background is Biological Sciences. His PhD research was on Leaf Aging in a California Shrub) sucking off the National Academy of Sciences, NASA, NOAA, and other government teats.”
Kent Clizbe seems to be implying that Dr. Field doesn’t have the credentials to be a climate scientist, but scientists who study climate change come from many different fields.
Good CIA officers check their facts. The former spy Kent Clizbe doesn’t even manage to ferret out the fact that the National Academy of Sciences is not a federal agency. It is a federally-chartered corporation that advises federal agencies.
A revolution is underway according to Roger Pielke Sr. New research has been published which “may change how we view the climate system”. A bold claim indeed. Where has this incredibly important, paradigm shifting research been published? Er, on Judy Curry’s blog. Riiiiiight.
This follows hot on the heels of Roger’s entirely ignored attempt to completely redefine climate sensitivity.
When Dr. Santer pointed out to McIntyre that raw data were freely available online, the latter wanted intermediate results–and issued FOIAs to obtain them, along with two years worth of emails.
When GISS provides adjusted temperature data, intermediate results, basically, D'Aleo and Watts complain that they need the raw data.
In both cases, the methodology used to calculate the intermediate results is known.
Really, what more needs to be said? This is not about ‘openness,’ as many of the recipients of this harassment have long concluded.
Anyone who wants two years’ worth of someone else’s emails should be willing to make two years’ worth of their own emails public, IMO. The process of making emails public from only one side is inherently biased.
We will see very soon by direct comparison whether the openness behavior of the BEST Project compares favorably to previous openness approaches to GST time series analyses/products.
Spoken like someone who parrots what one has heard, rather than checked for himself. You were given links to code and data.
The BEST project people will, if they use that tiny portion of national records made available only under license by the various countries involved, have to obey the terms of that license just like CRU. Yet I’m sure the denialist community will give them a free pass if they do.
Something like 95% of the data used by CRU is available at GHCN, and GISTemp only uses the publicly available data. If you want the raw data from GHCN, you can acquire DVDs of scans of the original paper documents – handy if you think the conspiracy runs so deep that people have been altering the data when it’s transcribed from paper to data files.
The source to GISTemp is available, has been rewritten in Python by interested software engineers (and given back to NASA), and several independent people have applied their own statistical analysis to the data (coming up with essentially the same trends as GISTemp, in the process).
Will BEST compare favorably? Well, if you mean … will BEST be *as open* as NASA GISS, then that’s an open question. I haven’t seen the BEST people promise to release their source code (as NASA does), for instance – just the data and algorithm descriptions.
BEST probably *will* compare favorably to the denialist’s favorite temperature reconstruction – Christy and Spencer’s UAH satellite reconstruction efforts. So does GISTemp.
Because Spencer and Christy haven’t open-sourced their code.
Here’s a question: why aren’t you attacking them for their lack of openness?
There is a lot of coal in Western China:
Peabody is helping them to extract it.
Came across an article in Quadrant:
"The intelligent voter's guide to global warming"
It looks like the effort of a scientifically literate person to find reasons to believe that climate sensitivity is low, when his real reasons are ideological. Something the intelligent voter should not be fooled by, but it could easily fool someone who knows a fair amount of science and has not looked at this problem in detail.
#68 Mike, you might want to be careful about some of Tamino's predictions. If memory serves me right, some time ago he looked at the Central England data series and predicted an increasing temperature trend.
However, looking at more recent CET (Ave1) data, and using filters at three different time spans (10, 20 & 50 years, or 0.1, 0.05 & 0.02 cycles/yr), a different pattern is seen. Over roughly the past 10 years, the CET series seems to be fairly flat, or even trending down.
Here is my computed average of the 5 data sets used by Tamino, covering only the time span from 1850. This used a 10-year (0.1 cycles/yr) moving average, a 2-pole Chebyshev "filtfilt", and a Fourier convolution filter.
It has a couple of interesting features, including a flattening over roughly the last 8 years. This latest apparent dip is similar to the one in the 1940s, and, with some longer-period filters, to the one in the 1875-1885 period.
Very interesting paper on melting ice sheets:
Melting Ice Sheets Now Largest Contributor to Sea Level Rise
I really hope we will be hearing more about this.
J Bob, # 118: Could you supply a working link for that Tamino post about central England temperatures? (BTW, I think Tamino’s blog history underwent some kind of reorganization a while back, as some of my casual attempts to find old posts have failed. Offhand, not sure what’s up with that.)
Tamino usually analyzes past data, and is sparing with predictions.
Even for the whole globe, 10 years is a very short time in the surface temperature record. Absolutely no one (with any sense) would conclude anything important from your “fairly flat” claim about CET for 10 years, over such a tiny area.
Back from the Dead: Lost Open Mind Posts
The spam filter’s blocking my attempt to post the actual link to the archive copy of the CET topic (which link isn’t working right now anyhow). Try there.
Re: #120 (Ric Merrit)
I hope the previous comment helps you find the post.
Re: #118 (J. Bob)
No, memory does not serve you right. I made no prediction. The closest I came was a statement starting with “If this trend continues, then …” You seem to have very conveniently ignored the “if” part.
As for your “analysis,” in my humble opinion it’s worthless. You’re just another of the “fun-with-Fourier” people who ignores the science called statistics. In fact I did a post about your (and others’) approach, which I refer to as mathturbation.
There is a draft paper available that considers an exponential form for the ice sheet mass loss, rather than quadratic, using GRACE data. http://www.columbia.edu/~jeh1/mailings/2011/20110118_MilankovicPaper.pdf
I don’t think there is a huge difference in 2050 compared to the article you link to, but there might be a substantial difference around 2100. If a logistic growth model is applicable, the exponential form for early sea level rise may be better than the quadratic form.
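To see why the choice of functional form matters mostly for the far extrapolation, here is a sketch on purely synthetic numbers (nothing here is from GRACE or from the draft paper): a quadratic and an exponential fitted to the same short record agree within the record, then diverge hugely by 2100.

```python
import numpy as np

# Synthetic illustration only: a short "GRACE-era" window of mass-loss rates.
years = np.arange(2003.0, 2012.0)
t = years - 2003.0
loss = 200.0 * np.exp(0.1 * t)             # Gt/yr, synthetic "truth"

quad_coef = np.polyfit(t, loss, 2)         # quadratic least-squares fit
b, log_a = np.polyfit(t, np.log(loss), 1)  # log-linear (exponential) fit

def quad_pred(tt):
    return np.polyval(quad_coef, tt)

def exp_pred(tt):
    return np.exp(log_a + b * tt)

for year in (2050, 2100):
    tt = year - 2003.0
    print(year, round(float(quad_pred(tt))), round(float(exp_pred(tt))))
# Within the fitted window the two curves are nearly identical; by 2100
# the exponential sits orders of magnitude above the quadratic.
```

Which form is right is a physics question (logistic growth, ice dynamics), not something the short record alone can decide.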
more progress in making butanol from cellulose:
Jim Bouldin, Witsend wanted you to see this bit of news about ozone.
The skeptic site Climate4you publishes this graph of average OLR and asks why OLR does not decrease in response to increasing CO2. While Huang and Ramaswamy (2009) show a more complex picture, I am somewhat surprised that there isn't a trend in OLR since 1974. What is wrong with this picture?
[Response: Link doesn’t work. But there is no measuring device over that time period that can give a reliable trend – so depending on where the data comes from it will have issues. – gavin]
Thank you for the link. I was just wondering why Hansen et al. used an exponential fit rather than a polynomial fit for the GRACE data. As you mention, it could make a big difference in the long run.
I understand that ice sheets are not well modelled yet, and that the GRACE data series is too short for definitive arguments.
I think the paper gave some arguments for this model.
Am I right in understanding Hansen's main argument for (a potential) exponential mass loss to be feedbacks from albedo flip on the borders of the ice sheet, the lowering of the ice surface, and growing forcing from higher sea and air temperatures?
I have a friend who gets all his climate science from The Spectator and is always going on about Vikings in Greenland. I know this is complete BS and I know I’ve asked this before, but is there a site somewhere that explains why this is BS? I can’t recall whether it’s BS because the Vikings didn’t have much in the way of agriculture there anyway, or because the melt in Greenland now is much more extensive than in the MWP, or what it was.
Sorry, the link is here. The site claims the data are from NOAA. This NOAA page says temporal coverage starts 1974/06.
[Response: it’s almost certainly from the NCEP reanalysis. The trends are corrupted by changes in the observing network and uncorrected biases in obs which makes these trends untrustworthy. If you look at the ERA-Interim, I’m sure it would look very different. – gavin]
Fossil Bird Study Describes Ripple Effect of Extinction in Animal Kingdom
Tamino, since your post was not up, I didn't catch the "if" clause. But does that mean you predicted the upward temperature trend?
However, your comment about "'fun-with-Fourier' people who ignore the science called statistics" is most interesting. Enclosed is a comparison graph between Empirical Mode Decomposition (EMD) and Fourier filtering, using the CRU data set from 1856 to 2004. In my humble opinion it's a pretty good match, considering the EMD method is supposed to work for non-linear and non-stationary systems.
Since you seem to be into statistics, why don't we add a little "stats" to the Fourier filtering process? Using the HadCRUT3gl data set with three types of low-pass filters, the moving average, the forward-reverse "filtfilt" 2-pole Chebyshev, and the Fourier convolution filter (Fc = 0.1 cycles/year), one gets the upper graph in the picture below.
Looking at the lower graph, the difference (cyan curve) between the raw and Fourier-filtered data, the filtered curve is pretty well centered in the middle of the input data. And since you're into statistics, the mean and standard deviation of this difference are labeled. Assuming a Gaussian distribution, that would give a 3-sigma filter error of about 0.32 deg.
In my humble estimation, it does a very good job estimating the longer-term input data. Maybe you could do better?
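For readers wondering what the three filters being argued about actually do, here is a sketch on synthetic data (not the real HadCRUT3 series); the 0.1 cycles/yr cutoff matches the one quoted above:

```python
import numpy as np
from scipy.signal import cheby1, filtfilt

# Synthetic "annual temperature" series: slow trend + 60-yr cycle + noise.
rng = np.random.default_rng(0)
t = np.arange(150)
x = 0.005 * t + 0.2 * np.sin(2 * np.pi * t / 60.0) + 0.1 * rng.standard_normal(150)

# 1) Simple 10-point moving average (centered).
mov = np.convolve(x, np.ones(10) / 10.0, mode="same")

# 2) 2-pole Chebyshev type-I low-pass at 0.1 cycles/yr, run forward and
#    backward ("filtfilt") so the result has zero phase shift.
b, a = cheby1(2, 0.5, 0.1, btype="low", fs=1.0)
cheb = filtfilt(b, a, x)

# 3) Fourier "convolution" filter: zero every frequency above the cutoff.
X = np.fft.rfft(x)
freqs = np.fft.rfftfreq(x.size, d=1.0)
X[freqs > 0.1] = 0.0
fourier = np.fft.irfft(X, n=x.size)

# All three track the slow components; they differ most near the endpoints,
# which is exactly where any extrapolation claim is least trustworthy.
```

None of this bears on whether a smoothed curve has any predictive meaning, which is the point in dispute downthread.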
Re: #131 (J. Bob)
The truth comes out: you made claims about what I was supposed to have “predicted” without bothering to read what I said.
By all indications, your statistical knowledge is confined to “In my humble opinion, it’s a pretty good match” and computing a mean and standard deviation.
As for doing better, with four parameters I can fit an elephant and with five I can make him wiggle his trunk.
In my humble opinion, you are not educatable.
I also tried to link directly to the archived Central England Temperature post, only to have it hung out to dry by the Spam filter. Alas…
JBob would do well to read up on the D-K effect, because as it now stands he is not even wrong.
Hansen used albedo flip to neatly explain the timing of Milankovitch cycles, with glacial retreat tied more to spring than to summer NH warming. And it is a positive feedback, but his main argument so far for a short doubling time in ice sheet mass loss is the paleo-record of very rapid increases in sea level. And he has argued that there is plenty of forcing to melt ice sheets rapidly.
There has been recent work on heat transfer from the surface of ice sheets to the interior on a decade timescale that might set a doubling time for mass loss owing to the changed structural integrity of the ice. http://adsabs.harvard.edu/abs/2010AGUFM.C11B..02R
But, I have not seen Hansen reference this work though I think he has something similar but less worked out in mind.
it would appear our Unforced Variations conversation has been relegated to the borehole. Still waiting for your graph.
I knew it had to be somewhere in the ‘tubes:
“What got my attention was a 2010 paper (I usually stay away from physics) titled “Drawing an elephant with four complex parameters” where the authors decided to try to reconstruct an elephant with four complex numbers and wiggle its trunk with a fifth ! This was attempted earlier in 1975 with least-squares Fourier series fitting but apparently that took 30 parameters. The figures in the paper show that they managed to do something close. http://2.bp.blogspot.com/-Xr2YfZrYDMM/TWdnBa6VzDI/AAAAAAAAD1M/cbOMZHKXQ8Q/s200/4points.png
My elephant clearly beats the math
I decided to write up a program and look at it for myself and have been able to get a passable elephant only with 8 parameters and that is 8 sets of 4 real numbers (a b c and d of Kuhl and Giardina which can be considered as two complex numbers) and so not as good as the results of the authors of the 2010 paper. Maybe my elephant is too complex for the math. In case you want to play around with the idea download (Windows unfortunately, but works in Wine) the small program that I wrote today from here – you can also find Matlab code here.” [working links in the original blog page]
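For anyone who wants to see the four-parameter elephant without downloading anything, here is a sketch using the truncated Fourier series as it is commonly quoted from the Mayer, Khairy and Howard (2010) paper; treat the exact coefficients as my assumption rather than something stated in this thread:

```python
import numpy as np

t = np.linspace(0.0, 2.0 * np.pi, 500)

# Real and imaginary parts of the four complex parameters feed x(t) and y(t)
# respectively (coefficients as commonly reproduced from the 2010 paper).
x = -60.0 * np.cos(t) + 30.0 * np.sin(t) - 8.0 * np.sin(2 * t) + 10.0 * np.sin(3 * t)
y = 50.0 * np.sin(t) + 18.0 * np.sin(2 * t) - 12.0 * np.cos(3 * t) + 14.0 * np.cos(5 * t)

# The fifth parameter "wiggles the trunk" by shifting one extra anchor point.
# Plot (x, y) with any plotting package to see the outline.
```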
To J. Bob: You just don’t get it. It’s well said that you’re “not even wrong.”
To those who hope to understand time series data:
Time series analysis is not a contest to see who can fit a smooth curve to the data which leaves the smallest residuals. Any child can take that process to its limit, simply by fitting some curve with enough parameters to match the data exactly. The residuals are all zero but obviously such a curve is meaningless, a classic case of “overfitting.” It’s not a contest to minimize the residuals within some constraint of what one considers “smoothness” (like the cutoff frequency for a low-pass filter). Beware exercises in indulgent curve-fitting, aptly named “mathturbation.”
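The overfitting warning above can be made concrete with a toy example (entirely synthetic data): ten noisy points, a degree-9 polynomial through them, zero residuals, and a uselessly wild value one step outside the data.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(10.0)
y = 0.1 * t + rng.standard_normal(10)       # weak trend buried in noise

coef = np.polyfit(t, y, 9)                  # 10 parameters for 10 points
residuals = y - np.polyval(coef, t)
print(np.max(np.abs(residuals)))            # essentially zero: a "perfect" fit

# One step beyond the data, the "perfect" fit is already absurd compared
# with the true underlying trend value of 0.1 * 11 = 1.1.
print(np.polyval(coef, 11.0))
```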
Nor is it an excuse to indulge in confirmation bias, contorting the data to resemble one’s preconception of what the result should “look like.” Alas, this is all too common a sin (even among the otherwise highly qualified), usually committed by those who wish to proclaim that a trend has turned in some favored direction.
It’s a sure sign of ignorance (and often of insecurity) to brandish technical terminology like a weapon when, to those in the know, it’s obvious one is trying to tighten a screw with a drill bit. Especially in a forum in which many or most are not mathematicians or statisticians, showing off how many filters you can name and referring to a simple moving average as “MOV” rather than “a simple moving average,” is the sort of arrant pedantry up with which we should not put. Be not impressed, intimidated, or persuaded by such displays. Real genius has a gift for clarity, a prize which is rightly cherished both for its beauty and its rarity.
If you want to do time series analysis, take the time to learn time series analysis. Familiarity with signal processing, and reading “How to Lie with Statistics” (however entertaining and informative it may be), do not suffice. Show some respect (and humility) approaching the subject. It’s not something you can learn overnight, getting good at it takes longer still, and no, you can’t just fly by the seat of your pants based on your incredible genius. Work at it, you may be well rewarded. And believe it or not, a lot of really smart people have already discovered a lot about the subject — you’d be well advised to learn from them.
One of the most basic and useful goals is to understand, from studying data, the nature of what we might call the signal (which is, basically, the deterministic part of our data) and what we might call noise (which is, basically, the random part). The two behave differently — dramatically so — in diverse and intricate ways, but teasing them apart is far from easy. Yet doing so, when drawing conclusions about what trends may exist in hopes of a glimpse of the future, is often essential. No, the curve you’ve fit isn’t the signal and the residuals are not the noise. You’re damn lucky if they’re even a decent approximation.
It’s just too easy (and common) for noise rather than signal to rule your model-building exercises, and in consequence for you to convince yourself that you’ve hit the bulls-eye when you’re not even in the target. Fitting models is too damn easy, especially with multiple parameters to adjust, while finding reality is hard — so much so that John von Neumann responded to a colleague’s claim about the significance of a result with a now famous quip, that “with four parameters I can fit an elephant and with five I can make him wiggle his trunk.”
Finally, don’t fall in love with statistical models when studying physics. Nature usually refuses slavishly to follow simple models which are statistically tractable. As much as I love statistics, and time series analysis, when it comes to the physical world, physics rules.
#137 tamino says,
“when it comes to the physical world, physics rules”. Unfortunately, we do not know all the subtle details of the rules. Speaking of “time series data”, what in heaven’s name do most real-time control systems use? The next time you are in an aircraft, your future might depend on one of the systems I helped design.
But back to the point: if you are going to predict the future, you had better have a good picture of where you came from and where you are (Norbert Wiener). The example presented is a way to illustrate estimation of real-world status, and also a method to determine its figure of merit in relation to global temperature.
So just present your work (a single picture would be preferable) and let the readers decide.
> in a aircraft, your future might depend on
> one of the systems I helped design.
We know, with the aircraft systems in use, that the designers have failed to cover the range of problems that happen. Oddly, that’s the same sort of problem you seem to be having understanding the problems being warned of by climate science.
Who’d’a thunkit, eh?
At work, you can blame your managers who told you the design limits to consider.
Here, you should be listening to people who are warning you that the limits are changing.
“It is widely accepted in the industry that the most recurrent feature of most large-airplane commercial air accidents worldwide in the last few years has been loss of control…. Since loss of control is now prominent amongst probable causal factors of accidents, it seems to me obviously worthwhile to perform this research…. the NTSB’s concern in the Denver report is with situations that could be veridically modeled in flight simulators but currently are not. That could be, and probably should be, fixed.”
JBob says we don’t know all the rules. Quite so. However, that does not mean we should start by ignoring the rules we do know–like the fact that CO2 is a greenhouse gas that will cause the temperature to rise (most likely) somewhere between 2 and 4.5 degrees per doubling. That simple fact alone is sufficient to account for the upward trend in temperature–no recourse to putative “oscillations” needed.
What is more, CO2 explains not just the global temperature increase, but how the increase is distributed spatially and over time (daily, annual), as well as stratospheric cooling. By ignoring the physics, you open yourself up to seeing all sorts of patterns that are not there.
P. Lewis points here
to a Wolfram demo on fitting an elephant.
#139 Hank, #140 Ray
my point was that I’m well aware of time series analysis, having been involved with aircraft and space vehicles, starting with the X-15. The point is that Tamino has a problem with the graphs I presented. So all he has to do is show a comparison on the raw and smoothed HadCRUT3gl data set, and we’ll see who might have the better potential for extrapolating forward.
Bringing up the Denver airport, and Ray’s points about CO2, is not the issue.
[Response: Ok enough. It is clear that the difference between curve fitting and future predictions are completely lost on you, and everyone has given up trying to educate you on this. (Clue: if the degree to which a historical curve is fitted by arbitrary functions was the key to predictability we’d just have gone straight to high order splines and we’d be done. Since we haven’t done that, perhaps you might want to think about why). In any case, no more on it here. – gavin]
Does anybody know a good reference on the composite-plus-scale method for combining proxies? I have only seen descriptions with lots of words (e.g. Ljungqvist, 2010, which, however, is a good summary of the method).
I would like to see a description with lots of equations, especially on how to determine the standard deviation of the reconstruction.
Ljungqvist 2010 does not describe that part in detail, and I have not been able to reproduce his error bars for the reconstruction (though I was able to replicate the reconstruction itself, as well as Fig. 2).
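For concreteness, here is a minimal sketch of composite-plus-scale as it is usually summarized (standardize each proxy over the calibration period, average into a composite, rescale to the instrumental target). All data and variable names are synthetic inventions of mine, not from any published reconstruction; the last line shows one common, crude route to an error estimate from the calibration residuals:

```python
import numpy as np

rng = np.random.default_rng(2)
n_years, n_proxies = 300, 5
climate = np.cumsum(0.05 * rng.standard_normal(n_years))   # shared "signal"

# Each proxy = scaled signal + proxy-specific noise.
scales = rng.uniform(0.5, 2.0, size=(n_proxies, 1))
proxies = scales * climate + 0.2 * rng.standard_normal((n_proxies, n_years))

calib = slice(200, 300)                     # "instrumental" overlap period
target = climate[calib] + 0.1 * rng.standard_normal(100)

# 1) z-score each proxy over the calibration period.
mu = proxies[:, calib].mean(axis=1, keepdims=True)
sd = proxies[:, calib].std(axis=1, keepdims=True)
z = (proxies - mu) / sd

# 2) Composite: unweighted mean of the standardized proxies.
composite = z.mean(axis=0)

# 3) Scale: match the composite's calibration-period mean/std to the target's.
recon = ((composite - composite[calib].mean()) / composite[calib].std()
         * target.std() + target.mean())

# Crude error estimate: std of the calibration-period residuals.
calib_err = (recon[calib] - target).std()
print(round(float(calib_err), 3))
```

Published error bars typically go beyond this (accounting for proxy dropout back in time, for example), which may be why the paper's bars are hard to reproduce from the text alone.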
Journalism question here. Suppose you’re interviewing someone who’s a climate doubter, & after he says climate science became politicized (which would mean there’s pressure to ‘conform’), you ask him why he thinks this would have happened, & he says, more or less, that the climatologists have fallen sway(?) to the lure of power*.
What’s your follow-up question?
With the benefit of several hours of hindsight, I think maybe it’s “exactly what kind of power do they now have, and do they seem to be enjoying it?”; but maybe there’d be a better one?
* caveat: I think I have the gist right, but have not yet transcribed.
…or maybe (continuing my previous comment) it’s (re “politicization”) “ok then, what evidence has been wrongly kept from publication in reputable journals?” (& if precluded there, did it go to E&E instead, & if so, why doesn’t it look convincing there?)
(Correction to #144, not “fallen sway to the lure of power”, rather “seduced by being close to power”.)
Anna #144, 145, If this person is worth interviewing, if they are in any way notable, I think the story is: “Noteworthy person demonstrates own stupidity, makes ridiculous allegations, has egg on face.” Anything else is false balance.
perhaps it would be a good idea to read some of Wiener’s work before you get too critical about Fourier methods, signal conditioning, and predicting future results. The methods I was using would be more correctly called signal conditioning.
Please can anyone help point me in the right direction? Are we expected to see an increase in earthquakes (magnitude or frequency) due to sea-level rise? All that extra water sloshing about etc. Wild speculation on my part, searching Google scholar hasn’t turned up anything…
Excellent post here on adaptation and agriculture, from a longterm excellent weblog: