It says that an irreversible decline in Arctic sea ice is unlikely this century, as the mechanisms they’ve modeled lead to a consistent recovery of sea ice in just a few years.
The abstract for the same paper at this AGU link puts a different emphasis on their findings, so now I’m wondering what the paper itself says, and what other experts think.
I also wonder how many contrarians will embrace this paper despite its conclusions being based on models that contrarians have been decrying as unreliable all these years.
[Response: The issue is that *if* we reduced emissions and concentrations of anthropogenic greenhouse gases, sea ice would come back. i.e. what is happening is not technically irreversible. However, the recovery is predicated on reducing emissions and concentrations – it is not any kind of prediction. Rather it is a statement about how non-linear the regime is and on whether there are real and large scale tipping points in this system. The whole issue is moot in the absence of emissions reductions. – gavin]
Comment by Daniel J. Andrews — 1 Mar 2011 @ 9:51 AM
Speaking of extremes, has there been a discussion of the Royal Society publication that recently came out full of studies about what the consequences of a 4 degree rise in global temperature will be?
As an IT professional, I have some idle curiosity about the requirements to run climate models. How much computing power is needed for a decent simulation? What kind of computers are used? Traditional supercomputers? PC clusters? I also wonder how long it takes for a 100 year model run. Days? Weeks? Months?
[Response: State-of-the-art models generally take a month or so to run 100 years (and this has been true for decades), but what is considered ‘state-of-the-art’ changes quite quickly as a function of computing resources. Given the speed of development, you can generally run ‘state-of-the-art’ models from a few years ago on a present day desktop (or even a laptop). However, given the amount of data these models output, you might want to be a little selective in exactly what kind of simulation you want to do (fully coupled oceans are expensive, as are interactive aerosols and chemistry, but AMIP-style AGCM simulations are relatively easy). For instance, the speed up of the GISS AR4 model over our AR5 model is around a factor of 10. – gavin]
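The throughput figure in the reply can be turned into a quick back-of-envelope number. This is just illustrative arithmetic on the ~100 simulated years per wall-clock month quoted above, not anything from the models themselves:

```python
# Back-of-envelope: how many wall-clock hours does one simulated year cost
# at the ~100 years/month throughput quoted in the reply above?
SIM_YEARS_PER_MONTH = 100
SECONDS_PER_MONTH = 30 * 24 * 3600  # a 30-day month, for round numbers

sec_per_sim_year = SECONDS_PER_MONTH / SIM_YEARS_PER_MONTH
print(f"{sec_per_sim_year / 3600:.1f} wall-clock hours per simulated year")
# A model that is 10x faster (e.g. an older vintage on newer hardware)
# would need roughly a tenth of that.
```

So a century-long run at that pace ties up the machine for about 7 hours per simulated year, which is why the older, coarser configurations are the ones that fit on a desktop.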
Looks like we just may have passed peak ice extent in the northern hemisphere on February 17. To me it looks like there’s less than a 50% chance that there will be any further peaking. And the ice is quite thin, especially considering this was a La Nina year.
It seems unlikely that we have passed peak ice extent this early in the season, although it is certainly possible. Each of the past six years peaked in March, as is generally the rule (last year actually peaked in April). If you look at the chart below, you can see several bumps and dips each year which are likely due to changes in wind or ocean currents. If it were to have peaked a few days ago, then it would be the lowest recorded maximum ice extent (slightly below both 2006 and 2007). The final verdict will not be in for another month yet.
In response to the Nature article, I am inclined to agree with them. Much of the loss in sea ice has occurred in the open waters where the Arctic and Atlantic Oceans meet. Mixing with the warm Atlantic water resulted in significant melting of the exposed ice. The remaining sea ice rests in colder Arctic waters, farther from the warm currents.
I found this article very hard hitting against the AGW contrarian crowd,
not because it is loud, rude, and direct to the point, but because it props up
clear compelling reasoning against the minority of people who attack science
for breakfast before they cut funds for it come lunch time.
I also keep finding weirdly extreme Arctic tropopause heights, despite finally seeing normal seasonal temperatures return. This is highly unusual, and it comes with high RH right at the tropopause. Raypierre may be able to explain; it has been discussed a bit as high 500 mb height anomalies, and the high Arctic tropopause heights have persisted even after January’s incredible surface temperature anomalies have subsided. Tropopause heights are very high, along with RH, even though seasonal temperatures have returned for the first time in months. I have never seen this. Perhaps only Raypierre has an idea.
High Arctic tropopause heights have persisted despite the return of seasonal Arctic temperatures in February. This is highly unusual; I have never seen it before. It makes sense for the tropopause to be high when surface temperatures are +7 to +30 C above normal (as they were at some high Arctic stations in January), but the tropopause is as high as 10 km even when surface temperatures are -30 to -35 C. Maybe Raypierre can explain.
Thanks Gavin, that was interesting. A running time measured in months is surely a big incentive for good software management! Sounds like there is an opportunity for big speedups by properly matching the software to specialized parallel hardware – but that has probably already been looked into (and it’s enough to keep a good-sized team’s hands full).
Wish these threads could be weekly or remain going the whole month. Got to get here quick or everyone has left.
Getting back to the rain in Spain falling only on the plain from a while back, I took a look at the GISS Model E precipitation output for 8 times carbon dioxide, and it looks as though precipitation declines over most mountain ranges aside from the Andes. http://data.giss.nasa.gov/cgi-bin/cdrar/effij.py
Spain, unfortunately, gets substantially less rain everywhere.
So, as a result of the ‘accidental’ arrangement of continents, the effect of warming may be to reduce the amount of new silicate rock surface exposed by physical weathering and available to chemical weathering. It is not clear that this fully counteracts faster chemical weathering reaction rates at higher temperature, but if it does, then the limestone thermostat might be broken, and the Venus Syndrome discussed in James Hansen’s new book would not be halted by that mechanism. Based on the rightmost carbon dioxide point in fig. 30 of that book, I deduce that a sixteen times carbon dioxide run has been conducted which was not reported in the 2005 Efficacy paper. I’m preparing to try to get that output to see if the precipitation in the mountains continues to decline with more warming.
Surely new rock surface will be exposed in Greenland and Antarctica as the ice sheets are eliminated, which should also affect the limestone thermostat.
Sounds like there is an opportunity for big speedups by properly matching the software to specialized parallel hardware – but that has probably already been looked into (and it’s enough to keep a good-sized team’s hands full).
Gavin’s pointed out that their models have taken about a month to simulate about 100 years for “decades”.
The point is that as computer power continues to increase, they add more to the model or increase resolution to take advantage, rather than simply run the same model in shorter time.
The goal being skewed towards “better” rather than “faster” …
Re: #5 – that should have been Feb 27, not Feb 10; my typo.
and #10: It seems unlikely that we have passed peak ice extent this early in the season, although it is certainly possible.
I’m basing my guess on three facts:
(1) On the nsidc graph, the peak usually comes in the first 2 weeks of March,
(2) In the last two days, the extent has declined such that it would be difficult to make up in the next 2 weeks.
(3) There’s a lot of abnormally warm air over the ice edges (and the pole as well), as shown here. This has always, in my observation, preceded ice loss.
I read a really interesting article in the November 2010 issue of The Atlantic about a “meta-researcher” by the name of John Ioannidis who has found that most of what is published in the field of medical and nutritional research is “misleading, exaggerated, or flat-out wrong” (http://www.theatlantic.com/magazine/archive/2010/11/lies-damned-lies-and-medical-science/8269/#). As the article describes, Ioannidis “charges that as much as 90 percent of the published medical information that doctors rely on is flawed,” and the article goes on to explain the many areas where the errors are coming from. Mostly they come from some form of researcher bias.
Although I think the science for climate research is solid, and I am a regular reader of Real Climate because I think it is the most trustworthy voice on climate change, if there was one area that I thought climate science might be vulnerable to criticism, it would be in the areas discussed in this article, only because this is a potential weakness that faces all of science. This article brought to mind a couple of questions:
1) Are there any meta-researchers in the field of climate science, similar to John Ioannidis, who look at the science being published to make sure the methods are robust and the papers don’t suffer from the problems identified in this article?
2) Is there something about climate research that makes it different from medical and nutritional research such that it is a lot less likely to face the same problems?
For example, in drug research there are billion dollar incentives for finding that a particular drug has some beneficial effect. I don’t really see the same distorting incentives in climate science (although climate science deniers like to claim otherwise).
Alfio — related to your question, there is a grid computing project at http://www.climateprediction.net that allows participants to contribute computing time on their own home computers to run climate simulations in the background or during computer down time. You might take a look at the project and see what information they have vis a vis their models and hardware requirements.
The sea ice around Ross Island (Antarctica) does break up — William (Stoat) said he hasn’t heard anyone associated with the Antarctic bases expressing surprise. Sounds like more of an inconvenience.
Here’s a quite old source showing the ice edge over 1956-1962 (it’s quite variable; compare the map there to contemporary maps to see the airstrip location, which I think is around the area indicated as having pressure ridges); found by searching Scholar for "ross island" breakup "open water"
Ice Breakout Around the Southern End of Ross Island, Antarctica
A. J. Heine – New Zealand Journal of Geology and Geophysics
… During the spring and summer this fast ice gradually breaks up and floats out to sea … The break up of ice in McMurdo Sound, photographed on 16 January 1962, looking north-east towards Ross Island. …
Cited by 15 http://books.google.com/books?hl=en&lr=&id=eiE4AAAAIAAJ&oi=fnd&pg=PA395
1. If done properly, at best (no pun intended), it will largely reconfirm what is already known.
2. If done badly, it will create major confusion and more mess for climate scientists to clean up.
3. Any minor variances in trends, graphs etc will set off a new denialist frenzy.
4. Richard Muller, the spokesperson for the project, continues to make ‘hockey stick is broken’ claims. Hardly an open and inquiring mind.
5. The ‘team’ on BEST has quite a few members who are also business associates of Richard Muller through his company Muller & Associates. These include:
– Elizabeth Muller
– Arthur Rosenfeld
– Jonathan Wurtele
6. Judith Curry is the climate expert on the group. I’ll say that again. Judith Curry is the climate expert on the group. I think the nicest thing I can say is that it perhaps would have been wiser to obtain the services of a climate scientist with experience in this area, if the project wished to promote its credentials.
7. Funding comes in part from a Koch co-fund. The perception here is bad. Judith Curry said they thought it was a good idea to get money from both Gates and Koch as some sort of balancing act. Well, this has backfired.
8. Not all funding sources are named. This is a bad look.
In case anyone is interested, the Australian (Labor minority) Government has announced that it intends to legislate some sort of carbon price to be implemented mid-2012. The conservative Liberal/National Party opposition is opposing it tooth and nail.
The plan is an (as yet unnamed) fixed price for 3-5 years, followed by a full cap-and-trade system. Not yet announced are concessions to households and industries. The previous scheme, the Carbon Pollution Reduction Scheme, was rejected by the Senate in 2010, and the issue saw the fall of both the sitting Prime Minister and the Opposition Leader.
No doubt there will be some noise on this over the next little while, so brace yourselves!
OK, I don’t know either Muller, but Art Rosenfeld is a pretty reasonable guy, and in my interactions with Wurtele, he seemed fairly reasonable as well. Neither of them has any particular expertise in climate. OTOH, it would be nearly impossible to make the warming signal go away without outright fraud.
Also now at BEST, Robert Rohde — he has been contributing to climate education for years, turning out useful charts at the same time he’s been working on his PhD.
“Robert Rohde, a young physicist who left Berkeley with a PhD last year, does most of the hard work. He has written software that trawls public databases, themselves the product of years of painstaking work, for global temperature records. These are compiled, de-duplicated and merged into one huge historical temperature record. The data, by all accounts, are a mess. There are 16 separate datasets in 14 different formats and they overlap, but not completely. …. Rohde has gathered records from 39,340 individual stations worldwide….
Peter Thorne, who left the Met Office’s Hadley Centre last year to join the Co-operative Institute for Climate and Satellites in North Carolina, is enthusiastic about the Berkeley project but raises an eyebrow at some of Muller’s claims. The Berkeley group will not be the first to put its data and tools online, he says. Teams at Nasa and Noaa have been doing this for many years. And while Muller may have more data, they add little real value, Thorne says. Most are records from stations installed from the 1950s onwards, and then only in a few regions, such as North America. “Do you really need 20 stations in one region to get a monthly temperature figure? The answer is no. Supersaturating your coverage doesn’t give you much more bang for your buck,” he says. They will, however, help researchers spot short-term regional variations in climate change, something that is likely to be valuable as climate change takes hold.
Despite his reservations, Thorne says climate science stands to benefit from Muller’s project. “We need groups like Berkeley stepping up to the plate and taking this challenge on, because it’s the only way we’re going to move forwards. I wish there were 10 other groups doing this,” he says….”
34 Ray. I’m not sure if Muller is in the NAS, I guess maybe not, but in his book “Physics for Future Presidents” he said that he reviewed the hockey stick for the NAS. He said words to the effect that it is warmer now than at any time in the last 400 years with certainty (or near certainty), that the certainty goes down further into the past, but that it is likely warmer now than in something like 1000 years. I found him annoying. I’m not sure how much misinformation the book contains in general, but there were significant gems in there, such as the claim that reactor-grade plutonium can’t be used to build bombs. If I had to choose between Burton Richter’s Beyond Smoke and Mirrors and Muller’s Physics for Future Presidents, Richter wins hands down. Richter covers energy and climate. Muller covers energy and climate plus lots of other stuff. I didn’t get the impression that Muller had actually researched his subject. To me it seemed a lot like two physicists in a bar late at night trying to show each other up by pontificating on subjects they know nothing about.
Comment by John E. Pearson — 1 Mar 2011 @ 10:20 PM
I don’t make any comment on whether or not Rosenfeld and Wurtele are straight shooters, only that they are also listed as business associates of Muller & Associates on his consulting website. My perception is that it all looks very clubby – something that Muller and Curry have been quick to denounce in other areas.
“Another option is that we could learn to live with global warming. Despite claims to the contrary, storms aren’t increasing. The rate of hurricanes hitting the U.S. coast has been constant for a century, and the number of damaging tornadoes has been going down. Will Happer, a former director of research for the Department of Energy, argues that additional CO2 may have helped the agricultural revolution. And chilly Berkeley might be nicer with a few degrees warming.”
I’d say so far, I’m agnostic on BEST. Unless it is a complete fraud or done with utter incompetence, it should show warming comparable to the other 5 indices. If it does not, I am sure that it will be exposed for what it is–though this will take time. If it does show comparable warming, the denialists will denounce it as a fraud even if they put it together. The web-trolling seems to me to be a novel but risky strategy. One wonders how much they know about the data series they are using.
The thing about nature is that she tends to give consistent answers as long as you ask the right questions.
The whole idea of advising Presidents is a rather fraught proposition. Most aren’t interested in science. I think we need a book that advises the people about what they need to know about science–maybe then they’d elect better Presidents.
With respect to the BEST team, here is what I would conjecture at this point in time, given the pre-PR PR and the purported involvement of you-know-who of WUWT infamy:
Researchers doing what researchers do, they will likely want to make some “bold” statements about their analyses. I mean, if it’s the same old same old (agreeing with the existing global SAT record from “the establishment”), it’s not going to be newsworthy in the least, and will be seen by the MSM as a complete waste of time (IMHO).
I’m also not sure what it means to include all of the data from all stations, as we pretty much know that there is a very high degree of spatial autocorrelation, and that the spatial density of SAT sites is highly biased (concentrated) toward urban areas (e.g. the dreaded UHI will rear its ugly head).
The BEST team will do the usual: land vs ocean SATs (look, see, land is warming more than oceans), rural vs urban SATs (look, see, urban is warming more than rural via UHI), the usual geopolitical regions (look, see, CONUS is different from the rest of the world, d’oh), and decadal (and longer) natural variabilities (look, see, ocean indices and ENSO and volcanoes, oh my) junk.
If the BEST team does not publish their results in the highly respected peer-reviewed climate science literature, we can pretty much conclude that this BEST effort was largely political in nature, and not aimed at any improvement in our current understanding of the global SAT record.
My thoughts (call it a hunch) are that the BEST team will try to claim higher uncertainty in the global and regional SATs than is commonly accepted by the scientific community today. You just have to make the front page of the NYT, don’t you know?
Having recently downloaded most of Canada’s HAWS (an aggregate of 23 onshore stations using an anomaly period of 60 years (1951-2010)), Greenland’s (the west onshore side, same anomaly period, after a data request to DMI; they just released their 1958-2010 report on daily temperature records, and I’ll be looking at the northeast, north, and northwest), and Norway’s Svalbard (and other high-latitude island onshore locations for Norway, same anomaly period), I will look on with keen interest at what the BEST team has to say about high Arctic warming.
What’s different here is that all the data are raw daily temperature records, no monthlies, no annuals, which lets me do some FFTs (and other filtering techniques) and spot higher harmonics in the temperature record (once the records are made stationary, of course), which dovetails rather nicely with similar work I’ve done with the UIUC sea ice area daily time series. Got to prepare for the upcoming SIO, you know.
My own analyses suggest an overall warming of ~4C since the early 1920’s and an overall warming of ~3C since the early 70’s (most pronounced for the upper Canadian Archipelago (Eureka, Resolute, Mould Bay, Isachsen, and Alert, the original 5 HAWS)). Current warming in that region is ~0.25C/yr +/- 0.15C/yr, quite (dare I say it?) “alarming” actually. The overall pattern is consistent with the global SAT record: warming from the early 20’s through the mid-40’s, flatline/cooling until the early 70’s, then a very dramatic and apparently accelerating warming period to the present.
The worries about BEST remind me of the story about U. S. Grant before his first battle against the Army of Northern Virginia. He overheard some of his officers speculating about what Lee would do and upbraided them saying he didn’t want to hear about what Lee was going to do to them but rather what they were going to do to Lee.
It is pretty much beyond question that the planet is warming. We have 5 more or less independent datasets that show comparable trends. At its worst, BEST will be wrong. Wrong isn’t so bad. Wrong is correctable – just as UAH was correctable. The arguments we need to worry about are the ones that are “not even wrong”. The people advancing them can’t be persuaded by evidence.
Climate science is complicated, but it seems to me the question is easy. If the earth retains more of the sun’s heat, that heat must either raise temperatures or be used in some other way. Is there any other way that this heat might be used other than in melting ice, that is, what we called in high school the heat of fusion?
Comment by Michael Doliner — 2 Mar 2011 @ 10:53 AM
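A hedged back-of-envelope answer to the heat-of-fusion question above, using standard textbook constants (the latent-heat and specific-heat values here are assumptions from physics references, not from the comment): the energy that melts a kilogram of ice would otherwise warm roughly eighty kilograms of liquid water by one degree, which is why melting ice can absorb heat without raising temperatures.

```python
# Standard textbook constants (assumed, not from the comment above):
L_FUSION = 3.34e5   # J/kg, latent heat of fusion of ice
C_WATER = 4186.0    # J/(kg*K), specific heat of liquid water

# The energy that melts 1 kg of ice would instead warm this much water by 1 K:
kg_water_warmed_1K = L_FUSION / C_WATER
print(f"{kg_water_warmed_1K:.0f} kg of water warmed by 1 K")  # roughly 80 kg
```

This is only one of the "other ways" the heat can go; evaporation (latent heat of vaporization) and warming the deep ocean are others.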
The older model has coarser resolution and less ‘stuff’, and so is much faster. -gavin
My assumption is that resolution (pizza box lat long degrees and number of pizza box layers in the atmosphere) and “stuff” is driven by how much time you’re willing to spend per model run – you mention about a month to model 100 years.
So where does this “month for 100 years” come from? I assume that for AR4, AR5 etc you want to make a certain number of runs (dozens?) and that there are obviously time constraints (AR5 has a schedule) and possibly funding enters into it (you can only buy so many CPU cycles) etc etc.
Care to spend a bit of time fleshing out the details, though? Is it budget driven primarily (can only spend so much on model runs), technology driven (while the next supercomputer generation will always be faster, today’s supercomputers are what you run on today), schedule driven (AR5 deadlines), a combination, or what?
[Response: The maximum length of a useful run has stayed pretty constant over the years for a much more basic reason. The fact of the matter is that climate modeling is a continually evolving process – it is never ‘done’. But it takes time to implement, test, and evaluate improvements – a timescale of months to years. So if you are happy with a model configuration now (the parameterisations, set up, experiment etc.), you will almost certainly be unhappy with it in a year’s time (because you will have improved a bunch of stuff in the meantime, found bugs, made the code more efficient, put in new diagnostics etc.). While groups are usually ok about doing long simulations with ‘almost up-to-date’ code, they become less willing to continue experiments with code that is known to be flawed – especially when the particular error/issue was fixed/improved months before in the development version and the code has been improved to run faster now on some new hardware etc. The rate at which new stuff gets developed, combined with the timescale of developer frustration, has been pretty constant for a long time. So groups only very rarely set up simulations that will take more than 6 months/1 year in total (including spin-up, historical runs, future scenarios, equilibration etc.). That puts a limit on the practical throughput that groups can stand (at around 100 years/a real month). – gavin]
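The patience-limit argument in the reply can be restated as simple arithmetic. The 600-simulated-year experiment size below is an illustrative assumption; only the ~6-month tolerance and the ~100 years/month figure come from the reply:

```python
# If a complete experiment (spin-up + historical + scenarios + equilibration)
# needs ~600 simulated years (an assumed, illustrative total), and groups
# will tolerate at most ~6 real months before the code feels stale, the
# model must sustain ~100 simulated years per real month.
total_sim_years = 600   # assumed experiment size
max_real_months = 6     # patience limit from the reply above

required_throughput = total_sim_years / max_real_months
print(f"~{required_throughput:.0f} simulated years per real month")
```

The point of the sketch is that the throughput ceiling is set by developer patience, not hardware: faster machines just get loaded with more resolution and more "stuff" until the same wall-clock limit is reached.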
51, Gavin in comment: [Response: The maximum length of a useful run has stayed pretty constant over the years for a much more basic reason. … That puts a limit on the practical throughput that groups can stand (at around 100 years/a real month).
I want to thank Gavin and other contributors for this interesting discussion. I look forward to more.
Re: Gavin’s inline remarks at #51. Thank you. There are obvious parallels between what you are describing and any formal design process – the capacity for improvement combined with the immediate need for product means that you are never satisfied with today’s model.
Where does this fit in? Do you expect it to have much of an impact?
Comment by One Anonymous Bloke — 2 Mar 2011 @ 3:26 PM
# 49 Janin
Here are the interactive maps (Australian government) for Australian cities with predictions of SLR for three scenarios.
I’ve tried in the past to use both a sigma scheme for pressure and a relative humidity scheme for water vapor. Neither worked particularly well. Recently I put both together, and suddenly I’m getting much more realistic results. Mark that, folks–when writing a radiative-convective model of Earth’s atmosphere, use both a sigma scheme for pressure AND a relative humidity scheme for water vapor. Either by itself won’t help much.
Results of the latest run:
Ts 291.4 K (1.1% error).
Mean stratosphere temperature 227.4 K. Remember how I used to get figures in the 300s and even 400s? I’d like to drop it another 10 K or so, though, if I can do so without cheating.
Albedo 0.321. Too high; I’d be happier with something in the 0.30-0.31 range.
Surface illumination 182.8 W m-2. Trenberth et al. 2009 get 161.2.
Water vapor content 8.95 x 10^15 kg. Definitely too low; should be 1.27 x 10^16 kg.
Tropopause 9.2 km (too low).
Stratopause 46.8 km (darned close).
Top of Atmosphere 272.8 km. I’ve now got a mesosphere as well as a stratosphere, thanks to the sigma scheme.
And, best of all,
Conservation of energy error: 0.1%.
I achieve the latter by accounting for reflection down as well as up.
My next step is to stop parameterizing reflection and try the full equation of radiative transfer with particle effects included for clouds–based on effective radius, and deriving asymmetry factor, single-scattering albedo, etc. Wish me luck…
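For readers curious what the sigma-plus-fixed-RH combination might look like, here is a minimal sketch. This is not the commenter's code; it assumes the classic Manabe and Wetherald (1967) linear relative-humidity profile and Tetens' saturation formula, which are standard choices for this kind of radiative-convective model:

```python
import math

P_SURF = 101325.0   # Pa, assumed surface pressure
RH_SURF = 0.77      # Manabe & Wetherald (1967) surface relative humidity

def sigma_pressure(sigma, p_surf=P_SURF):
    """Sigma coordinate: pressure on a level is a fixed fraction of surface pressure."""
    return sigma * p_surf

def manabe_wetherald_rh(sigma):
    """Classic fixed-RH scheme: RH falls off linearly with sigma, zero aloft."""
    return max(RH_SURF * (sigma - 0.02) / (1.0 - 0.02), 0.0)

def q_sat(T, p):
    """Saturation specific humidity via Tetens' formula (approximate)."""
    e_s = 610.78 * math.exp(17.27 * (T - 273.15) / (T - 35.85))  # Pa
    return 0.622 * e_s / p

def specific_humidity(sigma, T):
    """Water vapor on a sigma level from the fixed-RH scheme."""
    p = sigma_pressure(sigma)
    return manabe_wetherald_rh(sigma) * q_sat(T, p)

# Example: near-surface level (sigma = 0.99) at T = 288 K
print(f"{specific_humidity(0.99, 288.0):.4f} kg/kg")  # about 0.008 kg/kg
```

The combination works because the sigma coordinate keeps the vertical grid tied to pressure (so the lapse-rate and convective adjustment behave sensibly) while the fixed-RH scheme lets water vapor respond to temperature, which is what gives the water vapor feedback.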
Gavin, I’m presuming that a month per hundred years is probably running on a few hundred processors. The armchair scientist is likely to have only a handful of cores available, and probably not nearly enough memory for these bigger models to run in, even if he had unlimited time.
[Response: Yes, you are correct. But the armchair scientist can usefully run the model version (or resolution) from a few years back, and might prefer to stick to relatively uncomplicated atmosphere-only simulations without interactive aerosols or chemistry, and think about experiments that don’t need a hundred years of simulation. There are plenty of those. For the ‘fully-loaded’ simulations being done for AR5, you need the massively parallel supercomputers though. – gavin]
Oops, sloppy thinking on my part, sinning above my station. Will send a note offsite. My deep apologies.
You all believe science works, but for the guy on the street, if the data are claimed to be resolved in a way that misrepresents reality their “vote” is to not comprehend. What you all are saying helps me understand better and perhaps worry less, but I’m already convinced after years of amateur observation and study at my low level. The denialosphere is gloatingly promoting BEST as the final solution (pun intended). Seems to me a recommendation from Singer is pretty much a guarantee of a certain kind of negativity (better not say too much about that either, as I’m a lightweight and regardless of what I think of his influence, he’s not).
On checking I find the connection with Happer is fairly remote and may be coincidental – a footnote involved with the APS petition.
What is one supposed to do with a greek symbol on captcha?
ianash 58: Dunno if he was trying to be funny or not. In Mullins’ book he says that in the end it doesn’t matter whether Mann’s hockey stick was wrong or not, because the IPCC consensus doesn’t depend on Mann’s results. My take-away message from Mullins’ book was that (1) he hates Al Gore, (2) he’ll do anything he can to discredit Gore, but (3) he is a scientist, and reality places limits on exactly how far he’ll go. When he was hawking his book on the Glenn Beck show he didn’t back down from his position: global warming is real. In his book he says: by the end of the 21st century it will (if caused by humans) grow enough to be disruptive. He also says words to the effect of don’t trust consensus science, which is a damned strange thing to tell a future president. Who should a future president listen to? Mullins? I don’t think so. I personally recommend ignoring Mullins. I highly recommend Burton Richter’s book Beyond Smoke and Mirrors which, unlike Mullins, discusses climate change and energy without arrogance and without disinformation. I’d also recommend Making Technology Work by Deutsch and Lester waaaay ahead of Mullins.
[Response:sed s/Mullins/Muller/g? – gavin]
Comment by John E. Pearson — 2 Mar 2011 @ 10:26 PM
RE BEST, I feel sorry for my alma mater; it’s probably extremely strapped for cash. Oil REALLY pollutes.
I feel like listening to the Garbage song, which I first heard over at Berkeley in 1969 — http://www.youtube.com/watch?v=lmD9IJh1Cr0 (sounded better with Steele’s raw and guttural “garbage, garbage, garbage, garbage”)
[Response: As a UC Berkeley grad myself, I will defend the school. Despite the perhaps misleading name of this project, it in no way, to my knowledge, bears the official imprimatur of UC Berkeley. Indeed, I would imagine that there are quite a few faculty and researchers at UC Berkeley (and LBL, which is more closely tied to the project) who are not too happy about the way these institutions are being tied to the project, particularly given the revelations regarding Muller’s for-profit family venture. -mike]
Comment by Lynn Vincentnathan — 2 Mar 2011 @ 11:04 PM
and his precis of its findings echoes the NAS’ own press summary
“June 22 — There is sufficient evidence from tree rings, retreating glaciers, and other “proxies” to say with confidence that the last few decades of the 20th century were warmer than any comparable period in the last 400 years, according to a new National Research Council report. There is less confidence in reconstructions of surface temperatures from 1600 back to A.D. 900, and very little confidence in findings on average temperatures before then” http://www.nationalacademies.org/morenews/20060622.html
Some of the BESTies may be suspect, but I agree with Ray Ladbury that people should wait to see what they actually produce before deciding to condemn them.
I was wondering what would happen if one took the various climate models and did a hindcast of 1975 to 2010 and then processed the output temperature data by Tamino’s approach. I assume this can be done. How well would results of climate models agree with each other and the measured temperature data? Would the results tell you anything useful about the accuracy of the climate models?
The thing is that we have 5 global temperature series that all show warming–and warming trends that are quite consistent, especially if you account for volcanism, ENSO, etc. as Tamino has shown. I think one would have to work VERY hard to construct a series that didn’t show warming. And of course we have experience with that. UAH has been modified several times to correct biases and now, while lower than the other series and plagued by systematics, is consistent within errors.
I would be ecstatic if the denialists agreed to be bound by evidence–they would then become skeptics–as evidence can at worst be wrong. Wrong is correctable. Bullshit is irredeemable.
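A minimal sketch of the Tamino-style exercise described above, on synthetic data (all series and coefficients here are made up for illustration, and the ENSO/volcanic stand-ins are crude): regress temperature on time plus the exogenous indices, and read the trend off the time coefficient.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 360                     # 30 years of monthly data (synthetic)
t = np.arange(n) / 12.0     # time in years

# Crude stand-ins for ENSO and volcanic indices (illustrative only):
enso = np.sin(2 * np.pi * t / 3.7)
volc = np.where((t > 10) & (t < 12), -1.0, 0.0)
trend_true = 0.017          # deg C / yr, assumed underlying trend

temp = trend_true * t + 0.1 * enso + 0.3 * volc + rng.normal(0, 0.1, n)

# Regress temperature on [constant, time, ENSO, volcanic]; the fitted
# time coefficient is the trend with those influences accounted for.
X = np.column_stack([np.ones(n), t, enso, volc])
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
print(f"recovered trend: {coef[1]:.3f} deg C/yr")
```

Running the same regression on each model's hindcast output and on the observed series, as the question above proposes, would make the trend comparison much cleaner, since the exogenous variability is taken out of both.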
a blogger by the name of Tamino kindly offered some great blog posts of his about volcanoes and GHGs causing most of the early 20th century warming, with solar forcing having a smaller impact. Could anybody point me to some scientific papers elaborating more on the causes of the warming 1910-40? Thnx in advance! – LC
…250 million years ago, about 95 percent of life was wiped out in the sea and 70 percent on land. Researchers at the University of Calgary believe they have discovered evidence that massive volcanic eruptions burned significant volumes of coal, producing ash clouds that had a broad impact on global oceans.
“This could literally be the smoking gun that explains the latest Permian extinction,” says Dr. Steve Grasby, adjunct professor in the U of C’s geoscience department and research scientist at Natural Resources Canada. …
Go to “Climate at a Glance” at the NCDC’s website. Plot the annual mean temp for Texas. The trend for the 1895 to 2010 interval is 0.00 deg F.
The concentration of CO2 in dry air has increased by ca 38% since ca 1900.
If this trend continues, doubling the concentration of CO2 will have no effect on Texas annual mean temperature.
I really do like Texas cherry pie!
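Endpoint-sensitive trend claims like the Texas one are easy to probe yourself. A minimal sketch in Python with synthetic data (the actual NCDC series isn’t reproduced here; the +0.1 deg F/decade trend and 1.5 deg F noise level are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1895, 2011)
# Synthetic annual means: +0.1 deg F/decade trend plus weather noise
temps = 0.01 * (years - years[0]) + rng.normal(0.0, 1.5, years.size)

def trend_per_decade(y0, y1):
    """Least-squares slope (deg F per decade) over the window [y0, y1]."""
    mask = (years >= y0) & (years <= y1)
    return 10.0 * np.polyfit(years[mask], temps[mask], 1)[0]

# The fitted trend swings with the start year even though the underlying
# trend is fixed, which is why endpoint choice matters for such claims.
for start in (1895, 1925, 1955, 1975):
    print(start, round(trend_per_decade(start, 2010), 3))
```

Over the full 1895–2010 window the fit recovers the built-in trend to within its standard error; short or cherry-picked windows can show almost anything.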
RE: Weather Noise for Feb in Utah is 2 deg F.
I downloaded the Utah Feb monthly mean temperature for the interval 1900 to 2010 from the NCDC site just mentioned.
For each decade I computed Tmean +/- AD, where AD is the classical average deviation from the mean. I then computed the average Tmean +/- AD and got 31 +/- 3 deg F.
I propose that AD = weather noise (WN) + resolution of the field thermometer (RFT). Since RFT = 1 deg F, WN = 2 deg F, or ca 1 deg C.
The drawback of this method is that WN is most likely site and season specific, and a small sample interval must be used. Nevertheless, the method provides a way to determine a component of natural variability.
Since I got zero comments, I have concluded the guys over there have no curiosity and are brain dead.
I added this comment: This calculation is best done for Tmax and Tmin separately. It is possible AD might be temp dependent.
Comment by Harold Pierce Jr — 3 Mar 2011 @ 4:36 PM
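The decade-by-decade Tmean ± AD computation described above can be sketched in a few lines of Python — with synthetic February means standing in for the NCDC download (the 31 deg F mean and ~3 deg F spread are assumptions chosen to mimic the quoted result):

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2010)
# Synthetic Feb monthly-mean temperatures (deg F), mean ~31, spread ~3
feb_temp = rng.normal(31.0, 3.0, years.size)

def decade_stats(years, temps):
    """Per-decade mean and classical average (absolute) deviation."""
    stats = []
    for start in range(years[0], years[-1], 10):
        sel = temps[(years >= start) & (years < start + 10)]
        mean = sel.mean()
        ad = np.abs(sel - mean).mean()   # average deviation from the mean
        stats.append((mean, ad))
    return np.array(stats)

stats = decade_stats(years, feb_temp)
grand_mean, grand_ad = stats.mean(axis=0)
print(f"Tmean = {grand_mean:.0f} +/- {grand_ad:.0f} deg F")
```

On real station data one would also want Tmax and Tmin handled separately, as the comment suggests.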
Gavin — Thanks for the responses regarding model runs; helpful.
As long as we are into strange, one of the limits on how long a run can go is the blue screen of death; e.g., your run should be a lot shorter than the MTBF for the computer system it is running on. This might be a strong constraint on the amateur systems also.
I used to be a supercomputer guy (interactions between algorithms and comp architecture). Still keep up a bit with the field. I’d love to share some of my thoughts on how to take advantage of the evolution in HW that I think is coming in the next several years. If you want to contact me offline, or have a post on how to make future climate codes take maximal advantage of the coming changes, I’d love to contribute to it.
“… water is forced up valley sides to locations of lower pressure, or into ponds in places away from retained heat in rocks, then it will rapidly turn to ice – and can stick to the bottom of the sheet above.
The survey data reveals that this add-on ice makes up 24% of the ice sheet base around Dome A ….”
Also mud –glacially ground up rock, and whatever old sediment remains from times the area was open water — would be pushed up and out around the edges of the ice cap, the same way, I’d think.
EFS_Junior (#46) says: “My own analyses suggest an overall warming of ~4C since the early 1920′s and an overall warming of ~3C since the early 70′s …”
The ~1K rise from circa 1922 to circa 1972 is, at 2K/century, “normal” as a recovery from the LIA. From 1972 to 2011, another 3K rise is 7.7K/century, far beyond the modelled Arctic response to the CO2 rise since 1975. Something else is going on, Junior: you must be picking up a regional change in, say, cloud cover or ocean currents. Could the data be that bad?
It is interesting to see what happens when others look at the data. I guess that is why many climate change professionals dislike the activity of bloggers: weird things can show up, but what they mean is not necessarily what they look like. And maybe that accounts for strange adjustments (or what look like “strange” adjustments).
The BEST project will satisfy the skeptics only if the data adjustments are looked at very, very hard. The AGW meme exists only if the global temperature rise since the 1970s has the extra 0.4K of warming the skeptics claim is due to uncorrected UHIE and biased urban/rural data changes. The membership of BEST will not be a problem for the skeptics; the dollar influence is not an issue here. Judith Curry still has a foot in both camps; she is just not as convinced on the certainties in AGW as she was.
On another note, there was a skeptical position about CERES and satellite records, from 2002 and 2008 papers about cloud-based albedo changes from 1988 to 2002/2008 (Palle and Goode, I believe), which said a 20% decrease in albedo gave more W/m2 than needed by CO2 forcing. But I see nothing anymore at the CERES site. Did this skeptical line of “evidence” fall off the table with more work? It was initiated by the EarthShine project, and then integrated into satellite work.
A couple of days ago there was a “teaser” quiz on a TV news program. The question asked whether Antarctica is growing, stable, or shrinking. The right answer was said to be “growing.” When that appeared on the screen, one person present asked me how GW could be real if that answer was correct. I have looked on several sites, including this one, but have not been able either to confirm the TV answer or to find an explanation of whether, if true, it is a valid piece of evidence against GW. Can anyone help?
[Response: It’s not correct. Antarctica is losing mass in the net. The GRACE data show this clearly (though there is some uncertainty in the absolute magnitude). Conceivably your TV person was thinking about accumulation in the interior which does appear to be increasing (as a function of the increase in water vapour and thus water vapour convergence in the southern polar region). But more ice is being lost at the edges (calving/surface melt/basal melt) than is being gained from increased snow accumulation. – gavin]
There seem to be a few self-confessed geeks (myself included) that have a passing interest in the actual computing – some of the trade-offs of running models on some pretty serious hardware.
Would RC consider a post elaborating on this? What sort of modelling is now possible on new hardware not available 5 years ago? What would you like to do, but current hardware is inadequate? Do you see a future in massively parallel devices (for example, graphics cards running CUDA BLAS), or are they unsuitable for your problem? Is http://climateprediction.net/ really useful to you (you, in this instance, the scientific community), or is a waste of C02?
Some passing thoughts. It says that as CO2 has risen in the atmosphere, the number and size of stomata on the undersides of leaves have dropped. They mention a figure of 34%. The focus is on a lessening of water vapor released. Kind of seems like a Gaian thing. Since water vapor is a greenhouse gas, perhaps the earth is trying to find a way to turn down the coming heat?
On the other hand, perhaps the stomata are becoming fewer because plants don’t need as many stomata to gather the amount of CO2 they require, since more of it is now available?
Another thought comes to mind: if the above is wrong, and fewer stomata mean that the plants are actually taking in less CO2 for some reason, then that would mean more CO2 left in the atmosphere. Where in a normal world this drop in stomata on a per-plant basis might be mitigated by a surge in total green growth globally, so that on average the same amount of CO2 is absorbed, in our modern world, with all of the deforestation and development going on, perhaps all that carbon will be left to float around the sky making things hot.
When the CRU emails were hacked, he was the only Russian scientist I could find who spoke up and said that this was a “provocation” against the Copenhagen meeting. I have it here (translation at bottom).
Russian Greenpeace reported what he said, but I couldn’t find any big Russian media that reported what this pretty famous Russian expert said. This shows you that there was censorship. A lot of the TV and big media are owned by Gazprom and controlled by the Kremlin.
I could be wrong, of course, because I don’t have an entire science institute housed in a parcel post mailbox like the wizard Bob (Robbert Ferguson) of SPPI.
I am not some big expert, but it doesn’t take a big expert to know that there is something fishy about people whose “institutes” are located in PO boxes in Virginia malls.
I personally drove to that strip mall and saw that parcel post store, and SPPI is nothing but a mailbox. My GPS (a satellite) figured it out. Finally, I understood why it said the parcel post store was the SPPI.
I felt like Dorothy in the Wizard of Oz pulling back the curtain. SPPI has a lot of articles by Lord Monckton, who doesn’t have a clue. He even appears on the Kremlin-financed Russia Today Satellite channel to debunk climate change.
Being on Russia Today is the closest SPPI people are going to get to a satellite.
“The trails — formed when moisture condenses around aircraft engine exhaust — create cirrus clouds that block solar energy from above and trap heat below. They may be contributing to warming of the Earth’s surface temperature, NASA studies show.”
The article also mentions having aircraft fly lower as a possible solution, but acknowledges that this would increase greenhouse gas emissions by reducing fuel economy.
Geophysical Research Abstracts
Vol. 13, EGU2011-4505-1, 2011
EGU General Assembly 2011
Study of CO2, methane and water vapor sensitivity. Concludes climate sensitivity is 0.41 degrees C per doubling, far less than the average of many other studies. Highly likely they made some questionable assumptions, but I am not a planetary climate modeler. Ray Pierre?
Mark, I’ve only looked into this briefly, and I don’t see anything more available online than a brief description of what he did (an extended abstract). It looks like the guy uses a very simple (and in my opinion useless, though it’s still how you learn about the greenhouse effect in the undergrad curriculum) two-layer climate model. He won’t be able to capture any feedbacks correctly, if at all. I don’t see this as a useful contribution at all.
Overall conclusion seems to be that the result is in the ballpark (ie similar order of magnitude as more refined calculations), but as Chris Colose said, an interesting approach but ultimately a pointless exercise when there are much better methods for working this out and getting a more likely result.
(One commenter, Eric, drew a parallel: “The classic joke is about physicists oversimplifying this… ‘consider a spherical cow’.”)
Heat (energy) from the Sun may be stored in the photosynthesis process or in other chemical processes.
So, say we calculate all the stored energy released by burning of fossil fuels.
How does the energy released/day compare with the energy received each day from the Sun?
Briefly, it goes something like this (caveat — my Russian’s rusty — corrections welcome):
They first talk about the “Rusalka” (Mermaid) experiment/instrument on the International Space Station, which maps CO2 and methane in the atmosphere to get a better idea of the sources. It is said to be unique in the world.
Venus graphics: The narrator says that, if the buildup of greenhouse gases continues, Earth is predicted to turn into a second Venus. (Ouch.)
Cut to Kirpotin, who talks about permafrost melt in Northwest Siberia. He describes a local albedo feedback: the ground used to be covered by snowy white lichens, which give way to browner, darker, more light-absorbing patches as the ground thaws. Above some critical point, he says, the processes of darkening and thawing become self-reinforcing. While peat bogs in the south play a positive role as carbon sinks, the northwest Siberian bogs have become methane emitters, increasing atmospheric concentrations.
The narrator says Siberia has warmed 3 °C in just 40 years. Economic effects are mentioned: the narrator mentions stormier weather, and Kirpotin explains how the winter roads on which the gas industry depends are usable for shorter and shorter seasons. It ends with the narrator wondering aloud whether the main culprit is man or nature (ouch), and saying that scientists disagree, but stressing that climate change is real and GHGs are increasing.
Something like that, anyway. Pretty good reporting, I think, except for the Venus scenario and the bit of faux balance at the end.
If you want background, Google Scholar for Kirpotin + permafrost, thermokarst, methane, or similar should do it.
CCPO–I have posted the translation below–more or less. I don’t know every word, but I get the idea.
Here is a post that explains the video and has a Russian explanation at the bottom of my post. This satellite is called Rusalka–Mermaid. It’s a Russian fairy-tale character and an acronym–see my post for details and documentation.
This post also has Kirpotin in a different video with English subtitles discussing the permafrost. If I ever get to Tomsk, I will be sure to look him up!
Here is a translation of the gist of the video about Rusalka–Google translation from Vesti, which had the same video (more or less).
BEGIN QUOTE (or see this on my post with Google translation into English):
An experiment with the fairy-tale name “Mermaid” has begun on the International Space Station. Its purpose is to measure greenhouse gases in the Earth’s atmosphere: methane, carbon dioxide and water vapor have a real impact on global warming.
“Mermaid” is an acronym; the device’s full name is “Hand-held spectrum analyzer of atmospheric components.” An identical unit is now in orbit, and with its help the cosmonauts will photograph the Earth’s surface. The Institute for Space Research says “Mermaid” has no analogues anywhere in the world.
“The problem is not to measure the total CO2 content — that is already being measured. The task is to identify the sources and origins of these gases,” explains Alexander Trokhimovskii, technical manager of “Mermaid.”
A similar device now orbits Venus. There, greenhouse gases are present in excess — a real cosmic “bath”: a thick veil of clouds and temperatures of hundreds of degrees. According to forecasts, if the accumulation of greenhouse gases in Earth’s atmosphere does not stop, our planet could become a second Venus.
“Until recently, the frozen surface of the bog was covered in very light, almost white lichen. We assumed that as the climate warms, the dark brown surfaces, and the lakes, which also absorb the sun’s rays, would keep expanding. There is a critical threshold of dark brown surface beyond which the thawing of the permafrost spurs itself on: the darker it becomes, the faster the melting,” says Sergei Kirpotin, Sc.D., vice-rector of Tomsk State University, describing the mechanisms behind the greenhouse feedback. “You can see that on this surface, originally white and occupied by lichens that reflect the sun’s rays like fresh snow, young lakes are beginning to emerge rapidly.”
Biologists at Tomsk State University have long been alarmed. Global warming is not a fictional story; the examples are underfoot, in the bogs. In Western Siberia they cover almost half the territory, including the largest bog in the world — the Vasyugan.
Here it is, the boggy heart of Siberia. Without special protection from the mosquitoes, and without bog shoes, there is nothing to be done here. Before we set out we were warned: walk only on the grass in the center of the trail, and don’t step off it — the bog will pull you under.
Bogs play a major role in regulating the climate. Scientists call them “refrigerators” or “coolers”: plants absorb carbon dioxide from the atmosphere and turn into peat, and the greenhouse gas is “preserved.” In the south of Western Siberia the bogs still “live” by this scenario, but in the north, in the permafrost zone, the situation has recently changed for the worse.
“See, I jump, and gas begins to separate out — methane, a very strong greenhouse gas,” Sergei Kirpotin demonstrates in the middle of the bog. “As a result of the thawing of frozen peatlands that has begun in this area, large amounts of methane are being released to the atmosphere. Unfortunately, this gas is roughly 30 times more powerful a greenhouse gas than carbon dioxide. So today two processes sit on the scales: the positive role of the wetlands in the south, as air coolers, and their negative role in the north of the West Siberian Plain.”
Local ecologists mount expeditions to the bogs several times a year. According to their observations, the melting is so rapid that it cannot be stopped. Siberia is warming faster than almost anywhere else on the planet: in forty years the temperature has risen by three degrees. Violent storms, which never used to occur here, now appear in the weather reports, and many link these, too, with global warming. Those three degrees also damage the economy.
“Today, in some cases, the winter roads only begin to operate from late December, or even after the New Year. Can you imagine how this will progress over time? This is a huge loss for the oil and gas industry, a significant part of whose activity depends precisely on such winter roads,” says the rector of Tomsk State University, tallying the region’s economic losses from global warming.
Who is the main culprit of climate change on Earth and the producer of greenhouse gases: man, with his factories and machinery, or the planet itself? It is known that climate change is a natural process for the Earth; history has seen both ice ages and tropical heat. Should we worry? Scientists’ opinions diverge. But one thing is certain: the content of greenhouse gases in the atmosphere is increasing, and the planet’s climate is changing.
I think they have to say that wishy-washy bit at the end because global warming is politically sensitive because of the powerful gas industry. I have read a lot of Kirpotin, and he isn’t wishy-washy about what is happening.
> So, say we calculate all the stored energy released by burning of fossil fuels.
> How does the energy released/day compare with the energy received each day from the Sun?
Minuscule — four orders of magnitude less. The problem isn’t the heat energy from the burning of fossil fuels. It’s the added CO2, which means the world has to get warmer to radiate away as much solar energy as it receives. A doubling of CO2 gives a nearly 4 W/m2 forcing. The greenhouse heating from fossil-fuel emissions is two orders of magnitude greater than the direct heat of their combustion.
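For the curious, the back-of-envelope arithmetic behind “four orders of magnitude” can be checked directly (the ~5 × 10^20 J/yr figure for global fossil-fuel primary energy is a round-number assumption):

```python
import math

# Rough comparison of direct combustion heat vs incoming solar energy.
SECONDS_PER_YEAR = 3.15e7
EARTH_RADIUS_M = 6.37e6
SOLAR_CONSTANT = 1361.0            # W/m^2 at top of atmosphere

fossil_energy_per_year = 5.0e20    # J/yr, rough global primary energy use
fossil_power = fossil_energy_per_year / SECONDS_PER_YEAR   # ~1.6e13 W

# Intercepted solar power: solar constant times Earth's cross-section
solar_power = SOLAR_CONSTANT * math.pi * EARTH_RADIUS_M**2  # ~1.7e17 W

ratio = solar_power / fossil_power
print(f"Sun delivers ~{ratio:.0f}x the heat of fossil-fuel combustion,")
print(f"i.e. about {math.log10(ratio):.1f} orders of magnitude more")
```

The same arithmetic is why waste heat is a rounding error next to the ~4 W/m2 forcing from doubled CO2 acting over the whole planet.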
Regarding the upcoming results of the BEST Project, it will be interesting to see.
Whether or not the BEST Project confirms the results of previous GST time series analyses, the BEST Project team promises to be more open than the providers of past analyses. Their clear message implies a promise of more openness with the data, methodology, code and related documents, without the need for FOIs or being diverted by institutional and/or bureaucratic data & code keepers. We shall see soon enough. If the promise of being more open than the past analyses is kept, that by itself is a step forward for GST time series science, independent of the results.
John Whitman: How can you be more open than publishing the entire dataset and codebase? How can you make data turn around and say the opposite of what it represents?
I think these are far more relevant questions than yours.
All this self-serving nonsense about “openness” makes me sick. Hypocrites, begone!
It’s all just so tiresome. When BEST fails to produce anything of merit, the deniers will dream up some other project, all in the name of delay, delay, delay. Oh, sorry. In the name of “openness and sweetness and light and keeping the bad, bad scientist all honest and nice and open and nice and being absolutely certain and maybe we’re still just not quite certain and maybe not quite open enough and maybe we should try again because maybe we got the wrong answer the last dozen times and maybe we can drag this out just a little bit longer because open open maybe open open honest barf squiggle”.
[Response: Yes, it’s a completely skewed and false representation of the issue. You can take any science and make all kinds of noise about how important data are “hidden”, and furthermore, when that ruse is used up, you can formulate all manner of accusations about “data quality” issues, and then of course there’s all the supposed analytical issues; and round and round you go…Jim ]
Well, one important thing to bear in mind. It only takes a very short time (months) to analyse the global surface temperature record. If BEST takes longer than this to publish something preliminary, then they are delaying (much like that surface stations malarkey).
“I think these are far more relevant questions than yours.”
– – – – – –
Thanks for your response.
We will see very soon, by direct comparison, whether the openness behavior of the BEST Project compares favorably to previous openness approaches to GST time series analyses/products. We each can do our own evaluations of openness independently. If a new benchmark does happen to be established by the BEST Project team, then all other previous teams producing GST time series analyses/products will benefit. We live in interesting times because of these possibilities in advancing climate science beyond its current state.
Kent Clizbe, that “ex-CIA operative” who emailed Dr. Mann’s co-workers and encouraged them to denounce him, has written an article on his blog smearing Dr. Christopher Field. [Clizbe is off-message because the real CIA has a Center on Climate Change and National Security, which is reportedly led by Larry Kobayashi.]
Now I see that Dr. Field is testifying in Congress.
Perhaps it is no coincidence that Dr. Field was vilified in very crude language (“his snout in the trough,” and “sucking off…the government teats”) right before his testimony. This is the kind of kompromat I am used to seeing in the Russian/Soviet media when a scientist is attacked. It reminds me of how they attacked Sakharov. It reminds me of how the Soviets accused the Pentagon scientists of making AIDS.
If I ran the CIA, I would make a statement that Clizbe’s views should not be associated with the CIA.
Kent Clizbe (2-22-11) states on his blog post:
“An initial review of Field’s background and snout in the trough seems identical to any number of “climate scientists” (Field’s scientific background is Biological Sciences. His PhD research was on Leaf Aging in a California Shrub) sucking off the National Academy of Sciences, NASA, NOAA, and other government teats.”
Kent Clizbe seems to be implying that Dr. Field doesn’t have the credentials to be a climate scientist, but scientists who study climate change come from many different fields.
Good CIA officers check their facts. The former spy Kent Clizbe doesn’t even manage to ferret out the fact that the National Academy of Sciences is not a federal agency. It is a federally-chartered corporation that advises federal agencies.
A revolution is underway according to Roger Pielke Sr. New research has been published which “may change how we view the climate system”. A bold claim indeed. Where has this incredibly important, paradigm shifting research been published? Er, on Judy Curry’s blog. Riiiiiight.
Anyone who wants two years’ worth of someone else’s emails should be willing to make two years’ worth of their own emails public, IMO. The process of making emails public from only one side is inherently biased.
We will see very soon by direct comparison whether the openness behavior of the BEST Project compares favorably to previous openness approaches to GST time series analyses/products.
Spoken like someone who parrots what one has heard, rather than checked for himself. You were given links to code and data.
The BEST project people will, if they use that tiny portion of national records made available only under license by the various countries involved, have to obey the terms of that license just like CRU. Yet I’m sure the denialist community will give them a free pass if they do.
Something like 95% of the data used by CRU is available at GHCN, and GISTemp only uses the publicly available data. If you want the raw data from GHCN, you can acquire DVDs of scans of the original paper documents – handy if you think the conspiracy runs so deep that people have been altering the data when it’s transcribed from paper to data files.
The source to GISTemp is available, has been rewritten in Python by interested software engineers (and given back to NASA), and several independent people have applied their own statistical analysis to the data (coming up with essentially the same trends as GISTtemp, in the process).
Will BEST compare favorably? Well, if you mean … will BEST be *as open* as NASA GISS, then that’s an open question. I haven’t seen the BEST people promise to release their source code (as NASA does), for instance – just the data and algorithm descriptions.
BEST probably *will* compare favorably to the denialists’ favorite temperature reconstruction – Christy and Spencer’s UAH satellite reconstruction efforts. So does GISTemp.
Because Spencer and Christy haven’t open-sourced their code.
Here’s a question: why aren’t you attacking them for their lack of openness?
It looks like the effort of a scientifically literate person to find reasons to believe that climate sensitivity is low when his real reasons are ideological. It is something the intelligent voter should not be fooled by, but it could easily fool someone who knows a fair amount of science but has not looked at this problem in detail.
However, looking at more recent CET (Ave1) data, and using filters with three different time spans (10, 20 & 50 years, or 0.1, 0.05 & 0.02 cycles/yr), a different pattern is seen. Over about the past 10 years, the CET series seems to be fairly flat, or even trending down.
It has a couple of interesting features, including, again, a flattening over roughly the last 8 years. This latest apparent dip is similar to the one in the 1940’s, and, with some longer-period filters, similar to the one in the 1875–1885 period.
J Bob, # 118: Could you supply a working link for that Tamino post about central England temperatures? (BTW, I think Tamino’s blog history underwent some kind of reorganization a while back, as some of my casual attempts to find old posts have failed. Offhand, not sure what’s up with that.)
Tamino usually analyzes past data, and is sparing with predictions.
Even for the whole globe, 10 years is a very short time in the surface temperature record. Absolutely no one (with any sense) would conclude anything important from your “fairly flat” claim about CET for 10 years, over such a tiny area.
I hope the previous comment helps you find the post.
Re: #118 (J. Bob)
No, memory does not serve you right. I made no prediction. The closest I came was a statement starting with “If this trend continues, then …” You seem to have very conveniently ignored the “if” part.
As for your “analysis,” in my humble opinion it’s worthless. You’re just another of the “fun-with-Fourier” people who ignore the science called statistics. In fact I did a post about your (and others’) approach, which I refer to as mathturbation.
I don’t think there is a huge difference in 2050 compared to the article you link to, but there might be a substantial difference around 2100. If a logistic growth model is applicable, the exponential form for early sea level rise may be better than the quadratic form.
The skeptic site Climate4you publishes this graph of average OLR, and asks why OLR does not decrease in response to increasing CO2. While Huang, Yi, V. Ramaswamy, 2009, show a more complex picture, I am somewhat surprised that there isn’t a trend in OLR since 1974. What is wrong with this picture?
[Response: Link doesn’t work. But there is no measuring device over that time period that can give a reliable trend – so depending on where the data comes from it will have issues. – gavin]
Thank you for the link. I was just wondering why Hansen et al. were using an exponential fit rather than a polynomial fit for the GRACE data. As you mention, it could be a big difference in the long run.
I understand that ice sheets are not well modelled yet, and that the GRACE data series is too short for definitive arguments.
I think the paper gave some arguments for his model.
Am I right in understanding Hansen’s main argument for (a potential) exponential mass loss to be feedbacks from the albedo flip at the borders of the ice sheet, the lowering of the ice surface, and growing forcing from higher sea and air temperatures?
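To see why the choice of functional form matters so much, one can fit both forms to a short synthetic “mass loss” record and extrapolate — a sketch only, with an assumed 10-year doubling time and made-up noise, not Hansen’s actual numbers or the real GRACE data:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0.0, 15.0)                         # years of "observations"
doubling_time = 10.0                              # assumed, for illustration
true_loss = 100.0 * 2.0 ** (t / doubling_time)    # Gt/yr mass-loss rate
obs = true_loss + rng.normal(0.0, 10.0, t.size)

# Quadratic fit (polynomial in t) vs exponential fit (linear in log space)
quad = np.polyfit(t, obs, 2)
expo = np.polyfit(t, np.log(obs), 1)              # log(rate) = a*t + b

t_future = 90.0                                   # ~2100 if t=0 is ~2010
quad_pred = np.polyval(quad, t_future)
expo_pred = np.exp(np.polyval(expo, t_future))

# Nearly indistinguishable over the fit interval, wildly different by 2100
print(f"quadratic at t={t_future:.0f}:   {quad_pred:.0f} Gt/yr")
print(f"exponential at t={t_future:.0f}: {expo_pred:.0f} Gt/yr")
```

Both curves pass through the short record about equally well, which is exactly why a GRACE-length series cannot by itself settle the question.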
I have a friend who gets all his climate science from The Spectator and is always going on about Vikings in Greenland. I know this is complete BS and I know I’ve asked this before, but is there a site somewhere that explains why this is BS? I can’t recall whether it’s BS because the Vikings didn’t have much in the way of agriculture there anyway, or because the melt in Greenland now is much more extensive than in the MWP, or what it was.
Sorry, the link is here. The site claims the data is from NOAA. This NOAA page says temporal coverage is from 1974/06.
[Response: it’s almost certainly from the NCEP reanalysis. The trends are corrupted by changes in the observing network and uncorrected biases in obs which makes these trends untrustworthy. If you look at the ERA-Interim, I’m sure it would look very different. – gavin]
Tamino, since your post was not up, I didn’t catch the “if” clause — but does that mean that you predicted the upward temperature trend?
However, your comments about “‘fun-with-Fourier’ people who ignore the science called statistics” are most interesting. Enclosed is a comparison between the Empirical Mode Decomposition (EMD) and Fourier filtering, using the CRU data set from 1856 to 2004. In my humble opinion, it’s a pretty good match, considering the EMD method is supposed to work for non-linear & non-stationary systems.
Since you seem to be into statistics, why don’t we add a little “stats” to the Fourier filtering process? Using the HadCRUT3gl data set with three types of low-pass filters — the MOV, a forward-reverse “filtfilt” 2-pole Chebyshev, and a Fourier convolution filter (Fc = 0.1 cycles/year) — one gets the upper graph in the picture below.
Looking at the lower graph, the difference (cyan curve) between the raw and filtered Fourier data, the filtered curve is pretty well centered in the middle of the input data. But since you’re into statistics, the mean and std. dev. for this difference are labeled. Assuming a Gaussian distribution, that would give a 3-sigma filter error of about 0.32 deg.
In my humble estimation, it does a very good job estimating the longer-term behavior of the input data. Maybe you could do better?
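For anyone wanting to reproduce this kind of Fourier low-pass filtering, the core operation — zeroing spectral coefficients above a cutoff — is a few lines. A sketch on synthetic data (not HadCRUT3gl); note the FFT implicitly assumes the series is periodic, so the endpoints are unreliable, which is one of the statistical pitfalls in play:

```python
import numpy as np

rng = np.random.default_rng(7)
years = np.arange(1856, 2005)
# Synthetic anomaly series: slow warming + a ~60-yr wiggle + noise
signal = 0.005 * (years - 1856) + 0.1 * np.sin(2 * np.pi * years / 60.0)
data = signal + rng.normal(0.0, 0.15, years.size)

def fourier_lowpass(x, cutoff_cpy, dt=1.0):
    """Zero all Fourier coefficients above cutoff (cycles per year).

    Edge effects: the FFT treats x as periodic, so the first and last
    few points of the result should not be trusted.
    """
    coeffs = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(x.size, d=dt)
    coeffs[freqs > cutoff_cpy] = 0.0
    return np.fft.irfft(coeffs, n=x.size)

smooth = fourier_lowpass(data, cutoff_cpy=0.1)   # 10-year low-pass
resid = data - smooth
print(f"residual std dev: {resid.std():.3f}")
```

Small residuals here say nothing about predictive skill — the residual is just the high-frequency part the filter was built to discard, which is Tamino’s point.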
Hansen used albedo flip to neatly explain the timing of Milankovitch cycles, with glacial retreat tied more to spring than to summer NH warming. It is a positive feedback, but his main argument so far for the existence of a short doubling time in ice sheet mass loss is paleo-records of very rapid increases in sea level rise. And he has argued that there is plenty of forcing to melt ice sheets rapidly.
There has been recent work on heat transfer from the surface of ice sheets to the interior on a decade timescale that might set a doubling time for mass loss owing to the changed structural integrity of the ice. http://adsabs.harvard.edu/abs/2010AGUFM.C11B..02R
But, I have not seen Hansen reference this work though I think he has something similar but less worked out in mind.
“What got my attention was a 2010 paper (I usually stay away from physics) titled “Drawing an elephant with four complex parameters” where the authors decided to try to reconstruct an elephant with four complex numbers and wiggle its trunk with a fifth! This was attempted earlier in 1975 with least-squares Fourier series fitting but apparently that took 30 parameters. The figures in the paper show that they managed to do something close. http://2.bp.blogspot.com/-Xr2YfZrYDMM/TWdnBa6VzDI/AAAAAAAAD1M/cbOMZHKXQ8Q/s200/4points.png
My elephant clearly beats the math
I decided to write up a program and look at it for myself, and have been able to get a passable elephant only with 8 parameters, that is, 8 sets of 4 real numbers (a, b, c and d of Kuhl and Giardina, which can be considered as two complex numbers), so not as good as the results of the authors of the 2010 paper. Maybe my elephant is too complex for the math. In case you want to play around with the idea download (Windows unfortunately, but works in Wine) the small program that I wrote today from here – you can also find Matlab code here.” [working links in the original blog page]
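For anyone who wants to reproduce the trick without downloading anything, here is a Python sketch; the parameter values and the packing of them into Fourier coefficients follow a commonly circulated reconstruction of the 2010 paper, so treat the details as an approximation rather than the authors' exact code:

```python
import numpy as np

# Four complex parameters (plus a fifth for the wiggle/eye), as
# reported in the 2010 "Drawing an elephant" paper.
p1, p2, p3, p4 = 50 - 30j, 18 + 8j, 12 - 10j, -14 - 60j
p5 = 40 + 20j  # trunk wiggle / eye position (unused in this outline)

def fourier_series(t, coeffs):
    """Evaluate sum_k A_k cos(k t) + B_k sin(k t), with A + iB packed
    into complex coefficients."""
    f = np.zeros_like(t)
    for k, c in enumerate(coeffs):
        f += c.real * np.cos(k * t) + c.imag * np.sin(k * t)
    return f

def elephant(n=400):
    """Return x, y outline points of the four-parameter elephant."""
    t = np.linspace(0.0, 2.0 * np.pi, n)
    cx = np.zeros(6, dtype=complex)
    cy = np.zeros(6, dtype=complex)
    # Unpack the four parameters into x- and y-coefficients.
    cx[1] = p1.real * 1j
    cx[2] = p2.real * 1j
    cx[3] = p3.real
    cx[5] = p4.real
    cy[1] = p4.imag + p1.imag * 1j
    cy[2] = p2.imag * 1j
    cy[3] = p3.imag * 1j
    return fourier_series(t, cx), fourier_series(t, cy)

x, y = elephant()
```

Plot y against -x (or similar, depending on orientation) and squint: an elephant, from eight real numbers.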
To J. Bob: You just don’t get it. It’s well said that you’re “not even wrong.”
To those who hope to understand time series data:
Time series analysis is not a contest to see who can fit a smooth curve to the data which leaves the smallest residuals. Any child can take that process to its limit, simply by fitting some curve with enough parameters to match the data exactly. The residuals are all zero but obviously such a curve is meaningless, a classic case of “overfitting.” It’s not a contest to minimize the residuals within some constraint of what one considers “smoothness” (like the cutoff frequency for a low-pass filter). Beware exercises in indulgent curve-fitting, aptly named “mathturbation.”
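The overfitting point is easy to demonstrate numerically; a sketch with made-up data, where a polynomial with as many parameters as data points leaves zero residuals while saying nothing about the underlying process:

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.arange(6, dtype=float)
y = 0.1 * x + rng.normal(0.0, 1.0, 6)  # weak linear signal + loud noise

# A degree-5 polynomial through 6 points matches the data exactly...
coeffs = np.polyfit(x, y, 5)
residuals = y - np.polyval(coeffs, x)
# ...the residuals are (numerically) zero, but the curve has fit the
# noise, not the signal: classic overfitting.
```

Extrapolate that degree-5 polynomial one step past the data and it will swing wildly, which is exactly the point about residual-minimizing curve fits.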
Nor is it an excuse to indulge in confirmation bias, contorting the data to resemble one’s preconception of what the result should “look like.” Alas, this is all too common a sin (even among the otherwise highly qualified), usually committed by those who wish to proclaim that a trend has turned in some favored direction.
It’s a sure sign of ignorance (and often of insecurity) to brandish technical terminology like a weapon when, to those in the know, it’s obvious one is trying to tighten a screw with a drill bit. Especially in a forum in which many or most are not mathematicians or statisticians, showing off how many filters you can name and referring to a simple moving average as “MOV” rather than “a simple moving average,” is the sort of arrant pedantry up with which we should not put. Be not impressed, intimidated, or persuaded by such displays. Real genius has a gift for clarity, a prize which is rightly cherished both for its beauty and its rarity.
If you want to do time series analysis, take the time to learn time series analysis. Familiarity with signal processing, and reading “How to Lie with Statistics” (however entertaining and informative it may be), do not suffice. Show some respect (and humility) approaching the subject. It’s not something you can learn overnight, getting good at it takes longer still, and no, you can’t just fly by the seat of your pants based on your incredible genius. Work at it, you may be well rewarded. And believe it or not, a lot of really smart people have already discovered a lot about the subject — you’d be well advised to learn from them.
One of the most basic and useful goals is to understand, from studying data, the nature of what we might call the signal (which is, basically, the deterministic part of our data) and what we might call noise (which is, basically, the random part). The two behave differently — dramatically so — in diverse and intricate ways, but teasing them apart is far from easy. Yet doing so, when drawing conclusions about what trends may exist in hopes of a glimpse of the future, is often essential. No, the curve you’ve fit isn’t the signal and the residuals are not the noise. You’re damn lucky if they’re even a decent approximation.
It’s just too easy (and common) for noise rather than signal to rule your model-building exercises, and in consequence for you to convince yourself that you’ve hit the bulls-eye when you’re not even in the target. Fitting models is too damn easy, especially with multiple parameters to adjust, while finding reality is hard — so much so that John von Neumann responded to a colleague’s claim about the significance of a result with a now famous quip, that “with four parameters I can fit an elephant and with five I can make him wiggle his trunk.”
Finally, don’t fall in love with statistical models when studying physics. Nature usually refuses slavishly to follow simple models which are statistically tractable. As much as I love statistics, and time series analysis, when it comes to the physical world, physics rules.
#137 tamino says,
“When it comes to the physical world, physics rules.” Unfortunately, we do not know all the subtle details of the rules. Speaking of “time series data”, what in heaven’s name do most real-time control systems use? The next time you are in an aircraft, your future might depend on one of the systems I helped design.
But back to the point: if you are going to predict the future, one had better have a good picture of where one came from, and where one is (Norbert Wiener). The example presented is a way to illustrate estimation of real-world status, and also a method to determine its figure of merit in relation to global temperature.
So, just present your work, a single picture would be preferable, and let the readers decide.
> in an aircraft, your future might depend on
> one of the systems I helped design.
We know, with the aircraft systems in use, that the designers have failed to cover the range of problems that happen. Oddly, that’s the same sort of problem you seem to be having understanding the problems being warned of by climate science.
Who’d’a thunkit, eh?
At work, you can blame your managers who told you the design limits to consider.
Here, you should be listening to people who are warning you that the limits are changing.
“It is widely accepted in the industry that the most recurrent feature of most large-airplane commercial air accidents worldwide in the last few years has been loss of control…. Since loss of control is now prominent amongst probable causal factors of accidents, it seems to me obviously worthwhile to perform this research…. the NTSB’s concern in the Denver report is with situations that could be veridically modeled in flight simulators but currently are not. That could be, and probably should be, fixed.”
JBob says we don’t know all the rules. Quite so. However, that does not mean we should start by ignoring the rules we do know–like the fact that CO2 is a greenhouse gas that will cause the temperature to rise (most likely) somewhere between 2 and 4.5 degrees per doubling. That simple fact alone is sufficient to account for the upward trend in temperature–no recourse to putative “oscillations” needed.
What is more, CO2 explains not just the global temperature increase, but how the increase is distributed spatially and over time (daily, annual), as well as stratospheric cooling. By ignoring the physics, you open yourself up to seeing all sorts of patterns that are not there.
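Ray's "between 2 and 4.5 degrees per doubling" translates into numbers via the standard logarithmic dependence of forcing on concentration; a toy calculation (the sensitivity value plugged in here is illustrative, not a prediction):

```python
import math

def equilibrium_warming(c_ppm, c0_ppm=280.0, sensitivity=3.0):
    """Equilibrium warming in deg C for CO2 rising from c0_ppm to
    c_ppm, assuming logarithmic forcing and `sensitivity` degrees per
    doubling (the likely range quoted above is roughly 2 to 4.5)."""
    return sensitivity * math.log2(c_ppm / c0_ppm)

# One full doubling (280 -> 560 ppm) gives exactly the sensitivity:
dT = equilibrium_warming(560.0)
```

The logarithm is why "per doubling" is the natural unit: each doubling of concentration adds about the same forcing.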
#139 Hank, #140 Ray
My point was that I’m well aware of time series analysis, having been involved in aircraft and space vehicles, starting with the X-15. However, the point is that tamino has a problem with the graphs I presented. So all he has to do is show a comparison on the HadCRUT3gl raw and smoothed data set, and see who might have the better potential for extrapolating forward.
Bringing up the Denver airport, and Ray talking about CO2, is not the issue.
[Response: Ok, enough. It is clear that the difference between curve fitting and future predictions is completely lost on you, and everyone has given up trying to educate you on this. (Clue: if the degree to which a historical curve is fitted by arbitrary functions were the key to predictability, we’d just have gone straight to high-order splines and we’d be done. Since we haven’t done that, perhaps you might want to think about why.) In any case, no more on it here. – gavin]
Does anybody know a good reference on the Composite-plus-Scale method for combining proxies? I have only seen descriptions with lots of words (e.g. Ljungqvist, 2010, which however is a good summary of the method).
I would like to see a description with lots of equations, especially on how to determine the standard deviation for the reconstruction.
Ljungqvist 2010 does not describe that part in detail, and I have not been able to reproduce his error bars for the reconstruction (though I was able to replicate the reconstruction itself, as well as Fig. 2).
Journalism question here. Suppose you’re interviewing someone who’s a climate doubter, & after he says climate science became politicized (which would mean there’s pressure to ‘conform’), you ask him why he thinks this would have happened, & he says, more or less, that the climatologists have fallen sway(?) to the lure of power*.
What’s your follow-up question?
With the benefit of several hours of hindsight, I think maybe it’s “exactly what kind of power do they now have, and do they seem to be enjoying it?”; but maybe there’d be a better one?
* caveat: I think I have the gist right, but have not yet transcribed.
…or maybe (continuing my previous comment) it’s (re “politicization”) “ok then, what evidence has been wrongly kept from publication in reputable journals?” (& if precluded there, did it go to E&E instead, & if so, why doesn’t it look convincing there?)
Anna #144, 145, If this person is worth interviewing, if they are in any way notable, I think the story is: “Noteworthy person demonstrates own stupidity, makes ridiculous allegations, has egg on face.” Anything else is false balance.
Comment by One Anonymous Bloke — 11 Mar 2011 @ 7:14 PM
Perhaps it would be a good idea to read some of Wiener’s work before you get too critical about Fourier methods, signal conditioning and predicting future results. The methods I was using would be more correctly called signal conditioning.
Please can anyone help point me in the right direction? Are we expected to see an increase in earthquakes (magnitude or frequency) due to sea-level rise? All that extra water sloshing about etc. Wild speculation on my part, searching Google scholar hasn’t turned up anything…
Comment by One Anonymous Bloke — 11 Mar 2011 @ 9:55 PM
Excellent post here on adaptation and agriculture, from an excellent long-running weblog:
#147: that the Google search hasn’t turned up anything is no wonder, since it’s wild speculation anyway (though some scientists have done it also). There were some studies of the relation between the ice melt and earthquakes in Greenland. If there’s an effect, it would IMHO be in the frequency, as the water is distributed evenly… there, resorted to wild speculation myself.
Hmm. Well, we will be redistributing mass from the poles to the equator. This will not only result in isostatic adjustment, but will even change the planet’s moment of inertia and therefore its period of rotation (aka length of day). However, these are generally small effects, and we know from post-glacial rebound that the planet tends to respond with many small quakes as opposed to one biggie. It’s not one of the consequences of climate change that is likely to keep me awake at night.
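The length-of-day effect follows from conservation of angular momentum; a back-of-envelope sketch (the fractional inertia change used here is a made-up illustrative number, not a measured one):

```python
# Conservation of angular momentum: I * omega = constant, so a
# fractional change dI/I in Earth's moment of inertia changes the
# length of day by dLOD = LOD * dI/I.  Moving mass from the poles
# toward the equator raises I and so lengthens the day.
LOD_SECONDS = 86400.0
dI_over_I = 1e-10                 # illustrative, tiny redistribution
dLOD = LOD_SECONDS * dI_over_I    # change in day length, seconds
```

Even a fractional inertia change of one part in ten billion only shifts the day by microseconds, which is why this is a small effect.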
Ray Ladbury, Anonymous, thanks. I figured it would have some effect.
Comment by One Anonymous Bloke — 12 Mar 2011 @ 12:26 PM
I have a fundamental question that is bothering me and I am hoping the climate scientists here can provide an authoritative response.
If one looks at the global temperature record from approximately the 1940s through the 1970s, the global temperature was generally decreasing even though the concentration of CO2 was generally increasing (ignoring minor seasonal fluctuations). Some time around the 1970s this global temperature trend reversed, and it has been generally increasing ever since; again, the concentration of CO2 has also been increasing during this time (per the Keeling Curve).
Can you please explain why global temperatures were decreasing from the 1940s through the 1970s even though CO2 concentrations were increasing during that period? Also, is there a consensus scientific argument as to why the global temperature trend reversed in the 1970s or so? What are the fundamental factors that came into play to explain this reversal?
The period from the 1940s through the 1970s would seem to indicate, at least to me as a lay person, that the correlation between CO2 concentrations and global temperature is a weak one at best. What is the consensus scientific view on why this is not the case?
Returning to “seduced by being close to power” – OAB#47 I share your assessment, but I still want to know what one would/could ask or say that would make this clear to a reader who’s not familiar with how science works. Yes we have Peter Watts’s most excellent “science is rugby” explanatory post and others of that ilk, but how does Joe Reader know to accord credibility to them rather than to those saying the opposite?
(speaking of Watts, just finished Blindsight and it was mighty fine. eek.)
MrRight@159 – With some serious reading this site can clear your confusion. Start here [notice the date]. Beyond that, plug “1940s through the 1970s” into the site search and fill yer boots, it’s been dealt with many times.
The “weak correlation with CO2” argument is based on the misconception that greenhouse gases are the only factor which influences global temperature. Mainstream climate scientists (like those who run this blog) have never made such a claim.
Instead of ignoring all the other factors, see this:
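As a purely schematic illustration of "other factors" (not a climate model; every coefficient below is invented), a steadily rising greenhouse term plus a mid-century sulfate-aerosol cooling term reproduces the qualitative shape MrRight asks about, flat-to-cooling from the 1940s to the 1970s and warming after:

```python
def toy_anomaly(year):
    """Schematic temperature anomaly: steady greenhouse warming plus a
    negative aerosol forcing that ramps up between 1945 and 1975.
    All numbers are invented for illustration only."""
    ghg = 0.01 * (year - 1900)                 # monotonic GHG warming
    if year < 1945:
        aerosol = 0.0
    elif year < 1975:
        aerosol = -0.012 * (year - 1945)       # aerosols build up
    else:
        aerosol = -0.012 * 30                  # then roughly level off
    return ghg + aerosol
```

The point is only that a monotonically rising CO2 forcing is perfectly compatible with a mid-century plateau once a second, opposing forcing is in play.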
Anna Haynes #160 “how does Joe Reader know to accord credibility to them rather than to those saying the opposite?”
Stick to the facts. Let the audience make up its own mind.
I’m not familiar with the “science is rugby” analogy; please post a link.
Comment by One Anonymous Bloke — 12 Mar 2011 @ 10:45 PM
OAB#164, for Peter Watts on science as rugby reference: at bit.ly/w101search search for “Peter Watts” science rugby
…and with 1 layer of indirection, get the URL which is: http://www.rifters.com/crawl/?p=886
I will just add that, regarding The Climate Science “Debate”, there are limits to how far a lay person can follow the actual science. However, a thoughtful observer will note that more often than not, deniers will soon turn to rhetorical devices and specious arguments, whereas qualified scientists will doggedly attempt to explain the science. That’s one clue to help you weigh claims on a subject you don’t understand.
Another clue is to understand the way science works. It has methods that, while not perfect, are the best that humankind has been able to devise for determining the way things work and are likely to behave. The vast weight of scientific thinking says that AGW is real. You would be hard pressed to find a credible scientific body anywhere in the world that doesn’t agree with this.
To believably accuse thousands of highly competitive climate scientists of participating in what amounts to a perfidious, self-serving monster conspiracy requires evidence (real evidence, not smears). None has been produced so far. None. On the other hand, note what has been unearthed about deniers, for instance in connection with the tobacco industry just for starters.
A good journalist is resourceful. It may be one fact that “he said”, and another fact that “she said”, but letting it go at that is just a lazy journalist’s way to avoid dealing with a difficult subject; a subject, I might add, that will be sure to offend certain powerful forces bent on the promotion of ignorance.
The real question is, as a journalist, what is your job and what are you made of?
Radge #167 (“To believably accuse thousands of highly competitive climate scientists of participating in what amounts to a perfidious, self-serving monster conspiracy requires evidence…”)
…and the evidence provided was of the unverifiable “the lurkers support me in email” flavor.
(fyi, haven’t gotten an answer to my followup (emailed) q asking whether the #1 (anon) holdout he recounted was actually a climate scientist.)
With more hindsight, I think I should have asked “do you think that this “seduction” & cowing of scientists biases what climate change evidence gets published, or that it biases the interpretation of the evidence?” If he said the former, then ask what evidence was being suppressed & where I could find it, e.g. in E&E. If the latter, ask which evidence has a better, alternative interpretation.
I’ll try asking, in email.
(and if the response is “they’re afraid to publish it period, even in E&E”, the counter is “well, there’s a historic trajectory, since the preponderance of views has changed; so surely at some point E&E was getting the “becoming unpopular but not yet outre’ ” findings?)
One meta-observation, re the “the lurkers support me in email” style argument – it is probably a real phenomenon, in that as a position becomes less and less justifiable, those still holding it will want to do so less and less publicly; or to turn that on its causal head, only those who keep their views hidden will have successfully protected them from exposure to counterargument.
The more I think about this, the cleverer it seems – take the “network of trust” concept many of us have been promoting, and turn it 180 degrees.
The #1 objection though is, you’d expect some of the scientists with tenure would have had the cojones to speak up, if it were for real. (Solution to that objection: work to abolish tenure?)
Bibasir, the report I saw said that yes, it is permanent. It will speed earth’s rotation on its axis by a bit more than a microsecond. So we’ll need to reset our clocks by 1 second every few centuries or so.
It is very slight. I doubt a shift of 25cm will have any measurable impact on albedo or insolation.
Mine would be “do you know any climatologists personally?”
…and if the answer is “no” (as I’m pretty sure it is), offer to fix that.
Just like with the silly prejudices of racism, this kind of paranoid illusion doesn’t survive the reality of getting to know one of the accused on a personal basis. It’s so easy to hate a cardboard figure.
You know, what some of these folks need is a “Great Big Strawman”–something that would be just more fun for them to debate, something that wouldn’t actually put the rest of us at risk. The resistance to logic, the selective blindness, are a sure clue that emotional need is driving a lot of the argumentation.
(For example, I’m quite sure that a lot of the folks who seize on the “man’s contribution of CO2 is so tiny meme”–currently under discussion at Tamino’s “Open Mind” blog–are quite capable in their private lives of understanding that taking on a debt to the tune of 3% of their annual gross income isn’t a good idea unless they are running a bit of a surplus. But the analogous situation in terms of the carbon cycle ‘budget’ seems utterly incomprehensible to them.)
But I suppose the Great Big Strawman idea is impractical. It wouldn’t be easy to concoct just the right ‘conspiracy,’ and you’d still have to fight the media headwind from Fox, Imhofe, WUWT and the like.
When you look at the denialists in the ranks of actual climate scientists, you’ve got Lindzen, then Spencer running a distant second, and then you hit the bottom of the barrel, hard, with clowns like S.F. Singer. Now, as I understand it, you climate scientists are just a bunch of grant whores willing to say anything for money. At least that’s what Lindzen, Spencer, and last and least S.F. Singer say.
So my question is this.
Who is the purchasing agent for the fossil fuel industry when it wants to purchase some climate scientists?
The FF industry makes on the order of a billion dollars per day. I’m supposed to believe that out of the enormous abundance of money grubbing grant mongering climate scientists they can’t do any better than Lindzen, Spencer, and ha ha hoohee ha ha S.F. Singer? Someone should tell the FF people that they should probably be taking a very close look at their purchasing agent’s personal bank accounts. I would suggest they look off shore too. I think they’re getting taken.
Comment by John E. Pearson — 14 Mar 2011 @ 9:26 AM
John #175. They’re not buying scientists, or even science, they’re buying politicians, airtime and column inches.
Comment by One Anonymous Bloke — 14 Mar 2011 @ 4:00 PM
John E. Pearson asked: “Who is the purchasing agent for the fossil fuel industry when it wants to purchase some climate scientists?”
Heartland Institute, American Enterprise Institute, CATO, and various other corporate propaganda mills masquerading as “conservative think tanks”.
In your response comment #142, 11 Mar. @ 4:11 PM, you mentioned:
“It is clear that the difference between curve fitting and future predictions is completely lost on you”,
I thought I’d follow up with a little more detail, to perhaps remove some misinterpretations.
The first misconception is that Fourier convolution is curve fitting. Some may look at it that way, but it is a mathematical transformation from the time (or length) domain to and from the frequency domain, somewhat like the Laplace and Z transforms used in continuous and discrete control systems analysis. The only thing the analyst controls in this type of filtering is the masking of frequency components, like setting the span of an average.
From the time-to-frequency Fourier transformation one obtains the power spectrum, which can then be used to implement the Wiener-Kolmogorov filter. This filter optimizes linear-system noise reduction for the present, and is a prerequisite for future signal prediction. Being able to predict future states is also of prime importance in control system design, as in many cases it is necessary to preserve system stability. For more advanced systems, the Kalman filter is often used. The ability to predict a future state in many real-time applications is the reason the FFT is incorporated in signal-processing ICs; in this case it continually updates the current and future state of the system.
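The "masking the frequency components" operation described here can be sketched in a few lines; note this is a crude brick-wall filter, whereas a true Wiener filter weights each component by estimated signal-to-noise rather than zeroing it:

```python
import numpy as np

def fft_lowpass(x, cutoff, dt=1.0):
    """Brick-wall Fourier low-pass filter: transform, zero every
    component above `cutoff` (cycles per unit time), transform back."""
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=dt)
    spectrum[freqs > cutoff] = 0.0
    return np.fft.irfft(spectrum, n=len(x))

# A slow oscillation survives the filter; a fast one is removed.
t = np.arange(256, dtype=float)
slow = np.sin(2 * np.pi * t / 64.0)   # 1/64 cycles per step, kept
fast = np.sin(2 * np.pi * t / 4.0)    # 1/4 cycles per step, removed
filtered = fft_lowpass(slow + fast, cutoff=0.1)
```

The sharp cutoff is also where the trouble starts: brick-wall masking causes ringing (Gibbs effects) on real, finite, trending data, which is part of what the statistical objections upthread are about.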
The IEEE (Institute of Electrical & Electronic Engineers) has the Signal Processing and Control Systems Societies, of which I used to be a member; they can provide a number of good publications, with wide-ranging applications, on this subject.
The IEEE (Institute of Electrical & Electronic Engineers) has the Signal Processing
And why do they have the Signal Processing society, rather than the Signal Detection society?
You’re totally misunderstanding the math behind Fourier and related transforms compared to statistical analysis.
Of course, if you’re right, all of statistical analysis (not only that related to climate science, but the foundational maths based on rigorous proof, etc) must be tossed into the trash, and the proofs underlying Signal Processing must be extended on faith far beyond their mathematical foundations.
It is NOT that we don’t understand Fourier analysis–it is that we understand it is not applicable in all cases. Consider the following series: 2, 7, 1, 8, 2, 8, 1, 8, 2, 8, …
You would find a periodicity in the series and perhaps predict the next value would be 1 or 2. I would look at the series, realize that the y values are the digits of the base of Napierian logarithms, and say, “Four.”
Fourier analysis is EXACTLY curve fitting. That does not mean you cannot SOMETIMES derive understanding from it. However, you could use any complete set of functions and do the same thing.
The problem is not seeing patterns. The human brain is great at spotting patterns whether they are there or not. The trick is distinguishing the patterns that are real. Fourier analysis doesn’t help you do that. It just helps you spot them. To see if they are real, you need rigorous statistical analysis and PHYSICS!
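For the curious, Ray's series is just the decimal digits of e = 2.718281828459…; the run "1828 1828" is exactly the kind of apparent periodicity that invites a wrong extrapolation:

```python
import math

# Decimal digits of e, via a truncated fixed-point rendering.
s = ("%.15f" % math.e).replace(".", "")   # "2718281828459045"
digits = [int(d) for d in s]
# A naive pattern-matcher sees ...1,8,2,8,1,8,2,8 and predicts the
# cycle continues; knowing the generator says the next digit is 4.
```

No amount of Fourier machinery applied to the first ten digits would reveal that the eleventh is 4; only knowing the generating process does.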
Re my “followup Qs?” advice request in #144 above (link) & the feedback from y’all, the gist of the Q&A – with GMU science literacy advocate James Trefil – went here (link) (which has a reference to the full Q&A transcript). No answers to my followup Qs, but he hadn’t promised any.
Phil Jones — The [World Meteorological] organisation wanted a relatively simple diagram for their particular audience. What we started off doing was the three series with the instrumental temperatures on the end, clearly differentiated from the tree ring series, but they thought that was too complicated to explain to their audience. So, so what we did was just to add them on and bring them up to the present. And, as I say, this was a World Meteorological Organisation statement. It had hardly any coverage in the media at the time, and had virtually no coverage for the next ten years, until the release of the emails.
Paul Nurse — So why do you think so much fuss was made about the emails and this graph, rather than the peer reviewed science?
Phil Jones — I think it’s that the number of climate change sceptics, or doubters, deniers, whatever you want to call them, just wanted to use these emails for their own purposes to cast doubt on the basic science. The basic science is in the peer reviewed literature, and I wish more people would read that than read the emails.
I think nobody does themselves any favours by labelling skeptics as deniers. After all, aren’t scientists by their very nature supposed to be skeptical? Please refrain from using this term, as it does neither side of the debate any good!
As my name suggests, I’m confused. Most scientists seem to agree that the climate is warming. Some could argue that it is part of the 800 year cycle and maybe we are making it worse.
As I understand, we produce CO2 at levels that are about a quarter of the CO2 that cattle create. I’m confused as to why we concentrate on man-made industrial emissions, which are extremely difficult to reduce, when we would have more chance if we focused on reducing the number of cattle.
A 25% reduction in cattle would eliminate the effect of all of our CO2 production.
In addition, I understand that cattle produce large quantities of methane, a greenhouse gas 200 times worse than CO2, and are also responsible for the deforestation of 70% of the rainforests in South America.
Would a more sensible route be to either reduce the demand for cattle (import restrictions?) or put a carbon tax on the purchase of cattle products to reduce demand?
I’m not a scientist, but it seems obvious to me to try and change the biggest thing in order to make a significant effect.
Am I barking up the wrong tree (forgive the metaphor)?
walrus: the term is used precisely in order to distinguish genuine sceptics from those who claim to be sceptics but believe any nonsense that they hear so long as it corresponds with their prejudices.
I refuse to apply the term “sceptic” to people so gullible that they believe the unmitigated nonsense spouted by the usual suspects, while applying a completely different standard to all other science.
What other term should we use? They are in denial. They deny. It’s what they do. And those who get shirty about the term generally do so only in order to distract from the concrete issues.
Let me say again: they are not sceptics. Self-labelling is not enough. You have to actually be sceptical.
Meanwhile, worldwide meat consumption is rising rapidly as developing countries increase meat consumption and adopt western-style industrial farming practices and the GHG emissions that go with them … and the epidemics of “diseases of affluence” that accompany consumption of large amounts of animal foods.
Comment by SecularAnimist — 16 Mar 2011 @ 10:02 AM
I think the sceptic and denier terms have been misused frequently. “Deniers” can be said to mistrust science, believe global warming is a ____ (insert your favorite word here), or just be gullible people who believe whatever they want. True sceptics have taken the time to weigh the science, and conclude that we do not have enough data to draw confident climate conclusions. Most of the people I have found who are truly “sceptics” believe that CO2 is causing warming, just not to the degree some claim.
Most sceptics are sceptical for two reasons: 1. There is a natural component in addition to the human component, which has been neglected or downplayed by many, and 2. Many of the forecasted effects of global warming have been exaggerated.
The first reason is largely scientific, and can be resolved with continued research. The second part is much more a PR problem, with too many people making outrageous claims. Many people believe these claims. Are they as gullible as the others?
Dan H: Well, you would say that, wouldn’t you? After all, you think that you have weighed the evidence*, yet you have managed to cling to your preconceptions and ignore fully half of the evidence. Coincidentally, all the ignoring has been in one direction.
So, is someone who is guilty of exactly what we are talking about really the person who should be going around saying what a sceptic is or isn’t?
Clean up your own house first. Start by correcting the errors in your last comment – natural variability has not been ignored, and observed effects of global warming fall largely within forecast values (except for the Arctic ice melting unexpectedly fast).
(Of course, you will deny this, too. But isn’t that where we came in?)
* You seem to honestly believe this, despite all the times your errors and omissions have been explained.
Completely off topic (I’m not even trying to hide it). I know everyone enjoys using reCaptcha as a punching bag. I just thought I’d share a couple of articles that seemed interesting (possibly could help someone).
[Response: I’m sure he will. In the meantime, note that this is a discussion paper (not yet peer reviewed), and that, (my mistake) curiously, there is only a very short period in which you can start such an analysis in order to get the result reported. ~10 years earlier, or ~10 years later, the answer is opposite. More on this soon. – gavin]
Didactylos – It would seem to this layman that Houston and Dean are among those associated with the Army Corps of Engineers fiasco over protecting New Orleans from inundation, and with climate only in the form of mitigation/adaptation. In addition, they are simply applying statistics to existing data, like any number of other “auditors”.
My question wrt sea level is how does any research or analysis take account of tectonic activity? What effects on ocean volume have all the recent major earthquakes had? Is anyone keeping records of the shape, size, and depths of the oceanic trenches?
Houston and Dean both are emeritus, according to the affiliations listed.
That said, I actually thought that their paper showed some evidence of care–though, since they are out of their field, one does wonder what a more expert eye would see.
For example, although their analysis is primarily based upon 57 US records (including Pacific possessions), they do compare this analysis to global analyses in the published literature.
My biggest question about the paper would be that they seem pretty dismissive of the satellite data, which most others seem to consider pretty reliable–though of course there are all sorts of technical challenges for doing satellite measurement right.
Well, that and Gavin’s point. That does tend to suggest cherry-picking.
#199–Houston and Dean *don’t* take tectonic activity into account. Their justification for that is that, since they are attempting to measure acceleration, not velocity, isostatic changes should be pretty much linear over the timespan under consideration, and hence can be ignored.
Makes me feel a little uneasy, but at first blush the logic seems OK.
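The logic can be checked with a toy calculation. Below is a hypothetical sketch (the numbers are made up for illustration, not taken from Houston and Dean): a purely linear vertical land motion, such as steady subsidence, shifts the fitted rate of sea level rise but leaves the fitted acceleration untouched, since the acceleration is the coefficient of the quadratic term.

```python
import numpy as np

# Synthetic 80-year tide-gauge record with an assumed acceleration.
t = np.arange(0.0, 80.0)      # years
accel = 0.02                  # assumed acceleration, mm/yr^2 (hypothetical)
rate = 2.0                    # assumed linear rise, mm/yr (hypothetical)
sea_level = rate * t + 0.5 * accel * t**2

# A constant 1.5 mm/yr land subsidence adds a purely linear term.
subsidence = 1.5 * t

# Fit y = c2*t^2 + c1*t + c0; the estimated acceleration is 2*c2.
c2_clean = np.polyfit(t, sea_level, 2)[0]
c2_with_vlm = np.polyfit(t, sea_level + subsidence, 2)[0]

print(2 * c2_clean, 2 * c2_with_vlm)   # both ≈ 0.02 mm/yr^2
```

The linear subsidence term is absorbed entirely by the first-order coefficient, so the acceleration estimate is unchanged; that is the sense in which linear isostatic motion “drops out” of an acceleration analysis.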
“… Axel Kleidon of the Max Planck Institute for Biogeochemistry in Jena, Germany, says that efforts to satisfy a large proportion of our energy needs from the wind and waves will sap a significant proportion of the usable energy available from the sun. In effect, he says, we will be depleting green energy sources. His logic rests on the laws of thermodynamics, which point inescapably to the fact that only a fraction of the solar energy reaching Earth can be exploited to generate energy we can use….
… Humans currently use energy at the rate of 47 terawatts (TW) or trillions of watts, mostly by burning fossil fuels and harvesting farmed plants, Kleidon calculates in a paper to be published in Philosophical Transactions of the Royal Society. This corresponds to roughly 5 to 10 per cent of the free energy generated by the global system.
“It’s hard to put a precise number on the fraction,” he says, “but we certainly use more of the free energy than [is used by] all geological processes.” In other words, we have a greater effect on Earth’s energy balance than all the earthquakes, volcanoes and tectonic plate movements put together.
Radical as his thesis sounds, it is being taken seriously. “Kleidon is at the forefront of a new wave of research, and the potential prize is huge,” says Peter Cox, who studies climate system dynamics at the University of Exeter, UK. “A theory of the thermodynamics of the Earth system could help us understand the constraints on humankind’s sustainable use of resources.” Indeed, Kleidon’s calculations have profound implications for attempts to transform our energy supply….”
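As a quick back-of-envelope check of the figures quoted above: if 47 TW is “roughly 5 to 10 per cent” of the free energy generated by the global system, the implied total free-energy generation is a few hundred to roughly a thousand terawatts. This is just an inversion of the quoted percentages, not a number from the paper itself:

```python
# Invert the quoted range: human use (47 TW) as 5-10% of total free energy.
human_use_tw = 47.0
implied_total_low = human_use_tw / 0.10   # ≈ 470 TW
implied_total_high = human_use_tw / 0.05  # ≈ 940 TW
print(implied_total_low, implied_total_high)
```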
Hank: is he advocating biofuels, then? After all, “surface life generates orders of magnitude more chemical free energy than any abiotic surface process”.
I’m not sure that he adequately addressed direct sunlight. After all, that’s the energy input, and using it “at source” is obviously a sensible route.
And exactly how radical is it? I’m not sure it’s new at all, because we already know about the limitations of renewable energy. The laws of thermodynamics simply formalise those practical limitations. Developing the theory further is nice, but it really doesn’t change anything.
I confess, I only skimmed it. It’s not exactly short.
Kevin@201 – Is sea level, and the acceleration/deceleration of sea level rise, not affected by the rate at which various tectonic plates dive under others? The recent quake near Japan shifted the whole country, as well as the position of the planet; it seems that would have had an effect on apparent sea level changes for the month and the year, not to mention the effect the tsunami would have had on tide gauges everywhere. One would think a number of things are at play with sea level beyond thermal expansion and ice melt.
The very day that the EPA made its finding about CO2, the Kremlin-friendly business daily Kommersant published allegations that British scientists fudged the data from Russian weather stations. Their “scientist” was the economist Andrei Illarionov, a former adviser to Putin and Chernomyrdin (Soviet Gas Ministry/Gazprom). Later Andrei Illarionov worked for the Cato Institute.
Kommersant is owned by Alisher Usmanov, a billionaire Gazprom political operative.
The Kommersant article was abridged, translated into English, and published by the Russian government’s official press agency, RIA Novosti.
The Novosti article was subsequently cited (actually mischaracterized) by Virginia’s Attorney General Cuccinelli in his suit against the EPA.
AG Cuccinelli cites an agency of the Kremlin when he attacks US scientific agencies.
I know Cuccinelli’s dad is a career gas lobbyist who gives the son campaign money.
The father reportedly has business interests in Europe and Latin America.
I would like to know if these European clients are entities with connections to Russian gas companies. Gazprom is part of the Russian government. I think most of them are.
The father’s company is in public relations. I wonder if foreign money goes to this company for “professional services,” but really helps fund attacks on climate science. Maybe something like money-laundering really is happening.
I often email Cuccinelli’s deputy W. Russell and ask about this, but they don’t respond.
I think we need to know who is giving money to our elected law enforcement officers. Maybe they are getting money from foreign governments that is laundered by paying for professional services.
Cuccinelli cites an article published in Alisher Usmanov’s Kommersant, so I think we need transparency about how the Cuccinelli family earns its money and pays for its political campaigns.
Who will investigate Virginia’s Attorney General since he won’t be transparent?
I know that the FBI investigated Congressman Weldon because he was a shill for the Russian gas company ITERA, and Weldon’s daughter got 500,000 dollars from ITERA for “consulting fees.”
Can the FBI investigate how Cuccinelli’s family profits from the gas industry?
I know Cuccinelli is a state official, so maybe the FBI can’t do anything. Isn’t there any law that protects us from Cuccinelli’s persecutions?
I think Attorney General Cuccinelli is the greedy “moneybags,” not the college professor Dr. Mann and the other climate scientists who work in government and academia.
Cuccinelli seems to have hijacked the AG office and is using it for his family’s financial benefit. He is using my tax money to persecute the great climate scientist Dr. Mann on trumped-up fraud charges. It is disgusting, immoral, and perhaps illegal, but we can’t see what is going on with the money.
Attorney General Cuccinelli is Catholic, and the Vatican has asked the U.N. to fight global warming. Students in Catholic educational institutions read about global warming in their science textbooks because it is good science and because Christians are supposed to be stewards of the earth and protect God’s creation. Are science teachers in Catholic educational institutions teaching lies about global warming? Is the Pope also a greedy liar who is perpetrating a hoax?
Perhaps Cuccinelli would like to use his power to force Virginia’s public schools to teach the fossil fuel-inspired junk science about the “hoax” of global warming; but Catholic educational institutions will continue to teach the science of global warming, not the propaganda spread by Kremlin moneybags and the fossil fuel industry’s political operatives.
The Pontifical Academy of Sciences is even having a workshop in the Vatican about the melting glaciers that starts tomorrow (April 2-4).
This organization has many very famous scientists. I read that Dr. Michael Mann has participated in a previous workshop of the Pontifical Academy.