Since much of this article involves climate models, Gavin Schmidt’s short piece in the January 2007 Physics Today, “The physics of climate modeling”, might be of interest to the non-modeler/non-expert. Is there any legal way to link to it?
[Response: Well spotted! I’ll put in a link as soon as it is available on both the Physics Today and GISS websites. I don’t yet have the pdf (though somewhat anachronistically I have a hundred reprints if anyone wants them). – gavin]
If the mass of air in a mobile polar anticyclone comes from the high altitudes of a polar cell, and the extra CO2 causes high-altitude cooling, then the mass of the MPA might be greater, and winds will be stronger and cooler in the winter. This might explain the orange-killing frosts in the SE USA, or the lower-latitude boundaries of certain US Midwest Köppen classifications. Just guessing.
Thanks, Rasmus, for this very interesting review of recent literature. In the last paragraph, I don’t clearly understand: “a warmer Arctic may imply less sea-ice and a greater heat loss to space, which must be balanced by heat transport from the lower latitudes”. I thought models expect no additional heat loss to space, because of cloudiness and water vapour feedbacks from a warmer Arctic ocean (DLF forcing confirmed by Francis and Hunter’s recent paper in Eos).
[Response: I probably should have phrased this a bit differently, as there may be several processes involved: reduction in the sea-ice albedo, enhanced downward long-wave radiation (DLF; Francis & Hunter), and enhanced exposure of the sea surface in itself leading to enhanced heat loss. The importance of the different processes differs with the season, as the albedo feedback won’t do much in the polar winter, but will be very important in the summer. Thus, I should have said that the heat transport from lower latitudes, and the DLF, may play a more important role in winter. -rasmus]
Well, what a long description of how the models do not give the correct results. You admit that even the weather models are wrong, yet their success, and their similarity to climate models, is used as proof that the climate models are reliable. When are you going to accept that Karl Popper is right? If there is one exception to a law then the law is untrue. If the model cannot replicate one feature then it is wrong, no matter how many it gets right. The scientific establishment is continually papering over the cracks in the models, from a cold upper troposphere and the tropical lapse rate problem to a failure to achieve energy balance closure http://nature.berkeley.edu/biometlab/espm228/Wilson%20et%20al%202002%20AgForMet.pdf
What is needed is a paradigm shift, but then old models never die, only the professors fade away :-(
[Response: That kind of simplistic Popperism only works for idealised ‘clean’ situations where the issue is black or white, right or wrong. That just isn’t applicable to the real world where the issue is much more nuanced – how right? what is the tolerance for error? does a new feature reduce the error or not? Therefore models (in the generic sense) are supposed to increasingly match observations as they get closer to the underlying ‘truth’. This is what happens – GCMs are better at storms than energy balance models, weather models are better yet. None are perfect. This is how it works in the real world – get used to it! (PS. the paper you link is about inconsistencies in data, not GCM models, so I’m not quite sure of your point.) – gavin]
My thanks for this posting; the ideas expressed here represent the clearest understanding of the mechanics of climate and weather phenomena that I have seen to date. The primary concern I have is that there continue to be references to the NAO and other large-scale patterns without a definition of what drives these patterns. Is there any current work that you know of that is focused on defining the drivers for the large-scale patterns?
If I understand your implications correctly regarding the horizontal resolution, it appears to be in relation to the RCM interactions. If this is true, and the combination of certain RCMs equates to a major GCM character, it would certainly help in understanding the large-pattern drivers. The question is how do you define or describe the RCM characteristics? Would it simply be the combination of barometric zones at latitude and mix that defines the character of the pattern, or would it be a combination of humidity gradient, temperature gradient and barometric characteristics at latitude, or are there other characteristics or categories that need to be added and/or defined?
Certainly with the variability seen over the last 30 years, for which we have detailed evidence, there should be characteristics emerging that seem to signal large-scale patterns, or combinations of RCMs that together define a large-scale GCM pattern. I wish to offer my humble congratulations; I don’t mean to be insulting, but this is the first time I have seen on realclimate.org what I believe to be a description of climate phenomena that makes sense.
What is a “correct result” for scientific models? In a complex thermodynamic system, you cannot expect a purely deterministic simulation as you suggest (one law > one phenomenon, or one cause > one effect). I think meteorological or climatological models do progress on some features, and less or not at all on others. A future problem may be the intrinsic limit of this progress, as models become more and more complex (coupled with biogeochemistry models, for example).
Well, I’m blown away by how much is known about extra-tropical storms and their relation to global warming, and all the modelling abilities, despite all the complexities and uncertainties (and me not understanding the whole of this article).
The models are improving, just as computer sophistication is improving. At a recent party we were talking about the pre-internet early 70s, key-punch cards, and how we had to write our own programs — to the amazement of the younger set present.
So when will there be a model that includes positive feedbacks, such as methane from melting clathrates and permafrost, and reduced albedo causing further warming, causing further methane releases and reduced albedo, and so on? Or are these already included in the current models?
I remember our computer prof very strictly warning us against writing an infinite “do-loop” and jamming the Berkeley CDS computer — that big monster brain at the center of the campus, I think nearly the largest one in the world at the time.
Comment by Lynn Vincentnathan — 29 Dec 2006 @ 5:38 PM
As a non-climatologist I’m fascinated by the incredible rate of progress in the development of climate models, and by how incredibly good they already are at simulating such a complex system.
However, I have not been able to find a good lay-person friendly summary of the state of the science including summaries of each model, their strengths and weaknesses, the processes modelled by each, and plans and time-frames for enhancements. Does such a resource exist?
I’m bemused by the wingers carping from the sidelines that because no model is perfect they are all useless. However, I can understand why someone with little understanding of what a computer model is would conclude from this article that they are all rather dodgy, rather than – as I interpret – very good already, but with many refinements in the process of being developed, tested and implemented.
“Anyone who has come up hard and fast against reality understands that there is neither a theory nor a model that explains everything. There are always residuals, unexplained anomalies, and people on the fringes who will hold onto those for dear life, weaving webs of conspiracy theories that focus only on what remains unexplained.”
Storms are spectacular and therefore very much discussed. In my opinion, fair weather should also get some attention, particularly when it is due to the “blocking high” type systems. Dynamics determine storm power, the highs determine largely where they hit. As a lay person I see it as a pinball game – with a tiltable board.
The storm tracks also impact local and regional climate. For instance, in Europe an expansion or retraction by a few hundred kilometers of the frequent “permanent” high pressure over Central Russia determines whether we have a cold northerly or a warm southerly flow. A “blocking high” may remain stationary for weeks at a time. Climate warming may impact these highs in ways that translate into regional climate shifts.
By your standard the current model of gravity is “wrong” because there is no solution to the “three body problem”. Yet we can use our model of gravity to propel the Voyager spacecraft billions of miles on a pinball tour of the solar system, fire the Cassini spacecraft through the gaps in the rings of Saturn (twice), or hit a fast-moving comet with mind-boggling accuracy.
You are however on the right track with Popper. His notion was basically that theories can only be refuted, never proven. The notion is now a well-accepted part of the philosophy of science. I have no idea what he said about the “usefulness” of theories that wait for decades or centuries to be refuted, but I doubt he advocated ignoring them.
[[I don’t yet have the pdf (though somewhat anachronistically I have a hundred reprints if anyone wants them). – gavin]]
I would love to have a reprint. May I email you my mailing address?
Re #13 It is not true that there is no solution to the three body problem, otherwise it would not be possible to navigate a spaceship. There is no analytical solution to the three body problem. Nor is there an analytical solution to climate modelling. Iteration is used in both space navigation and climate modelling. That is why computers are so necessary.
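To make the iteration point concrete, here is a toy sketch (Python, entirely my own illustration, not any actual navigation or climate code): a body orbiting a fixed mass is advanced step by step with a simple symplectic Euler scheme, and the orbit stays circular to good accuracy even though no analytical solution is ever invoked.

```python
import math

def step(pos, vel, dt, mu=1.0):
    # One symplectic-Euler step under an inverse-square pull toward the origin
    r3 = (pos[0] ** 2 + pos[1] ** 2) ** 1.5
    ax, ay = -mu * pos[0] / r3, -mu * pos[1] / r3
    vel = (vel[0] + ax * dt, vel[1] + ay * dt)          # kick
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)  # drift
    return pos, vel

# Circular orbit of radius 1: the right speed is sqrt(mu/r) = 1
pos, vel = (1.0, 0.0), (0.0, 1.0)
for _ in range(10000):
    pos, vel = step(pos, vel, dt=0.001)

radius = math.hypot(pos[0], pos[1])
print(radius)  # remains very close to 1.0
```

The same idea, scaled up enormously, is what both spacecraft navigation and climate modelling rely on: small, controllable errors per step instead of a closed-form answer.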
But if the computations for the trajectory of spaceships varied from 1.5 to 4.5, would they really have sent a man to the moon? Yet the change in temperature predicted for a doubling of CO2 ranges from 1.5K to 4.5K and has remained in that band for over ten years.
Of course you are told by Gavin et al. where the models succeed, but you have to read between the lines to see where they are going wrong. That is where Rasmus’ post is so revealing. It is quite clear that the models do not have an answer to the mid-latitude storm question. In fact Rasmus has not explained what it is.
Heat is transported to the poles at a rate proportional to the temperature difference. If the poles warm then there will be less heat moving polewards. How is it then that the poles will be warmer? We know from paleoclimate studies that polar amplification does happen, and that in the Eocene there were cold-blooded crocodiles in Greenland. But the current climate models cannot explain that.
Re #16 Barton,
helium balloons do not fall upwards; they float upwards. In order to use Newton’s Laws you have to consider all the forces, in this case gravitation and buoyancy.
But as was pointed out to Eli on his blog, when Newton’s Laws did fail, in predicting the perihelion precession of Mercury, it required the paradigm shift of Einstein’s General Relativity to get to the real answer.
Correct me if I’m wrong, but if the stratosphere is cooling then the temperature gradient through the troposphere is increasing, with consequent effects on such things as the non-adiabatic processes driving storms. For the same base level of temperature and humidity, storm clouds will grow taller (limited by the actual tropopause, of course) because the rising air will be unstable for longer.
In other words, couldn’t the increase in wind speeds at the surface be at least in part due to increased vertical speeds within storms?
I have one wish – that someone will post a working film of Fultz’s dishpan experiment. This simple experiment is a standard laboratory exercise for undergraduate meteorologists and it explains a lot of things. Am I alone in this longing…
The ITCZ over the Indian Ocean is a thing to behold at the moment!
Re:15 “Heat is transported to the poles at a rate proportional to the temperature difference. If the poles warm then there will be less heat moving polewards. How is it then that the poles will be warmer?”
Apart from being a chicken-and-egg question (“If the poles warm, how will they be warmer?”), this is surely easily answered by the fact that the equatorial regions warm too. As long as there is a difference in the temperatures there will be circulation. N’est-ce pas?
re #4 et al: remember the old unwritten law of science: One exception is the “exception that proves the rule.” I guess one needs two exceptions to disprove the rule.
The difficulty with models is that the deficiencies and gradations are well and truthfully recognized (other than the unk-unks, of course) in their development stage, but get lost or buried in the operational stage, particularly if, resurrected, they would likely alter the hoped-for answers.
Is all this a backtrack from the all-the-rage assertions in 2005 of GW causing more and stronger (mid-latitude) hurricanes, because we didn’t get ANY in 2006? Or am I just being too sensitive and paranoid? It is, after all, an astute post.
Climate science must sure be the only field where people feel they are seriously critiquing it by saying, “It doesn’t explain everything yet.” Do people say modern physiology is a flawed field, because we don’t yet understand how the brain works?
But if the computations for the trajectory of spaceships varied from 1.5 to 4.5, would they really have sent a man to the moon? Yet the change in temperature predicted for a doubling of CO2 ranges from 1.5K to 4.5K and has remained in that band for over ten years. Alastair McDonald
In a narrowing range. The shape of the curve matters. And since you think that orbital trajectories are so cut and dried, would you care to tell me where the asteroid 99942 Apophis (http://en.wikipedia.org/wiki/99942_Apophis) will be on the morning of April 13, 2036? It seems that the smart guys at NASA are having a little trouble deciding whether or not it is going to hit the Earth that day….
If you go further with this thought it would be useful.
If the heating at the ITCZ were to stabilize, so that the temperature imbalance between the Equator and the Poles reestablishes itself, what would happen to the current heat that has been transported to the Poles? (Depending on the temperatures in the tropopause region, the combination of heat and water vapor could even reach the stratosphere, where it would eventually distribute and provide the surface for the Cl reduction of O3, and would delay the recovery of O3 in this region until HCl were to achieve solution and precipitate out.)
If the heat at the ITCZ were constant, would there not develop a fixed path of heat transport from the ITCZ to the Poles, not unlike the THC in the ocean? Would this not be a phenomenon like a Jet Stream traveling at an angle from the ITCZ to the Pole and back again? (We have evidence of a dramatic south-to-north transport in the last year if you look at the Walker circulation and the deviation of the NH Jet Stream. Add in the studies this past summer regarding the observation of the hydrocycle transport moving north from the 20-25 degree N region to the 30-35 degree N region.)
If this heat transport character were to become static, would it not form a new pattern in the earth’s weather? Though there was evidence of this recently, it appears that currently the Jet Stream is stabilizing into its normal elliptical pattern. Does this mean that the heat transported to the Arctic is locked in and the latitude heat imbalance is reduced?
What is going to happen to all that heat that was transported to the Poles; will it stay there? Will this heat distribute throughout the Arctic sea, or is it more likely to convect upwards and radiate out of the water and into the upper atmosphere or space? So if the heat transport were to lessen, is it possible we will quickly see the Arctic sea ice return and the cycle begin all over again? What if the heat transport were not to lessen but to become constant? So many questions and so little time…
Re #4, #15, etc: You might reflect on the fact that it’s impossible* to build an absolutely accurate model of any real-world process. Any model has to be a simplified approximation. For instance, we can model the Earth as a flat plate. That’s wrong, but it’s perfectly adequate for walking around the neighborhood. Or we can model it as a sphere, which is also wrong, but lets us usefully navigate on long ocean & airplane voyages. Or we can use a slightly oblate spheroid, which tells us where our GPS satellites and such will be – but it still doesn’t take into account e.g. me digging a hole to plant an apple tree, or the piles of worm castings that I find on warm, wet mornings.
By the same reasoning, climate models can never be completely correct, because there are always finer and finer levels of detail to consider, but they seem to do a pretty good job of pre- and post-diction. What other course would you have us take? Ignore the bomb because we don’t know for sure whether it contains 1.5 pounds of explosive, or 4.5?
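The flat-plate versus sphere comparison can even be run as a few lines of code (a sketch of my own, with illustrative coordinates): over a short walk the two models agree almost exactly, while over an ocean crossing the flat model’s error grows to several percent.

```python
import math

R = 6371.0  # mean Earth radius in km (spherical model)

def flat_km(lat1, lon1, lat2, lon2):
    # "Flat plate" model: treat latitude/longitude as a plane (equirectangular)
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return R * math.hypot(x, y)

def sphere_km(lat1, lon1, lat2, lon2):
    # Spherical model: great-circle (haversine) distance
    p1, p2 = math.radians(lat1), math.radians(lat2)
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(math.radians(lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# A stroll of ~13 km: the two models agree to a fraction of a percent
short_flat, short_sphere = flat_km(51.5, 0.0, 51.6, 0.1), sphere_km(51.5, 0.0, 51.6, 0.1)
# An ocean crossing (~5600 km): the flat model is off by several percent
long_flat, long_sphere = flat_km(51.5, 0.0, 40.7, -74.0), sphere_km(51.5, 0.0, 40.7, -74.0)
```

Both models are “wrong”; which one is useful depends entirely on the scale of the question being asked, and the same goes for climate models.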
Sally, the idea of polar amplification is that the poles warm more than the equatorial regions, so the problem is not solved by tropical warming.
Dan, the point I am trying to make is not that the models do not explain everything. It is that the current models give the wrong results, but the scientists are blaming the data. When the satellites and weather balloons said that the upper troposphere was not warming, they blamed the satellite data. When they found a fix for that data, they said that the balloon data must be wrong too, by the same amount! In other words they have changed the data to fit the model. Now there is a problem with the energy balance closure, as Gavin pointed out, and they are blaming the data again.
Coby, you wrote “All models are wrong, but some are useful.” However, some models are not only wrong, using them is dangerous. Global warming is happening much faster than the models predicted, and the glaciers and ice sheets are disappearing much sooner than the models showed. The life of the Arctic sea ice has been halved from 80 to 40 years, but could be much shorter than that.
Carbon dioxide levels control the surface temperatures, not those at the tropopause. The CO2 level sets the amount of land and sea ice and so the planetary albedo. The only way the loss of albedo from melting ice can be replaced is with more clouds. Increasing surface temperature does not necessarily lead to more cloud, so we are approaching a dangerous crisis.
Max Planck wrote “A new scientific truth does not triumph by convincing opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” ~Max Planck, A Scientific Autobiography and Other Papers, 1949
So I am not going to convince Gavin et al. that they are wrong. However, we do not have enough time for a new generation to grow up before disaster strikes. As Sergeant Frazer put it, “We are doomed.”
thanks for the overview. There is one thing I’ve thought about for a long time but never found anything on: strong cyclones and high wind speeds mainly occur in a zonal flow pattern, while a meridional flow pattern with cut-off systems shows much weaker cyclones and wind speeds. A zonal flow pattern is much less effective in energy transport from north to south than meridional flow patterns. A weaker temperature gradient from north to south could lead to generally more zonal flow patterns (because the energy transport needed to compensate the north-south gradient would be less), which would mean more strong cyclones and higher wind speeds. Could this be a considerable effect? What do you think?
My subjective observation for Europe is that there is already some trend to more zonal flow, above all in winter. However I have not found a good objective method to verify.
If you put more energy into a system then you get more out. This will mean more frequent or violent storms, but where probably depends on that butterfly in Asia. To argue against this theory: 2006 was the hottest year since records began, but the quietest for hurricanes for some time.
I am sure that at the moment there are more questions than answers with global warming.
A forming El Nino in the tropical Pacific and observed large concentrations of dust blowing west off the Sahel might have had something to do with affecting the tropical systems in east and central Atlantic by knocking their high clouds into disarray. However, lots of action in the tropical Pacific and certainly the Philippines suffered numerous direct typhoon hits since June.
Stay tuned. More analysis is on the way.
Comment by John L. McCormick — 30 Dec 2006 @ 6:30 PM
It’s got to be questionable whether you can conclude more violent storms just because more energy in equals more energy out.
I’m no expert on climate, but as far as I see it the basic case for stronger storms caused by global warming is simple. Weather people on TV regularly point out that a certain storm is being strengthened by warm weather. For example, sea surface temperature in the Gulf of Mexico is mentioned in hurricane forecasts for a reason.
In a warmer world we can expect sea surface or air temperatures in many regions to be high more often, so we should surely expect storms passing through warmer regions to be stronger on average than they would have been.
There are other factors influencing storm strength that sometimes make the warmth factor less important, but all that matters is that sometimes storms are in a position where a bit of extra warmth will make them stronger. In those cases at least some storms will be stronger in a warmer world.
RE #15 – “It is not true that there is no solution to the three body problem”
I was replying to #4’s narrow interpretation of Karl Popper. Iteration is an approximation; that’s why spaceships have “retro rockets” and the weatherman is sometimes wrong. The bigger the number cruncher, the closer we get to the “truth” (or the shorter the time to find “truthiness”).
What we have with computerised “brute force” iteration is a practical but far-from-perfect solution for climate prediction. We may never find a solution to the maths, but the success of spacecraft vividly demonstrates we can do without it.
Disclaimer: I am a computer scientist and have worked on “brute force” LP solutions for logistical problems.
Hadley cells do not circulate straight from the equator to the Pole. As polar amplification seems to be very strong, recently there has been a pan-Arctic shift in surface winds, implying different Hadley Cell configurations. The Arctic in some cases was warmer than the continents further south during this and last winter. This has caused unusual weather events; if colder air forms closer to the equator, then there can be an increase in wind speeds and especially in rain intensity in some locations.
Being a huge fan of models, I must disagree that they have to be rejected; they do incredible simulations. However, they lack resolution, just as the data they use as input lacks resolution. The world radiosonde network is way too sparse. MSU has proved a disappointment. Newer techniques, alongside older ones (monitored at stations closer to each other), have to be devised to increase high-resolution data input; with greater resolution and computing power, models will become even more impressive. Given that the polar amplification predicted to happen 20 years from now is largely happening at present, I wonder if the reasons for this phenomenon are understood?
re: 28. “It is that the current models give the wrong results, but the scientists are blaming the data.”
Current models give the wrong results?? Peer-reviewed reference, please?
“When the satellites and weather balloons said that the upper troposphere was not warming they blamed the satellite data. When they found a fix for that data they said that the balloon data must be wrong too by the same amount! In other words they have changed the data to fit the model.”
Again, peer-reviewed reference? That is simply disingenuous and insulting to scientists everywhere. If fundamental data are wrong, correcting the data is not a “fix”. It is called quality assurance. That is a fundamental data collection process in science. Otherwise you have complete junk science.
Alastair: “However, some models are not only wrong, using them is dangerous. Global warming is happening much faster than the models predicted, and the glaciers and ice sheets are disappearing much sooner than the models showed. The life of the Arctic sea ice has been halved from 80 to 40 years, but could be much shorter than that.”
Hmm, not all models agree with your view… of models! For example, in GRL this December, neither Winton 2006 nor Holland et al. 2006 conclude that summer Arctic sea ice will certainly disappear within 40 years (just one projection out of seven in the Holland paper). And for the Greenland ice sheet, we should be more cautious with recent GRACE measurements (before any dispute about model quality): a recent instrument (PGR and other calibration uncertainties), a short period (2002-2006), well-known strong regional variability (NAO+/-), and a factor of 2.5 between different estimates of mass balance published since March.
Comment by muller.charles — 30 Dec 2006 @ 11:47 PM
The equator may warm less than the poles, but it’s a whole lot bigger, eh?
I’d think the heat energy added to moderately warm, say, ten degrees of latitude around the equator is still going to be more than is added to warm up the smaller, albeit somewhat warmer, ten degrees of latitude around each pole.
It is not clear to me what computer models have to do with scientific “laws” (besides, there is no commonly accepted definition of a scientific “law” – the term is used very differently, and inconsistently, by physicists, chemists and biologists, e.g., Dalton’s Law of Partial Pressures is very different from Mendel’s Law of Segregation).
Re #22 and #23
Evolutionary biology faces the same kind of scrutiny, from people whose belief system purports to explain everything (without, in fact, explaining anything).
“Models are metaphorical (albeit sometimes accurate) descriptions of nature, and there can never be a “correct” model. There may be a “best” model, which is more consistent with the data than any of its competitors….if our models are as complicated as nature itself, then we may as well not bother with the model and focus only on the natural situation. Simpler models often provide insight that is more valuable and influential in guiding thought than accurate numerical fits.” Excerpts from: The Ecological Detective: Confronting Models With Data, by Ray Hilborn and Marc Mangel. Models in Population Biology, #28 (S.A. Levin and H.S. Horn, eds), Princeton University Press, 1997.
“Heat is transported to the poles at a rate proportional to the temperature difference. If the poles warm then there will be less heat moving polewards. How is it then that the poles will be warmer? We know from paleoclimate studies that polar amplification does happen, and that in the Eocene there were cold-blooded crocodiles in Greenland. But the current climate models cannot explain that.”
and later wrote:
“Sally, the idea of polar amplification is that the poles warm more than the equatorial regions, so the problem is not solved by tropical warming.”
and later wrote:
“So I am not going to convince Gavin et al. that they are wrong.”
Well, not exactly. The idea of polar amplification is that the contribution of greenhouse gases to atmospheric warming is greater in polar regions than at lower latitudes, because the polar atmosphere is so dry. This is a prediction that has been around for more than a hundred years, since Arrhenius’ original paper, is a prediction of most climate models, and is rather nicely confirmed by observations during the last three decades.
The energy balance in the tropics is currently not quite a zero sum game, with the energy in from the sun being slightly greater than the energy out (from the top of the atmosphere and by poleward movement of air and water vapor). That means that the pesky left-hand side of the heat transfer equation, the partial derivative of temperature with respect to time, will be positive and temperature in the tropics will be increasing. This temperature change tends to re-establish thermal equilibrium in the tropics and tends to re-establish the previous temperature difference between the tropics and the polar regions.
The net effect is a negative feedback that, with a delay, tends to re-establish/maintain the previous balances of energy input (high in the tropics and low in polar regions) and energy rejection to outer space (low in the tropics and high in the extra-tropics).
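That bookkeeping can be sketched with a toy two-box energy balance model (my own illustration with invented numbers, not any published model): each box radiates to space as A + B*T, poleward transport is proportional to the temperature difference, and a weaker radiative damping B at the pole stands in for the ice-albedo feedback. A uniform forcing then warms the pole more than the tropics even as the temperature difference, and the transport it drives, shrinks slightly.

```python
def equilibrium(G, k=2.0, B_trop=2.0, B_pole=1.0,
                S_trop=320.0, S_pole=180.0, A=210.0):
    # Steady state of the coupled budgets (fluxes in W/m^2, T in deg C):
    #   tropics: S_trop + G = A + B_trop * T_t + k * (T_t - T_p)
    #   pole:    S_pole + G + k * (T_t - T_p) = A + B_pole * T_p
    r1, r2 = S_trop + G - A, S_pole + G - A
    det = (B_trop + k) * (B_pole + k) - k * k
    T_t = ((B_pole + k) * r1 + k * r2) / det
    T_p = (k * r1 + (B_trop + k) * r2) / det
    return T_t, T_p

t0 = equilibrium(0.0)   # unforced climate
t1 = equilibrium(4.0)   # ~4 W/m^2, roughly a CO2 doubling
warm_trop, warm_pole = t1[0] - t0[0], t1[1] - t0[1]
transport_before = 2.0 * (t0[0] - t0[1])  # k * temperature difference
transport_after = 2.0 * (t1[0] - t1[1])
```

With these made-up numbers the pole warms by 3.0 C against 2.5 C in the tropics, while the poleward transport falls slightly (from 42.5 to 41.5 W/m^2): polar amplification and a weakened gradient coexist, because the pole’s radiative damping is the weaker of the two.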
Actually, Alastair seems to be the one who cannot be convinced that he is wrong.
Re #29 I agree with your reasoning Urs, but the other possibility is that as zonal flow increases, areas away from convergence zones would have less windy conditions. Imagine a global belt of high pressure at 30 – 35 degrees say.
I think that zonal and meridional flow actually appear out of the dynamics of the system as poles within which it oscillates. My imaginary hyper-zonal system would be inherently unstable and would be broken up by the formation of cyclones. This is what appears to have been happening over the Indian Ocean in the last week with the ITCZ.
I’m not necessarily implying that zonal and meridional are the only dynamic poles. I don’t think it’s reducible to 2 dimensions. But it seems reasonable to suggest that they are dynamically unstable points.
One way to think about this is that below a certain combination of angular momentum and temperature difference – the system will become non-turbulent and completely zonal. But I think that we are nowhere near that minimum, yet. We’d have to freeze or boil all the water for a start!
I’d reiterate my point about Fultz’s dishpan experiment.
I think there is a linguistic problem with phrases like ‘more windy’, ‘stronger storms’, ‘wetter’ or ‘drier’. The problem is that meaning and usage are often confused (cf. Wittgenstein). Sometimes these phrases are used with an implicit particularity, ‘here’ or ‘always’, and other times with an implicit multiplicity, ‘on average’ or ‘more often than’.
It seems there is a problem with using yearly averages to predict the behavior of mid-latitude storms. There will certainly be a strong seasonality, won’t there? It seems that a likely scenario for the mid-latitude regions is violent flooding in early spring followed by long summer droughts due to reduced snowpack. Mid-latitude Atlantic weather can also be strongly influenced by the remnants of hurricanes making their way northwards (as was the case for ‘the Perfect Storm’). Most of the major storm ‘events’ that make the news headlines don’t actually seem to be isolated events. If Katrina hadn’t encountered a tongue of warm deep water as she entered the Gulf of Mexico, she wouldn’t have amplified the way she did – so that should be viewed as two intersecting events. To be simplistic, if more energy is stored in a climate/weather system, the frequency of energetic events should increase, as well as the chances that multiple events will happen to coincide.
Thanks for the discussion of ‘nesting’ local storm models into large-scale GCMs. Is there any possibility that GCMs will ever be able to increase their grid resolution to the point where such procedures are unnecessary?
As far as the positivist Popperian philosophy of science goes, here’s the relevant quote:
“According to this way of thinking, a scientific theory is a mathematical model that describes and codifies the observations we make. A good theory will describe a large range of phenomena on the basis of a few simple postulates and will make definitive predictions that can be tested. If the predictions agree with the observations, the theory survives that test, though it can never be proved to be correct. On the other hand, if the observations disagree with the predictions, one has to discard or modify the theory. (At least, that is what is supposed to happen. In practice, people often question the accuracy of the observations and the reliability and moral character of those making the observations)” – Stephen Hawking, The Universe in a Nutshell, pg 31.
The prediction that warming would be most evident in polar and high-altitude regions is a test that climate models have survived. The aerosol issue, on the other hand, required modification of climate models to be accounted for, and the combination of clouds + aerosols is still uncertain, apparently. An important thing here is getting the most comprehensive and accurate set of observations possible, so that model predictions can be tested – which is why funding is so important. For example, models of ice sheet dynamics were apparently very wrong when it came to the behavior of the Greenland and West Antarctic ice sheets – something that field and satellite observations revealed. Unlike what some of the above comments say, it’s okay to modify a theory, but one must then also generate a new set of testable predictions with the modified theory. Under ideal circumstances there is an open flow of information between observation and theory.
Climate science denialists have certainly fit the model described by Hawking as well (see the final bit in that quote)! The fact that the denialists so seldom call for more comprehensive observational data (i.e. more funding for satellites and cruises, for example) is rather telling. Without these datasets (ocean temp profiles, ice sheet volumes and movements, etc.) there is no way to test the models.
Watch that and tell me why there is no discussion at all about the weather manipulation and how it may be having some effect on your problematic models. It’s all junk science until you wise up and factor in these crucial scenarios.
Barton, I am not explaining the problem very well, so I will try again.
According to the current paradigm, if the earth’s surface warms then the outgoing radiation will increase providing a negative feedback on the temperature rise. See Jim Dukelow’s comment #40. Thus, if the polar regions warm then more heat needs to be transported there to compensate for the increased heat loss. But if the poles warm more than the tropics, then the temperature difference decreases which is the mechanism that causes the heat transport.
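The size of the negative longwave feedback described above can be estimated with a blackbody back-of-the-envelope calculation; this is a sketch only, assuming an idealized effective emitting temperature of about 255 K:

```python
# Planck (blackbody) negative feedback: outgoing flux F = sigma * T^4,
# so the extra emission per kelvin of warming is dF/dT = 4 * sigma * T^3.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def planck_feedback(temp_k):
    """Increase in outgoing longwave flux per kelvin of warming (W m^-2 K^-1)."""
    return 4.0 * SIGMA * temp_k ** 3

# At Earth's effective emitting temperature of roughly 255 K:
print(planck_feedback(255.0))  # about 3.76 W/m^2 per K
```

This is of course only the idealized top-of-atmosphere blackbody response; real feedback strengths differ once water vapor, lapse rate, and cloud changes are included.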
The answer is that there is no negative feedback from the long wave radiation. Here I accept the point made by James in #27 and Gavin that earth science is fractal. A more detailed examination would probably show a minor effect.
The reason that the problem of mid latitude storms cannot be solved is not “because there are always finer and finer levels of detail to consider” but because the wrong model is being used. The current model is equivalent to a cubic earth. Just as with a spherical earth, it is correct for a first approximation of being flat. However, if you stand on a cliff and look out to sea then you can see the world is curved, but it can always be argued that it is because the binoculars have a fixed range, hence it only seems that the world is curved. This is what the scientists are doing with their satellites, radiosondes, and infrared gas analysers – blaming the instruments rather than accepting the facts.
I think you are oversimplifying the mechanism of heat transport to the poles, as well as ignoring the seasonality involved (the long dark Arctic winter). There are atmospheric routes which involve evaporation of water from warm sea surfaces and rising warm air over equatorial landmasses; this is the Hadley Cell circulation. There are also oceanic routes involving western boundary currents like the Gulf Stream in the Atlantic and the Kuroshio Current in the Pacific; these are physical phenomena that arise due to the Coriolis force. It doesn’t seem to be like a block of metal where one end is kept hot and one is cool.
The most important thing is to monitor those atmospheric and oceanic routes to see what the heat transport is really doing – that means using bottom-moored sensors that provide a constant stream of data (twice-a-year cruises just don’t cut it), as well as the standard methods of looking at the atmosphere. A moister atmosphere can also store more energy due to the heat capacity of water. The arctic winter should continue to produce cold, dry air masses, and when these air masses run into huge warm moist air masses – there you have a violent storm. The steeper the gradients, the more intense the winds, and so on. During the Arctic summer, the reduced ice cover will result in lower albedo, but are you really claiming that the polar heat transport will result in warm Arctic winters? That seems highly unlikely due to the decrease in solar insolation in the winter.
What would it mean to ‘solve the problem of mid-latitude storms’?
I’d be more worried about the seasonal effects of the warming on the Arctic permafrost and shallow continental shelves around the Arctic, as well as the fact that warming water with less deep water formation means less dissolved oxygen in the northern oceans, and that could lead to plummeting biological productivity at the upper ends of the food web. Modelling biological responses to climate change (including human responses) is important but is also very, very iffy – which is why the most important thing is not to attack ‘the scientists’ but rather to promote accurate and comprehensive data collection all over the globe!
Note also that the Antarctic and Arctic ozone holes are also seasonal effects; chlorine adsorbed to ice particles is released when the sun hits in the spring. Seasonality can’t be ignored at the poles.
I’m rather interested in this. It’s my understanding that the polar amplification is due to
1. ice albedo feedback
2. negative feedback via evaporation in the tropics (much stronger at higher temperatures).
2 counteracts a strong water vapor greenhouse feedback, but all that extra evaporation is balanced by extra condensation somewhere, and this is why the mid to upper troposphere is expected to be a region of enhanced warming as well.
Now as pointed out, the ice albedo would have little effect in winter. But in summer, there would be increased absorption of heat in the open water. This is released in winter as the water cools and ‘tries’ to freeze. This takes a longer time, so there is less time for the air to cool past the freezing point (right?). In summer itself, the warming is not expressed so much as a temperature change. So actually, in temperature, the polar amplification is really expected to be more in winter.
There is also the reduced lapse rate in the polar troposphere. This should make the greenhouse effect on the troposphere+surface as a whole weaker, as the effective emitting temperature to space is not so much lower than that at the surface; the increase in the greenhouse effect should be weaker as well for that reason (right? – I’m trying to reason this out). In terms of W/m2, the change in radiative forcing might also be less just because there is less radiation being emitted by the surface or troposphere (as it is colder), so there is less radiation to absorb. Then there is the lack of water vapor, which means the increase in CO2 could have a larger impact than otherwise. Lack of high clouds would have the same effect. At http://www.grida.no/climate/ipcc_tar/wg1/253.htm, for the increase in well-mixed greenhouse gases, it appears that the maximum forcing (at the tropopause, I assume) is in the subtropics (which makes sense – few high clouds, dry, warm), and the lowest is at the poles, particularly Antarctica.
It also occurs to me, though, that the low lapse rate at the high latitudes would tend to allow for greater surface warming relative to the troposphere as a whole, as convection is less likely to redistribute the warming upward.
So lower meridional temperature gradient near the surface, but higher meridional temperature gradient in the mid-to-upper troposphere – that’s my understanding. This would tend to weaken the vertical wind shear in the lower troposphere but strengthen it in the upper troposphere. I’m curious how that would affect midlatitude dynamics.
It occurs to me that the typical oceanic gyres generate east-west temperature variation from the north-south temperature variation, so unless the gyres speed up, I would guess the east-west variations would be reduced in response to a weaker meridional gradient near the surface. I would imagine that this would add to the weakening of the horizontal thermal gradients on the east coasts of North America and Asia, so I’m imagining especially reduced cyclogenesis in these areas. Am I on the right track?
I didn’t think that the greater heat loss from the polar regions due to their warming requires greater heat transport to the poles, because the higher temperature changes at the poles are being produced by greater heating at the poles (well, at the surface of the poles). Or is that the issue – that the heating is concentrated near the surface, and somehow the change over the whole depth of the atmosphere responds to surface heating with net heat loss?
From a textbook (“Global Physical Climatology”, Hartmann, 1994) there is a simple scaling argument that suggests sensible heat transport by midlatitude eddies is proportional to
the square of the meridional temperature gradient
* the square root of d(potential temperature)/dz
* the square of the depth scale / square of coriolis parameter
* the 3/2 power of (g / potential temperature)
the reason for being proportional to the square of the meridional temperature gradient is that a given motion has net sensible heat transport proportional to the temperature gradient, but the eddy motion itself is proportional to the vertical wind shear, which is proportional to the temperature gradient.
The depth scale is proportional to the height of the tropopause for more unstable flow, but proportional to the meridional temperature gradient for more weakly unstable flow (leading to an even greater response to the meridional temperature gradient). Is that why the smaller storms would be fewer while the larger storms might increase (as the height of the tropopause may rise)?
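The scaling argument quoted from Hartmann (1994) above can be written compactly; this is my transcription of the listed factors, with $\partial T/\partial y$ the meridional temperature gradient, $\theta$ the potential temperature, $H$ the depth scale, $f$ the Coriolis parameter, and $g$ gravity:

```latex
F_{\text{eddy}} \;\propto\;
\left(\frac{\partial T}{\partial y}\right)^{2}
\left(\frac{\partial \theta}{\partial z}\right)^{1/2}
\frac{H^{2}}{f^{2}}
\left(\frac{g}{\theta}\right)^{3/2}
```

The quadratic dependence on the meridional gradient is the key point for the storm-track discussion: a modest weakening of the low-level gradient reduces the eddy sensible heat transport disproportionately.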
It’s also my understanding that were it not for the instability that leads to eddies, the Hadley cell could extend from equator to pole, and in the dishpan experiments and in computer models, if I recall correctly, this happens until the meridional temperature gradient causes the circulation to break down. I’m curious – in order to balance the angular momentum budget, a worldwide pair of Hadley cells would have to have both westerlies and easterlies; the winds would start out westerly near the poles, and then bend around as they head equatorward. I’m curious what’s expected to happen to the surface winds as the midlatitude storms weaken, increase, and move, and if that requires any shifting in the easterlies of the tropics?
One interesting thought I had is to compare simple models of idealized circulations of various kinds and time-dependent situations with the time-averaged one-dimensional radiative-convective model – i.e. to get some general understanding of how variances in time and space, and their synchronicity (i.e. certain kinds of clouds occurring along with certain temperature variations at certain times of day, etc.), affect the temperature relative to a model in which everything is averaged out before calculating?
I apologize if this is all neatly laid out in one of the links – I haven’t had time to check them out yet.
2006 Pacific typhoon season – first storm formed: May 9, 2006; last storm dissipated: December 19, 2006; strongest storms: Yagi and Cimaron, 910 hPa (mbar), 195 km/h (120 mph); total storms: 23 official plus 2 unofficial; typhoons: 15; super typhoons: 7 (unofficial); total fatalities: at least 2,532; total damage: unknown.
Hurricane Ioke (was also Typhoon Ioke, international designation: 0612, JTWC designation: 01C, and sometimes called Super Typhoon Ioke) is the strongest hurricane ever recorded in the Central Pacific. The first storm to form in the Central Pacific in the 2006 Pacific hurricane season, Ioke was a record breaking, long-lived and extremely powerful storm that traversed the Pacific, reaching Category 5 status twice as a hurricane. As a typhoon, Ioke managed to achieve Category 5-equivalent one-minute sustained winds one more time before weakening.
Ioke did not affect any permanently populated areas in the Central Pacific or Western Pacific basins as a hurricane or a typhoon, but the storm passed over Johnston Atoll as a Category 2 hurricane and passed near Wake Island as a Category 4 typhoon. Despite its strength, Ioke only caused moderate damage to Wake Island, and was not responsible for any fatalities. Later, the extratropical remnants of Ioke produced a severe storm surge along the Alaskan coastline, causing beach erosion.
Oh, and the overall weather forecast this year …
World faces hottest year ever, as El Niño combines with global warming
A combination of global warming and the El Niño weather system is set to make 2007 the warmest year on record with far-reaching consequences for the planet, one of Britain’s leading climate experts has warned.
As the new year was ushered in with stormy conditions across the UK, the forecast for the next 12 months is of extreme global weather patterns which could bring drought to Indonesia and leave California under a deluge.
Actually, it appears the El Nino event is breaking up in the ITCZ as we type. It appears to have broken down into eddies or pools of warmer temperatures and may be in the process of dissipating. (Note that at the same time there is clear evidence of continued strong El Nino-like weather patterns. This raises the question: is it the weather that drives El Nino, or El Nino that drives the weather? Too bad we do not seem able to define what drives the large-scale weather patterns or oscillations yet.)
[Response: That kind of simplistic Popperism only works for idealized ‘clean’ situations where the issue is black or white, right or wrong. That just isn’t applicable to the real world where the issue is much more nuanced – how right? what is the tolerance for error? does a new feature reduce the error or not? Therefore models (in the generic sense) are supposed to increasingly match observations as they get closer to the underlying ‘truth’.
In reply to ‘Response’
Is it now evident that there is a crisis in the standard models? Is there an expected paradigm shift in the models as opposed to a need for model tuning? Kuhn rather than Popper?
In the case of this puzzle (forecasting and explaining past changes to the macroclimate/the glacial/interglacial cycle), the standard models are obviously missing key fundamental components of the physical processes.
From Broecker’s famous Angry Beast article (see attached link for details), in which he discusses macroclimate models and extreme climate changes:
“…No one understands what is required to cool Greenland by 16C and the tropics by 4 +/-1C, to lower the mountain snowlines by 900m, to create an ice sheet covering much of North America, to reduce CO2 by 30%, or to raise the dust rain by an order of magnitude. If these changes were not documented in the climate record, they would never have entered the minds of the climate dynamics community.
Models that purportedly simulate glacial climates do so only because key boundary conditions are prescribed (the size and elevation of ice sheets, sea ice extent, sea surface temperature, CO2 content, etc;)”
1) In the Angry Beast article Broecker postulates that the Younger Dryas was caused by a fresh water pulse, from Lake Agassiz. Subsequent data has shown that the fresh water pulse hypothesis is likely not correct. See attached paper below for the data and another hypothesis.
2) An alternative hypothesis to Broecker’s non-linear knife edge hypothesis (Small natural or anthropogenic changes can force the macroclimate from one mode to another mode and hence create the massive ice sheets and so forth.) is the hypothesis there is a massive semi-periodic external forcing, that forces the macroclimate from one mode to another. (P.S. The massive natural external forcing function is not the THC.)
Link: Reduced solar activity as a trigger for the start of the Younger Dryas?
[Response:A few thoughts regarding ‘Is it now evident that there is a crisis in the standard models? Is there an expected paradigm shift in the models as opposed to a need for model tuning? Kuhn rather than Popper?‘. Since models are never (or at least very rarely) perfect, as they mimic only the essential features of whatever they represent, one may always find a point beyond which they are no longer valid – i.e. beyond their limitations. To name a couple of silly examples to illustrate this: climate models cannot tell us much about single air molecules in a given region, or whether a stroke of lightning will hit a particular house. This does not mean that the model is falsified, even though one may argue that both may have some effect (albeit minuscule) on the atmosphere. I think therefore one has to define the limits within which a model is valid before one can validate it in earnest and call for Popper. GCMs with a ~1-degree resolution and cloud parameterisation were surely not designed to investigate storms in detail (just as they were not designed to study molecules or lightning), although it’s reassuring that, without any special tuning, they do produce storm-like features by themselves (the same cannot be said about the molecules or the lightning), even if not as perfect as we would like. But since storms play a role in the larger picture, it is important that they are represented in a statistically realistic fashion, e.g. through a mix of sub-grid parameterisation and grid-resolved numerics (as the molecules are). There have been numerous studies with numerical atmospheric models, many of which bear close resemblance to operational numerical weather forecasting. Their continual use and important contribution to everyday life suggest that these models in general give useful prognoses. -rasmus]
Here are a few of the links that I have found useful. For me the beginning of my search began in relation to a story from NASA over 8 years ago, where the EOS/MODIS packages seemed to indicate a lower upper-altitude temperature and less CO2 than theory at the time supported. It was then theorized that the CO2 was preventing the warmth from the surface from reaching the upper troposphere. Last spring it was announced that the cooling there was false and the issue was related to the variation in the satellite altitude versus atmospheric altitude. Now it appears that the CDIAC suggests that the cooling in the 250mb range appears to be valid.
Given this flux in the data and the lack of understanding we seem to have in regards to data, there is good reason to check your sources. This is a good reason that most of the following references are related to government or educational institutes, as they usually are the source of the data most sites use in their data analysis. Generally, the data I am referencing here has been filtered for layman consumption; if you have any issues there are plenty of folks either here or on the various sites that are more than willing to assist you. Good luck in your research.
Re #54 and unsubstantiated claims about “El Nino breaking up in the ITCZ”, here’s the actual forecast from the NOAA Climate Prediction Center on the current state of affairs in the Pacific:
“Synopsis: El Niño conditions are likely to continue through May 2007. Equatorial Pacific SST anomalies greater than +1°C were observed in most of the equatorial Pacific between 170°E and the South American coast (Fig. 1). The latest SST departures in the Niño regions are between 1.1°C and 1.3°C, except for Niño 1+2 (Fig. 2). The increase in SST anomalies during the last several months has been accompanied by weaker-than-average low-level equatorial easterly winds across most of the equatorial Pacific and negative values of the Southern Oscillation Index (SOI). Collectively, these oceanic and atmospheric anomalies are consistent with the early stages of El Niño in the tropical Pacific.” http://www.cpc.noaa.gov/products/analysis_monitoring/enso_advisory/
The World Meteorological Organization agrees with this prediction:
“The WMO said its latest readings showed that a “moderate” El Niño, with sea temperatures 1.5C above average, was taking place which, in the worst case scenario, could develop into an extreme weather pattern lasting up to 18 months, as in 1997-98. The UN agency noted that the weather pattern was already having “early and intense” effects, including drought in Australia and dramatically warm seas in the Indian Ocean, which could affect the monsoons. It warned the El Niño could also bring extreme rainfall to parts of east Africa which were last year hit by a cycle of drought and floods.” http://www.cpc.noaa.gov/products/analysis_monitoring/enso_advisory/
site that demonstrates that the heat content in the upper 40 meters is dissipating along with a gradient rise in the 20 deg. C isotherm level.
Again the best forecast is a 50-50 proposition in most cases. In the case in which you have trimodal indicators it looks to be a 33-33-33 proposition at best. No one said this is a science yet; much of the good work here is still an art.
I suppose this fits here, since the subject of snowstorms was the basis for the TV segment.
I ordinarily don’t watch the Fox or as some call it Faux News Channel, but since my TV is old enough to be in graduate school, I have recently been forced to channel surf as the sound abruptly cuts out on MSNBC and CNN among others.
So, today, while watching the wrap up of the funeral for former President Ford, I wound up on Fox. Neil Cavuto had as two of his guests a man representing some kind of business association and that scion of science frequently referenced here, Dr. Patrick Michaels, today listed as affiliated as a fellow of the Cato Institute (these are the people who believe that the government that doesn’t govern at all governs best–I think that’s called anarchy).
Anyway, Cavuto put forth the hypothetical that the liberal media being mostly in the northeast, have fixated on the unusually warm weather there the last few weeks as evidence of global warming for sure, while ignoring the two massive blizzards that have buried the Denver area.
Michaels and business boy both agreed that the media, especially the NY Times (recently “broadly” criticized here, I recall) and CBS, seize upon weather events that support AGW, but ignore ones that don’t, e.g., the blizzards.
A graphic was posted showing some headlines from the Old Gray Lady over the last 80 years that said we were experiencing record warmth or cold and that a new ice age was coming or global warming.
In other words, extrapolating recent weather events to support some kind of disaster scenario that requires the government to act, obviously in violation of Cato Rule No. 1.
Michaels said that there actually hasn’t been warming in the NE U.S. as would be predicted by the AGW proponents and that in general, the warming of the first part of the past century is due to increased solar activity.
But here’s the good part people. It isn’t just the fearmongers at the Times the fair and balanced folks want you to be wary of. The real problem is that AGW propagandist Al Gore.
According to Michaels, Gore takes the fact that most scientists agree that there is some AGW and extrapolates that to mean that most also subscribe to a gloom and doom scenario when there are actually only a few. Michaels also said that the gloomster researchers also only report the results that support their end of the world theories.
Neil seemed relieved to know that Dr. Pat was out there, along with a handful of others, ready to set the record straight, no matter how many Weekly World News-like headlines the Times and CBS throw up.
Thank goodness all those government workers and researchers and the school children were home today so they could get straightened out on this very important issue. I suppose the next step for Dr. Pat and Cato is a public burning of DVDs (will DVD’s burn?) of An Inconvenient Truth.
I have a question that someone here will probably know the answer to about the co2-warming relationship being logarithmic. I was thinking today about the rule of thumb that doubling of co2 from any level causes a constant warming (~3C I think it is). I know there is an extreme upper level where weaker saturation bands come into play and this rule of thumb breaks down. But I realised there must be a lower level where it breaks down too. Otherwise there are an infinite number of doublings preceding preindustrial level:
So I was wondering if the relationship at the lower end isn’t really logarithmic at all. Is there a proper diagram of the relationship somewhere rather than the logarithmic curve that I have in my head?
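For the range where the logarithmic rule of thumb does hold, a quick numerical sketch using the widely quoted Myhre et al. (1998) fit, dF = 5.35 ln(C/C0) W/m^2, shows the equal-forcing-per-doubling property; note this fit was derived for roughly the modern concentration range, and at very low CO2 the true dependence is closer to linear, so the "infinite doublings" paradox never arises:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Radiative forcing (W/m^2) from the logarithmic fit
    dF = 5.35 * ln(C/C0) (Myhre et al. 1998). Only valid over roughly
    the modern concentration range; at very low CO2 the dependence
    becomes closer to linear in the CO2 amount."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Under this fit, every doubling contributes the same forcing:
print(co2_forcing(560) - co2_forcing(280))    # ~3.71 W/m^2
print(co2_forcing(1120) - co2_forcing(560))   # ~3.71 W/m^2
```

The forcing then has to be multiplied by a climate sensitivity parameter to get the ~3C per doubling quoted above; that conversion is where most of the uncertainty lives.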
When talking about the “polar amplification”, it is important to keep the vertical asymmetry in the meridional difference in the warming trend in mind. For the surface temperature, polar warming is usually larger than tropical warming. But this does not imply a decrease in poleward heat transport because of the reduction in the tropics-pole temperature gradient. In the troposphere, the atmospheric warming in the tropics is usually larger than the warming in the polar area in most of the climate simulations for doubling CO2, so there is in fact an increase in the tropics-pole atmospheric temperature gradient which could enhance the poleward heat transport. Even if the poleward sensible heat transport is reduced, the poleward latent heat transport must be increasing, so changes in mid-latitude storms are inevitable. Cai (2006) (Climate Dynamics, Vol 26, 661-675) proposes such a dynamic feedback as an important factor responsible for the spatial heterogeneity of atmosphere and surface warming.
My first guess is that starting with no greenhouse gases, the greenhouse effect would increase linearly with the first additions, and then start to become more logarithmic. (Of course, that’s without some of the very nonlinear feedbacks one would encounter, such as the transition into/out of a Snowball Earth state.)
But I would also enjoy knowing this in more detail.
obviously when I wrote in #50 that decreasing meridional temp gradient would lead to decreased east-west temp variation (in SSTs, I meant) unless ocean surface gyres strengthened, I was not accounting for upwelling, which in some cases could enhance temperature variations in an otherwise warmer world (such as around Antarctica, where there could be a positive feedback between the winds driving the upwelling and the temperature gradient driving the winds). Also, when I mentioned decreased cyclogenesis around the east coasts of Asia and North America, I was thinking of winter.
There is also a likely Hadley cell reconfiguration. Theoretically, there should be a semi-permanent high pressure close to the North Pole, completely in sync with the Arctic Ocean gyre, expelling ice to the North Atlantic, sort of keeping a balance of permanent ice. However, I have noticed very often low pressure systems at the North Pole; this in itself does the opposite, keeping old ice from escaping the Arctic Ocean, a counterbalance for the present warmer days. But having more persistent low pressure systems near the Pole causes a complete changeover from the normal Hadley setting, ultimately having an effect on the Ferrel mid-latitude cell, a change in location or a readjustment a bit more complex than the standard 3-cell model from equator to Pole, causing a significant change of weather at the mid latitudes as well.
In response to the comment: ‘Since models are never (at least very rarely) perfect as they mimic only the essential features of whatever they represent, one may always find a point beyond which they are no longer valid – i.e. beyond their limitations.’
Is there a Crisis with the Climate Models? Any facts to support an alleged crisis?
What is a crisis in a field of science?
There are problems with the foundation of macro climatology. The current status of the field is analogous to geology during the period when the tectonic plate hypothesis was being developed. Fundamental concepts that are part of the macro climatology canon appear to be false, based on recent data and analysis. Rather than addressing these newly identified problems as an opportunity for a breakthrough, the perfectly natural and common reaction is to attempt to modify and tune the models to explain the data, with no changes to the canon, or to just ignore the conflicting data.
[Response:Sorry, but this analogy is not clear, and what you write is your (hand waving) view, but constitutes no convincing evidence. -rasmus]
Facts and work by others that support the alleged statement, ‘Macro climatology is in a crisis’:
1) Richard Muller and Gordon MacDonald’s finding that glacial cycles follow a 100-kyr cycle which matches a 100-kyr cycle in the earth’s orbital inclination, not eccentricity. See attached link to their 1997 paper for details. (P.S. The cause of warming has nothing to do with interstellar dust. No surprise in the abandoning of eccentricity; as everyone knows, the insolation changes associated with eccentricity could not possibly have caused the end of the glacial cycle.)
Link to Muller et al’s paper. http://muller.lbl.gov/papers/NAS.pdf
[Response:I’m not following you – what is your definition of ‘macro climatology‘, and how do you take Muller et al’s paper to be evidence for the falsification of the climate models? -rasmus]
2) Based on finding 1) next question is how could orbital inclination possibly affect the earth’s climate. (Something is required to cause the massive warming that ends the glacial cycle. Important, as without the warming the earth stays in the glacial state.)
4) Next two questions. A) How could changes in the geomagnetic field possibly affect the macroclimate? B) What could possibly be causing the changes to the geomagnetic field? (P.S. It appears, based on work by others, that the earth’s inclination can change 6 degrees per day. That finding is a real surprise! Need a side trip to check the geomagnetic models/canon and the data that supports an alleged rapid change. The speed and cause of the geomagnetic field change might be important if we were concerned about rapid climate changes.)
The answer to question 4 A) is modulation of GCR; see this link to Kirby, Mangini, and Richard Muller’s 2004 paper for an outline of the basic hypothesis and supporting data.
Link to Kirby et al’s paper. http://arxiv.org/pdf/physics/0407005
5) The devil is of course in A) the darn details of the GCR effects, and of course we need B) proof that the GCR phenomenon is real.
Starting with 5 B) (see attached link to Palle et al.’s 2004 paper): see figure 2 and figure 3. As noted in the paper, there is very good correlation (significant at the 99.5% level) between low level cloud cover and GCR levels from 1985 to 1995. Starting in 1993, the low level cloud cover starts to decrease at minus 0.065% per year. We should note that the GCR/cloud level data (this observation) is from a time when solar activity is at its highest level in 8000 years and the solar large-scale magnetic field has doubled. That fact explains the minus 0.065% drop in low level clouds.
[Response:I don’t find these figures very convincing, as the curves are short, with only two cycles, and a host of other factors may affect the cloudiness. e.g. ENSO. I think the figures are interesting, but the jury is still out on that one. -rasmus]
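The caution above about short records with only a couple of cycles can be illustrated numerically: two completely unrelated but smooth (low-frequency) series frequently show high correlation by chance, so naive significance levels overstate the case. A toy Monte Carlo sketch (all numbers arbitrary, chosen only to mimic ~11 years of heavily smoothed monthly data):

```python
import numpy as np

rng = np.random.default_rng(0)

def smooth_series(n, window):
    """White noise smoothed with a moving average, mimicking a short
    record dominated by a couple of slow cycles."""
    x = rng.standard_normal(n + window)
    return np.convolve(x, np.ones(window) / window, mode="valid")[:n]

# Correlate many pairs of unrelated smooth series: 120 monthly points
# with 3-year smoothing leaves only a handful of independent samples.
cors = [abs(np.corrcoef(smooth_series(120, 36), smooth_series(120, 36))[0, 1])
        for _ in range(2000)]
frac = float(np.mean(np.array(cors) > 0.5))
print(frac)  # a substantial fraction exceed |r| = 0.5 purely by chance
```

The point is only that significance tests must use the effective number of degrees of freedom, not the raw number of data points, when the series are this strongly autocorrelated.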
Moving to 5 A): see the link to Brian Tinsley and Fangqun Yu’s attached paper. Palle et al’s paper notes that the data support the electroscavenging hypothesis; see Tinsley and Yu’s paper for details. The electroscavenging process is dependent on the magnitude of the current flowing from ground to ionosphere. It is assumed that the recently (1993) observed ‘sprites’ (massive charge discharges from the ionosphere to cloud tops) are being caused by solar activity. (See attached link which discusses the increase and changes in solar activity.) It is assumed the changes in solar activity (coronal holes) are causing the increase in the electroscavenging process.
Regarding the El Nino question: given the timescales of the El Nino phenomenon (which is still poorly understood, as are all the other multidecadal oscillations, of which El Nino is by far the most studied), we are certainly due for another one. One effect of El Nino is a greater than usual transfer of heat from the tropics to the mid-latitude regions, so El Nino may be a tropical mechanism of ‘letting off steam’ brought on by a slow buildup of thermal energy in the equatorial surface ocean, which periodically lets off a pulse of energy – just a guess.
The datasets that L. David Cooke (#62) links to actually show a warming ocean consistent with the onset of an El Nino pattern. Note also that the ‘baseline period’ for the ‘anomaly’ is 1971-2000 – which seems like a pretty bizarre baseline considering that global warming has been going on for decades now. I wonder what the anomaly would look like if the period 1965-1975 (or 1975-1985) was used as the baseline? – I imagine it would be quite a bit higher. It seems to me like someone is fudging the numbers to make the anomaly come out lower than it should be.
The real question of interest is this: what effect will global warming have on the El Nino / Southern Oscillation? Will we see more frequent El Ninos, less frequent El Ninos, or no change at all? One thing is clear: we should expect to see equatorial temperatures continue to rise (as evidenced by the melting of the Andean glaciers, Kilimanjaro snowfields, and Mexican volcanoes), which means that equatorial ocean temperatures, i.e. ocean heat content, will also continue to rise. About half of the expected sea level increase is expected to be due to the thermal expansion of water as it warms; the other half is from melting land-based glaciers.
I prefer to discuss observations rather than models for the simple reason that it’s a lot harder to question the actual data (though the denialists will certainly try). The models provide a theoretical basis for understanding how the oceans and atmosphere behave, and are critically important because they produce testable predictions (such as the prediction that polar regions and high-altitude regions would be the first to show the effects of global warming), but without good data (for example, Lonnie Thompson et al’s high-altitude glacier core data) no tests are possible.
One other point (RE #67): the 3-cell Hadley model is what you get when you imagine that the Earth has no continental landmasses – it’s a conceptual model only. The real Hadley circulation is far more complex and is heavily influenced by the landmasses, though you still see the desert belts due to the dry descending air. What global warming has done (another model prediction that has come out correct) is put more moisture in the atmosphere, meaning that the heat capacity of the atmosphere has increased – meaning that more heat will be transported to the Greenland, Arctic and Antarctic regions – meaning more and more rapid melting of the ice sheets as we transition into a climate regime the Earth hasn’t seen in some 3.3 million years… at least. How will the increased moisture in the atmosphere affect the Hadley circulation? Will we see monsoons and unprecedented flooding in mid-latitude regions?
Even if in the long run the Arctic is fairly warm, it seems quite likely that Arctic winters will continue to produce cold dry air masses in the near-term period that will generate intense storms as they encounter warm wet air from the tropics – but who knows where these storms will occur? They’ll probably be much less damaging than the hurricanes, at least.
Re “So I was wondering if the relationship at the lower end isn’t really logarithmic at all. Is there a proper diagram of the relationship somewhere rather than the logarithmic curve that I have in my head?”
You have it right. The logarithmic relation only holds for a few orders of magnitude. It doesn’t raise the world’s temperature 3 K to go from one CO2 molecule in the atmosphere to two.
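For anyone who wants to put numbers on this, here is a minimal sketch (my own illustration, not from the thread) of the commonly used simplified fit for CO2 radiative forcing, dF ≈ 5.35 ln(C/C0) W/m2 – keeping in mind, per the point above, that the logarithmic form is an empirical approximation valid only over a limited range of concentrations:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified logarithmic CO2 radiative forcing (W/m^2) relative to a
    pre-industrial baseline of 280 ppm. Empirical fit; it is not valid at
    very low or very high concentrations."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Under this fit, every doubling adds the same ~3.7 W/m^2:
print(round(co2_forcing(560.0) - co2_forcing(280.0), 2))    # -> 3.71
print(round(co2_forcing(1120.0) - co2_forcing(560.0), 2))   # -> 3.71
```

This constant per-doubling increment is exactly why the fit cannot be extrapolated down to one or two molecules: the cumulative change would grow without bound.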
Re 68: In Southern Finland we had two weeks of snow at the beginning of November. E.g. in Helsinki, December was 6.2°C warmer than the 1971-2000 average. The Finnish Meteorological Institute has attributed the record warmth to warm wet air coming from the Atlantic.
Related to the topic of this thread, it looks like the energy stored in the ocean is now being released through these lows and storms to the northern parts of Europe and eastern America. However, the temperatures are now so high that somewhere in the Northern Hemisphere extremely cold conditions must prevail. Does anyone know where?
As someone who has spent most of his professional life developing “wrong” models, I have to comment on this.
You are right, but there’s more to the story. Crude models are usually useful for qualitative understanding of the physics involved rather than for quantitative prediction purposes. The better (more detailed) the model, the better the chance (usually, but see below) of a more accurate prediction. Conversely, “better” models are more difficult to use if you want to explain something, simply because there are too many interacting mechanisms involved. In climate modeling there is even an expression, “kitchen sink model”. This means: throw everything in and see what happens.
There is a famous – and very relevant – exception to the rule. The Cane-Zebiak El Nino model is bare-bones simple, yet it is used for prediction. What is curious is that “better” models (GCMs, the very same kind that are used for long-term climate forecasts) do not show any improvement (in fact, I believe it’s the other way around) in forecast accuracy over the C-Z model, at least that was the case until recently. What do you make of this? Indeed, some models are useful. Moreover, some models are a lot more useful than others.
Alistair points out (and I agree) that the climate models are “too wrong” to be used for meaningful prediction. You can argue against it intelligently, as Rasmus did in comment to #56, but it is disingenuous to dismiss it with a one-sentence brush-off.
Wally Broecker (William – thanks for the link) makes a much broader point about model inadequacy, similar in spirit to what I was trying to say in a couple of previous threads, but a lot better substantiated, of course. For some reason, the modeling community is not arguing with him. Guess why.
See, you don’t need to be a denialist to mistrust the models.
[Response: Well this member of the modelling community often argues with Wally on this point – but not by claiming that models are perfect, but by demonstrating that they can be useful. I’ve mentioned this paper before I know, but LeGrande et al 2006 is a great example of a model doing a good job at putting disparate pieces of evidence into a consistent context, despite being imperfect. -gavin]
I have no difficulty downloading the paper using the Mozilla browser, so I have copied the abstract for anyone having problems here:
K. Georgieva, C. Bianchi and B. Kirov “Once again about global warming and solar activity” Mem. S.A.It. Vol. 76, 969
Abstract. Solar activity, together with human activity, is considered a possible factor for the global warming observed in the last century. However, in the last decades solar activity has remained more or less constant while surface air temperature has continued to increase, which is interpreted as an evidence that in this period human activity is the main factor for global warming. We show that the index commonly used for quantifying long-term changes in solar activity, the sunspot number, accounts for only one part of solar activity and using this index leads to the underestimation of the role of solar activity in the global warming in the recent decades. A more suitable index is the geomagnetic activity which reflects all solar activity, and it is highly correlated to global temperature variations in the whole period for which we have data.
“It appears based on work by others, that the earth’s inclination can change 6 degrees per day” — Astley
“I don’t find these figures very convincing” — Rasmus
Me neither. I can’t find any source for that notion — is that claim taken from Velikovsky? If it claims a change in the inclination of the magnetic axis, that axis moves nowhere near that fast. If it claims a change in the inclination of the rotational axis of the planet, a six degree change would have been noticed — putting aircraft and satellites and bird migrations off course, changing sunrise and sunset timing, and causing the oceans to slop around dramatically.
Who is the source for the notion the planet’s inclination can change six degrees a day? Why do you find it believable?
I am fascinated with the following notion: bi- or multi-modal “bands” (as in energy bands in quantum physics) within which the climate system can reside. It cannot be ruled out that modes may encompass degrees of meridionality of jet stream tracks. One might envisage modes ranging across a spectrum from the zonal extreme to a highly meridional extreme. Consider the interaction of a scheme such as this with the energy / heat content of the oceans or of the ocean-atmosphere system. This paradigm may also be of use when examining oscillation modes of SST patterns and of the overarching ocean – atmosphere system (e.g. ENSO, PDO, AMO, other yet to be understood ones). Just some food for thought when assessing the degree to which the Earth’s thermal state may or may not influence mid-latitude storm strength and frequency of occurrence.
Re #64 and others: I happened to run across a site – http://edgcm.columbia.edu – that has a version of a climate model that you can run on your PC, at least if you’re unfortunate enough to be stuck in the Windoze world. So download the model, plug in your scenarios, and see what happens :-)
And re #68: “It appears based on work by others, that the earth’s inclination can change 6 degrees per day. That finding is a real surprise!”
I’ll say! Would one of those others happen to be Velikovsky?
Are there any models yet that take released-methane feedback into account when making global warming predictions?
The once-frozen peat bogs of Siberia – bigger than France and Germany combined – began to “boil” furiously in the summer of 2006 as methane bubbled to the surface. Exactly how much is being released into the atmosphere is unknown, although some estimates put it as high as 100,000 tons a day – which means a warming effect greater than America’s man-made emissions of carbon dioxide.
Methane clathrate, also called methane hydrate, is a form of water ice that contains a large amount of methane within its crystal structure. Extremely large deposits of methane clathrate have been found under sediments on the ocean floors of the Earth. The sudden release of large amounts of natural gas from methane clathrate deposits in a runaway greenhouse effect could be a cause of past and future climate changes. The release of this trapped methane is a potential major outcome of a rise in temperature; it is thought that this might increase the global temperature by an additional 5°C by itself, as methane is much more powerful as a greenhouse gas than carbon dioxide (despite its atmospheric lifetime of around 12 years, it has a global warming potential of 62 over 20 years and 23 over 100 years). The theory also predicts this will greatly affect the available oxygen content of the atmosphere.
Large uncertainties in the budget of atmospheric methane, an important greenhouse gas, limit the accuracy of climate change projections [1,2]. Thaw lakes in North Siberia are known to emit methane [3], but the magnitude of these emissions remains uncertain because most methane is released through ebullition (bubbling), which is spatially and temporally variable. Here we report a new method of measuring ebullition and use it to quantify methane emissions from two thaw lakes in North Siberia. We show that ebullition accounts for 95 per cent of methane emissions from these lakes, and that methane flux from thaw lakes in our study region may be five times higher than previously estimated [3]. Extrapolation of these fluxes indicates that thaw lakes in North Siberia emit 3.8 teragrams of methane per year, which increases present estimates of methane emissions from northern wetlands (<6-40 teragrams per year; refs 1, 2, 4-6) by between 10 and 63 per cent. We find that thawing permafrost along lake margins accounts for most of the methane released from the lakes, and estimate that an expansion of thaw lakes between 1974 and 2000, which was concurrent with regional warming, increased methane emissions in our study region by 58 per cent. Furthermore, the Pleistocene age (35,260-42,900 years) of methane emitted from hotspots along thawing lake margins indicates that this positive feedback to climate warming has led to the release of old carbon stocks previously stored in permafrost.
For most parts of the ocean, melting of hydrates is a slow process. It takes decades to centuries to warm up the water 1000 meters down in the ocean, and centuries more to diffuse that heat down into the sediment where the base of the stability zone is. The Arctic Ocean may be a special case, because of the shallower stability zone due to the colder water column, and because warming is expected to be more intense in high latitudes.
In the ocean, hydrates exist in a “zone of stability” under the seafloor in locations where water depths exceed 500m.
But the results of an expedition carried out by the IODP off Vancouver Island are putting a significant new perspective on this profile.
The international marine research organisation used the drilling facility and laboratories of the US research vessel Joides Resolution to retrieve core samples from a geological area known as the (northern) Cascadia Margin.
The pressurised cores pulled back on to the ship had copious hydrate deposits – and at a level in the stability zone that was much higher than expected.
“Gas hydrates have been studied at Cascadia for 20 years, and there has been an established model for how hydrates form on such a margin,” said IODP expedition co-chief Dr Michael Riedel of McGill University, Montreal.
“But we found from our expedition that this model is way too simple and has to be modified. We found anomalous occurrences of high concentrations of gas hydrate at relatively shallow depths, 60-100m below the seafloor.”
A brief note relevant to mid-latitude storms: Pat Michaels was interviewed on FOX news and he claimed that the Denver snowstorms mean that global warming isn’t really happening. Now, just as a thought exercise, will a moister atmosphere result in more or less snow in the interior United States in winter? Isn’t this region known for its cold dry winters (as skiers and snowboarders will tell you, Colorado is famous for dry powder, unlike California where the snow tends to be fairly wet)?
One would have to look into the historical climate record for Denver, but as far as off-the-cuff explanations go, it seems fairly obvious that more water in the atmosphere could easily result in greater snowfall in continental interiors. Snowfall in Denver doesn’t mean global warming isn’t happening – but a lot more people are watching FOX news than are reading realclimate, unfortunately.
Some have pointed to this and have claimed that we should not expect Greenland glaciers to melt since snowfall may increase in their interiors – but it seems quite likely that the snow won’t stick around all year; it will instead melt due to the warming Arctic summers – perhaps lubricating the glaciers. Again, the data collection is critical – gravity mapping Greenland on a continual basis would help answer these questions.
Noticeably, the host of FOX news didn’t ask Michaels about melting tropical glaciers or El Nino.
[Response: I think you should expect more precipitation in general from higher T; but in some places what was snow will be rain. This also applies to Greenland: the interior should get more snow; the edges will melt more – William]
I doubt Wally would ever dismiss the models on wholesale basis. Of course models can be useful. In the linked paper he just points out a few things that are currently well out of reach of any existing model. I don’t think there is any dispute about it, is there?
This may be a basis for the notion of rapid change in the geomagnetic field’s inclination. The momentary local measurements do vary. Here’s the inclination changing by about three quarters of one degree, and the declination changing by minus six degrees, briefly, today, in one spot. That’s not the global axis changing, it’s a local event.
Sashka, you wrote ‘Alistair points out (and I agree) that the climate models are “too wrong” to be used for meaningful prediction. You can argue against it intelligently, as Rasmus did in comment to #56, but it is disingenuous to dismiss it with a one-sentence brush-off.’
Happy New Year and thanks for your support :-)
My point is not really that the models are too wrong for meaningful prediction, but rather that they are failing to make meaningful predictions about mid-latitude storms, because they are too wrong. In other words, their failure to consistently predict the change in storminess is evidence that the models are wrong. There is also their failure to account for the rapid warmings at the start of the B-A inter-stadial, at the end of the Younger Dryas, D-O events, the surface temperature of Venus, the warm winter anomaly at the Martian poles, the length of the PETM, the runaway warming at the end of Snowball Earth, and the Faint Young Sun paradox.
Rasmus argued that the models had got some things right. That does not mean the models are correct. As Karl Popper explained, you cannot prove that models are correct; you can only show that they have given the right answer so far. If they give the wrong answer, then they are faulty. They are giving the wrong answers to the problems I listed.
Gavin also gives an example of where the models get the right answer as proof that they are correct, but if you read “Modelling an Abrupt Climate Change by Allegra LeGrande and Gavin Schmidt at http://www.giss.nasa.gov/research/briefs/legrande_01/ then they give the game away. They admit “By scaling the model’s response a little, we estimate that a reduction of about 50% in the MOC is the most consistent with the data.” Well done Allegra and Gavin! They have a model which can be made to fit with the data, but if the model is correct why does it not also reproduce the Younger Dryas? Popperism says no matter how well it fits the 8.2 ka event, if it does not fit the YD then it is faulty.
If the models are under-estimating the effects of increased carbon dioxide, and so the speed of melting the Arctic sea ice, then these models could be leading us to disaster rather than being what Coby calls useful.
In “Atmospheric Radiation”, Goody & Yung, writing about the source function for the Schwarzschild equation, say ‘Since it is reasonable to suppose that emission is a property of matter alone, the source function … should continue to be the Planck function. This argument is fallacious, however, because, as first pointed out by Einstein, emission is also influenced by the incident radiation field (induced emission.)’ However, that argument itself is fallacious because Einstein was writing before the discovery of quantum mechanics. We now know that emission is not only influenced by the radiation field, but also by the mechanical effect of collisions from other molecules.
In short, this means that OLR (outgoing longwave radiation) depends on atmospheric pressure which only changes if the mass of the atmosphere changes, a very uncommon event. So there is no negative feedback on temperature due to an increase in OLR. It is the clouds that provide the negative feedback, but since they behave in a non linear fashion, we get abrupt climate changes.
Will my big idea be stolen? McLuhan wrote “Only puny secrets need protection. Big discoveries are protected by public incredulity.” I guess my idea is safe :-(
To an extent you are correct; there are indications of the pool migrating from west to east. However, the synoptic data in the last 30 days seems to indicate differently than the forecasts, IMO; the pool of warmth to the west is diminishing, or appears to be flowing towards about 140 Deg. W and 35 Deg. S. If you look at the change in the last 30 days, the curves for the majority of the equatorial surface of the Pacific heat content appear to be recovering.
Along with this effect, where there are clear increases of SSTs off the coast of Africa near Madagascar and the rise in the open South Pacific mentioned above, interspersed between these events and continental areas there are clearly cooler SSTs developing.
Regardless of the time period of the baseline, to my understanding this happens to be the basis that NOAA applies to most of their forecasts in regards to El Nino/La Nina patterns. I will be interested in seeing what the data looks like in 30 more days. I certainly hope you and the current forecasts are wrong; however, I’ll try to keep an open mind.
Though your modal idea was further developed than my original intent in my posting, I see now in retrospect that must be the characteristic I assign to the data I have been recording from the http://nomads.ncdc.noaa.gov:9091/ncep/charts I have been reviewing. There appear to be multiple modes in which certain pressure patterns are characteristically dominant in the ITCZ regardless of the season. However, I do not suspect that the formation of the event in the ITCZ is the driving element; it is more likely a signal of an event in the polar latitudes. (One fellow on UKweatherworld suggested a magnetic anomaly, and for the recent Australian phenomena it seemed plausible.)
I can watch the 250mb region and see a zonal pattern develop, then fade into a separate pattern, and then finally back again over large scales of time. The interesting thing is also watching the NH jet stream deviations and the relative humidity (categorized by pressure zone and dominant pressure pattern) differences, depending on which pattern seems to be strongest.
During the recent events in southern Australia, you could see one to three anti-cyclonic events and could follow them down to the 850 to 1000 mb range near the Melbourne – Sydney area. When I look at the current anomaly forming in the central South Pacific and follow the pressure zones to the surface, it appears similar. I have not documented enough observations yet. However, I am interested in tying these in with the atmospheric moisture content images from the CloudSat/Calipso experimental packages in the same area. (Currently, if you look at the region of 140 Deg. W and 38 Deg. S and bordering either side, it reminds me of the movie in which there is a string of strong events, like swirling storms on Jupiter, that set up seasonally and slowly slide across the zone until the season changes and they finally break down.) Also interesting to watch: these patterns and the variation of the jet stream seem to switch poles about every 2 years. Last year it was the NH, and now the NH jet is relatively elliptical while the SH is varying wildly. (Note: This is the first time in three years I have seen the effect appear seasonal.)
I think you may have the right of it though, repeatable patterns that are difficult to anticipate or have significant distribution of peak values might very well suggest many modes with the resultant appearing chaotic or “noisy”. I wonder if it might be possible to statistically extract the signals from the data streams.
[Response: There is a whole existing literature on this sort of thing. You might start with this review by Ghil et al (2002). -mike]
Kind of like when I was working with Bell Labs and we were discussing extraction via Fourier Transforms of signals from background microwave noise.
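As a toy illustration of that kind of extraction (my own example, with invented numbers, not anything from Bell Labs): a sinusoid buried well below the noise floor still produces an unmistakable peak at its frequency in the Fourier power spectrum.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1024
t = np.arange(n)

# A weak periodic signal: 5 cycles over the record, amplitude 1...
signal = np.sin(2 * np.pi * 5 * t / n)
# ...buried in noise with twice the amplitude (SNR well below 1).
noisy = signal + 2.0 * rng.standard_normal(n)

# The FFT concentrates the coherent signal into a single frequency bin,
# while the noise power stays spread across all ~512 bins.
power = np.abs(np.fft.rfft(noisy)) ** 2
peak_bin = int(np.argmax(power[1:]) + 1)   # skip the DC bin
print(peak_bin)   # -> 5 (cycles per record)
```

The same idea, with far more sophistication, underlies the spectral methods for climate signal detection surveyed in the Ghil et al. review Mike linked above.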
In reply to comment 78: “Who is the source for the notion the planet’s inclination can change six degrees a day? Why do you find it believable?”
The statement that the earth’s magnetic field can change six degrees in a day is believable because it is confirmed by data. (Two different locations. Note the field change at the Oregon location occurred during the last glacial period.)
Link to Acton’s Near Instantaneous Geomagnetic Reversal Paper
As to those concerned that this rapid change in the geomagnetic field is not possible if the source of the geomagnetic field is the earth’s core (it being physically impossible for a core change to have created the observed rapid field change): do not worry.
There are multiple, very basic, fundamental physics problems as to how the earth’s core could even possibly have generated the observed geomagnetic field (devil is again in the details, I can say more if you are interested). After 20 years of study, geophysicists agree that these basic fundamental problems have not been answered. Surprisingly, rather than continue to beat the same dead horse, someone has looked out of the box and has found another source for the geomagnetic field (The correct solution seems so obvious. It is hard to believe that 20 years has been spent working on geomagnetic computer models that have no basis in physical reality.)
1) Interestingly, “Sprites”, the recent change in solar behavior, and rapid geomagnetic field changes appear to be linked.
2) Link to another paper that supports the finding of 100 kyr and 41 kyr periodicity in the geomagnetic field strength.
[Response: This is purely informational to other readers, and not a desire to discuss this. However, I would simply point out that even in the abstract of the above paper they state: ‘These results suggest that the orbital frequencies embedded in the paleointensity record are the expression of lithologic variations, and probably not a characteristic of the geodynamo itself.’ – To help you translate that, it means that climate is affecting the paleo-intensity (through changes in the composition of the sediment), not that the geomagnetic field is affecting climate. -gavin]
re: 84. “In other words, their failure to consistently predict the change in storminess is evidence that the models are wrong.”
Goodness, this is complete and utter nonsense. Perhaps you do not have an understanding of the purpose and use of various models. Models in all scientific fields are not perfection, and no one claims they are. Prediction models provide strong guidance as to the likelihood of occurring events. For example, in one simple sense, Gaussian models provide information about the likelihood that an event will occur based on the statistical distribution about the mean. If an event is not “at the mean”, the models are *not* “wrong”. However, over time, with regard to repeated events, the models will be quite accurate. Furthermore, saying the “models are wrong” over and over in various posts does not make the statement any more correct.
My views are less radical than yours. I’m not bothered by the fact that any climate model can be falsified. This is a trivial and uninteresting statement. What bothers me is the scope of falsification and implications to how we weight the models results. To me, the models are way too “wrong” … but I’m beating a dead horse already.
The bigger the number cruncher the closer we get to the “truth” (or the shorter the time to find “truthiness”).
This is true if you are solving a traveling salesman problem but not true in application to chaotic dynamics. You can use all the computers in the world in parallel and it will not take you an inch closer to predicting weather 3 weeks from now.
We may never find a solution to the maths, but the success of spacecraft vividly demonstrates we can do without it.
A wrong analogy. Even a multi-body problem is (usually) stable and easily computationally tractable.
re: 90. There is no “dead horse” or “falsification” once you understand how scientific models work: in design, purpose, and testing through repetition and through rigorous (read the IPCC reports!) scientific peer-review.
to model abrupt climate change, specifically the Younger Dryas.
The authors note:
“The results presented here provide a quantitative demonstration of how orbital forcing, which varies smoothly over time, can provoke an abrupt climate response. Two regimes of ENSO behavior are identified for different orbital configurations in which the total power, period, and regularity of the oscillation are distinct. When the ENSO oscillation is in transition between the two regimes, and is weak and moderately regular, the system can lock to the period of the forcing… This behavior recurs on an approximately 11-kyr timescale, when perihelion occurs either during boreal winter or […]”
They further note:
“We note that the modern ENSO (zero forcing) is close to the transition period during which abrupt ENSO shutdowns can occur, suggesting that currently ENSO may be fairly sensitive to external forcing.”
Thanks, that all makes a bit more sense now, but I’m still not getting it yet.
http://www.realclimate.org/index.php?p=142 gives a figure for the attribution of CO2 to the total greenhouse effect (33C) of somewhere between 12-25%, which is about 4-9C. Surely this constrains the maximum climate sensitivity quite severely if a logarithmic relationship of a 4.5C rise per doubling of CO2 holds for even a small part of the CO2 rise.
70-280ppm = 4.5C warming
140-280ppm = 4.5C warming (9C warming so far… whoops, CO2 now accounts for all of the greenhouse effect!)
I guess that I am missing something crucial (this is my bet), or the logarithmic relationship is only a generalization that holds for triple-digit concentrations, and before that doublings produce lower temp rises, as was suggested.
I will try out the climate model #78 linked to, although I’m doubtful I will figure out how to use it for this question.
[Response: If you compare the radiative forcing for a doubling of CO2 now, it is around 4 W/m2. If you calculate the forcing removing all CO2, then it’s around 20 W/m2, i.e. around 5 times as much. If you actually run a model with this kind of change, then temperatures drop enormously (due to the water vapour feedback) – though how much faith you would put in such a simulation is unclear. It is likely therefore that at small values the increase of forcing with CO2 is greater than logarithmic – and you could check that with a real line-by-line model. -gavin]
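A quick back-of-the-envelope check (my own, just illustrating Gavin’s point above): under a pure logarithmic fit dF = 5.35 ln(C/C0), the forcing change diverges as C goes to zero, whereas the line-by-line figure quoted for removing all CO2 is a finite ~20 W/m2 – so the simple log expression clearly cannot be extrapolated down to very low concentrations.

```python
import math

def log_fit_forcing(c_ppm, c0_ppm=280.0):
    """Forcing change (W/m^2) predicted by the simplified log fit alone."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# The log fit keeps subtracting ~3.7 W/m^2 per halving without limit,
# so "removing all CO2" has no finite answer under this formula:
for c in (140.0, 35.0, 1.0, 0.001):
    print(c, "ppm:", round(log_fit_forcing(c), 1), "W/m^2")
```

The fit already overshoots the quoted ~20 W/m2 well before the concentration reaches zero, which is consistent with the point that the true relationship departs from the logarithmic form at small values.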
RE#85: “Regardless of the time period of the baseline (1971-2000), to my understanding this happens to be the basis that NOAA applies to most of their forecasts in regards to El Nino/La Nina patterns” – so what were they using prior to the record 1997-1998 El Nino year?
It may be possible that El Nino conditions are weakening, but not vanishing, according to Australian forecasters. Notice the use of clear and easily understandable language – they are trying to explain the situation, not to confuse it:
“Summary: El Niño maturing
Mature El Niño conditions continue to dominate the equatorial Pacific Ocean. Ocean surface temperatures have been steady over the past fortnight at somewhat more than 1°C above average right across the central and eastern equatorial regions, and cloud patterns generally show a classic El Niño structure. Computer model guidance continues to suggest that Pacific Ocean temperatures, and hence the El Niño, may peak around January or February 2007. This timing would be consistent with the breakdown of past El Niño events.
However, there are a few signs that the event may have already started to weaken: the SOI has only been weakly negative for more than a month; the Trade Winds in the western and central Pacific have strengthened to near-normal values in December; and sub-surface temperatures show a weakening of east-Pacific warmth and a strengthening cool signal extending from the west.”
This means that the El Nino may not be as strong as the record 97-98 one – but still present.
Given all the political maneuvering at NOAA, as well as the attempts to silence senior scientists, as well as their peculiar choice of a ‘baseline’ (1971-2000) for calculating anomalies – I’d take their forecasts with a large grain of salt.
I’d like to see the climate denialists answer a simple question: do they think that record-breaking temperatures will continue to pop up all over the planet in the coming years, or not? More to the point, when do they expect to see this trend end, and on what grounds do they believe that? My prediction is that you won’t see an end to new record temps until atmospheric CO2 levels are stabilized – but that won’t happen until we replace fossil fuels with renewable energy systems.
Re #89 If one model says that the storms in the North Atlantic are going to increase and another says they are going to decrease, then I think it is quite fair of me to say that some models are wrong.
If some models say that the climate sensitivity is 1.5K and other models say that the climate sensitivity is 4.5K, and the rest say somewhere in between, only one value is correct. Only one out of those dozens of models is correct, and I am entitled to say the vast majority of models are wrong!
If they now clustered around 3.3K to 3.7K then I would accept them as correct, but they have remained with a variation of over 300% for 15 years. It seems to me obvious that there has been something wrong with the models for at least that time.
The climate models are not calculating a Gaussian distribution of climate sensitivity. They each calculate an exact value. Their results do not change as a result of random variation, as would happen in a Gaussian distribution, but due to the use of different sets of parameters, e.g. Gavin’s about 50%!?
I may be saying that the models are wrong as often as John Cleese said “This parrot IS dead!” in the Monty Python sketch, but that does not mean that he or I was incorrect! It only means that … some things have to be said loudly and often if they are going to be believed.
You have to realise that just because an article is peer reviewed it is not necessarily correct. There were plenty of articles opposing continental drift before plate tectonics was accepted, and plenty of papers criticising the idea of an impact causing the dinosaur extinction before the Chicxulub crater was found.
The problem with climate models is that the Schwarzschild equation was introduced by Robert Emden for terrestrial radiation (it was designed for solar radiation) before peer review was invented, but now we are stuck with it because it is “part of science.”
Planck’s function would not be used as the source function today now that we know about radiationless relaxation.
RE: #91 – Do *YOU* understand how the models work? By this, I mean, do you understand the algorithms, the mathematical basis for the model elements, and, most significantly, the propagation of error terms? Do you truly understand all of this to a degree that would allow you to state some assessment of falsifiability?
Thanks for the reference, I have only gotten through the first 8 or so pages and will clip a copy for further study. I am ashamed that the majority exceeds my abilities; however, the basic premise so far that I find interesting is the reliance on harmonics. The idea being if you lined up a number of bells, hit each of them at the same time with one clapper, and then analyzed the resultant for the original pure tones. The intent was to identify each time the pressure wave passed the null point and then try to associate its partner.
This works well for a symmetrical source like a bell or an object with consistent size and density. At issue is what happens if the source of the signal is asymmetrical – what if the sources are like differently shaped pieces of metal, with some having sound absorption foam on one side or at different ends? In short, harmonics would not be valid in the analysis. As to the discussion of lag and the Fourier analysis, that again would be appropriate if there was a known forcing and the energy was transferred to secondary sources at a given or known time, and either reflected or resonated by the secondary source.
I agree I need to read further and it may be revealed, so please forgive my ignorant ramblings. It is unlikely that the forcing energy source is “free floating” in a natural system, hence it should resonate. By the same token, the likelihood of the forcing energy to not invoke a symmetrical expansion and contraction as in a pressure wave is also unlikely in a natural system. I will continue my research, thanks for the start!
RE: #91 – Do *YOU* understand how the models work? By this, I mean, do you understand the algorithms, the mathematical basis for the model elements, and, most significantly, the propagation of error terms? Do you truly understand all of this to a degree that would allow you to state some assessment of falsifiability?
Oh for goodness sakes. Yes, it happens to be part of my job. And “*YOU”? Do you understand the statistical significance of propagating errors? And how they are accounted for? Do you truly understand the complete overstatement of “falsifiability” that is being made here?
re: 97. “You have to realise that just because an article is peer reviewed it is not neccessarily correct. There were plenty of articles opposing continetal drift before plate tectonics was accepted, and plenty of papers criticising the idea of an impact causing the dinosaur extinction before the Chicxulub crater was found.”
This statement indicates a misunderstanding of the peer-review process. The *process* is a very strong, difficult and accurate one. Science is never perfect, and no one claims it is. But cherry-picking counter-examples is the method that global warming denialists have used for years. Furthermore, the quantity of articles on a subject (e.g. continental drift vs. plate tectonics) is utterly and completely irrelevant. The process of scientific peer-review led us to where we are with regard to our knowledge of tectonic plates. The scientific process involves hypotheses, gathering data to test those hypotheses, experimentation, analyzing data and results, conclusions, peer-review including the ability to repeat the experiments and produce similar conclusions, and new hypotheses for testing. In the case of global climate change, the scientific consensus is quite strong.
re: 96. I used the simple Gaussian model distribution as a simple case to make a simple point about models, as I stated. I never said the models were calculating a Gaussian distribution. Please do not infer that I did.
If some models say the climate sensitivity is 1.5K and other models say it is 4.5K, and the rest say somewhere in between, that does *not* mean only one value is correct. It does indicate a range of sensitivity. You may be “entitled” to say anything you like, but you would be simply wrong to say “the vast majority of models are wrong”.
Shouting “THE MODELS ARE WRONG!” again and again does absolutely nothing to prove that. If anything, it detracts. Thankfully we have documented scientific processes and thorough review processes that show otherwise.
In Reply to comment 82: “This may be a basis for the notion of rapid change in the geomagnetic field’s inclination. The momentary local measurements do vary. Here’s the inclination changing by about three quarters of one degree, and the declination changing by minus six degrees, briefly, today, in one spot. That’s not the global axis changing, it’s a local event.”
There are two events recorded in lava flows. They were not local events and were not temporary magnetic field changes. A NOVA program was produced to discuss the Oregon rapid magnetic field change, which was a very large change in field inclination (the field was trying to reverse) rather than a permanent reversal. The Afar event was a magnetic field reversal. (See my comment 88 for links to the two papers.)
Please defer dismissing this data until I can present an explanation of what could possibly account for this extraordinarily rapid change in the geomagnetic field.
From the 2002 Nature Paper Abstract which discusses the Oregon partial reversal:
“Palaeomagnetic results from lava flows recording a geomagnetic polarity reversal at Steens Mountain, Oregon suggest the occurrence of brief episodes of astonishingly rapid field change of six degrees per day. The evidence is large, systematic variations in the direction of remanent magnetization as a function of the temperature of thermal demagnetization and of vertical position within a single flow, which are most simply explained by the hypothesis that the field was changing direction as the flow cooled.”
From the abstract of Acton’s paper (see my comment 88 for a link to Acton’s paper), which discusses the Afar event:
“One lava flow has recorded both of the antipodal transient components residing in magnetic materials with unblocking temperatures above and below 500°C (my comment: the Curie temperature) respectively … Hence the geomagnetic field appears to have jumped nearly instantaneously from a north-hemisphere transitional state to a southern hemisphere one, during the normal to reversal polarity transition.”
#69 Ike, fully agree on moisture, or rather water vapour pressure in air; the dynamics may be relatively simple – the more water vapour, the more greenhouse effect there is. This is perhaps the #1 strongest feedback in the Arctic: with higher temperatures there is more water vapour, and the Arctic is now changing from a very dry, cold place into a warmer, wetter environment. There is no question about Hadley cells being more complicated, but a normal North Pole would have a huge High pressure almost at all times somewhere during the long night; this is the world’s rooftop cooling system, which has been hampered by a recent increase in cloud extent.
Alastair, you should consider the models as prediction instruments, rather than as a perfect crunching of zillions of calculations delivering the coming climates. In climate projections, the ultimate peer is not a journal referee, nor an Einstein; the ultimate peer is the future. Models in my opinion are superb tools, making sense of a huge planet; they lack a few algorithms, which will be found eventually. My opinion is that they are perhaps not designed to calculate the total heat content of the Earth’s atmosphere – a flaw, if that is not done. I use other methods to determine the total temperature of the atmosphere in key locations, and have found great success in “seeing” the future. Eventually the models will incorporate this idea and many others; wait a little, they will surely be more impressive.
Rasmus wrote “In a recent paper by Bengtsson & Hodges (2006), simulations with the ECHAM5 Global Climate Model (GCM) were analysed, but they found no increase in the number of mid-latitude storms world-wide. Another study by Leckebusch et al. (2006) showed that the projection of storm characteristics was model-dependent.”
Two peer reviewed papers, not cherry picked but found in this blog item, which say the opposite. Only one can be correct, and only time will tell which; then it will become part of scientific truth. Until then, both papers are in limbo despite being peer reviewed. Publication does not make a paper true. It only makes it a candidate for being part of science.
Therefore it follows that the climate models are not necessarily true just because they have been published in peer reviewed papers. They will only be true when they are found to match reality. At present they don’t, and arguing that the data is wrong and the models are right is even more stupid than shouting THE MODELS ARE WRONG!
Re “If some models say that the climate sensitivity is 1.5K and other models say that the climate sensitivity is 4.5K, and the rest say somewhere in between, only one value is correct. Only one out of those dozens of models is correct, and I am entitled to say the vast majority of models are wrong!
If they now clustered around 3.3K to 3.7K then I would accept them as correct, but they have remained with a variation of over 300% for 15 years. It seems to me obvious that there has been something wrong with the models for at least that time.”
You’re assuming the distribution of doubling predictions is flat over the entire range. In fact they do cluster around 3.0 degrees. Try taking the figures for 20 or so models and graphing them in a histogram and you’ll see what I mean.
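The suggested histogram exercise can be sketched in a few lines. The sensitivity values below are illustrative stand-ins for the roughly 2–4.5 K spread described above (clustering near 3 K), not actual model results:

```python
from collections import Counter

# Hypothetical sensitivities (K per CO2 doubling), made up to mimic
# the reported spread of ~2-4.5 K with a cluster near 3 K.
sensitivities = [2.1, 2.3, 2.7, 2.9, 3.0, 3.0, 3.1, 3.2, 3.2,
                 3.4, 3.4, 3.6, 4.0, 4.1, 4.4]

# Bin into 0.5 K bins and print a crude text histogram.
bins = Counter(round(s * 2) / 2 for s in sensitivities)
for edge in sorted(bins):
    print(f"{edge:>4.1f} K | {'#' * bins[edge]}")
```

Even with this toy data, the tallest bar sits at 3.0 K – the distribution is peaked, not flat over the 1.5–4.5 K range.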
Well, today I’ve found something, a piece of news which will surely move you… will push you to think about it and ask others to do the same. I can bet that! However, please let me know whatever you think about this mail and my links… awaiting your earliest response. And yes! Do suggest to me if you think it will help all of us. Please, I would request you here to mail this to all the people you know (your friends and relatives) and make them work on this… believe me – this thing won’t only help you or me, but your children and grandchildren and all the generations to come.
The WWF website states: “By 2100, there may be no ice left in the Arctic in the summer and that means no polar bears.” It explains, “Global warming, caused by fossil fuels – is all to blame”. Sources say, “Global Warming could cause Mass Extinctions by 2050!” Can we afford to lose these? Join me to call the world for help.
Global warming – “An observed increase in the average temperature of the Earth’s atmosphere”. This is rapidly raising sea levels. Now, the world’s greatest environmental challenge is fast becoming too hot to handle. Moreover, the world is close to disaster with ozone depletion. Every single individual will suffer the outcomes of Global Warming.
Global warming is a threat that has gone unchecked for too long, a riveting story of climate change that has reshaped our planet’s evolution & a bracing scenario of catastrophes brewing in the future. The changes are happening largely out of sight; all the same, they shouldn’t be out of mind. These aren’t projections; they are facts on the ground. We are running out of time. All the facts demand an urgent solution, & policymakers need to act before the problem gets out of control. It is not too late yet.
Now, you must be thinking – what can I do from my side, right? Well, there are many simple things you can do which can have an instant effect on your immediate surroundings & on places as far away as Antarctica too. Here is a list of a few things that you can do to make a difference!
1. Replace frequently used light bulbs with compact fluorescent ones.
2. Make sure your printer paper is 100% post consumer recycled paper.
3. Keep the tires on your car adequately inflated and change the car’s air filter monthly. Better, get a hybrid or a fuel-efficient car, and carpooling with friends & co-workers helps too.
4. Move your heater thermostat down 2 degrees in winter & up 2 degrees in the summer, and keep your water heater thermostat no higher than 120°F.
5. Clean or replace dirty AC filters and get wind certificates & green tags. Buy products locally & reduce the amount of energy required to drive & yes purchase minimally packaged goods to reduce garbage. Purchase organic food & use cloth bags instead of plastic or paper bags to reduce waste. Weatherize your home & plant trees.
6. Inefficient old appliances waste energy, replace them. Electronic devices use energy even when turned off so unplug them.
7. Insulate your Home & switch to double pane windows to keep more heat inside your home. Air-dry your clothes instead of using the dryer.
You will find more information on http://www.weblobbying.com/Lobbies/default.asp?Lid=474 .
Well, fact is that we are running out of time! Is the society aware of the seriousness of the global warning called Global Warming? The changes are happening largely out of sight. Nevertheless, they shouldn’t be out of mind, because they are omens of what’s in store for the rest of the planet. To alter fundamentally the trajectory of global greenhouse emissions, it is going to take a tremendous & concentrated global effort. So, make your move, step forward & contribute your share at http://www.weblobbying.com/Lobbies/default.asp?Lid=474
Our children, grand children, & generations to come will bear the consequences of choices that we make, now. We may perhaps destroy whole nations & their cultures that have existed for thousands of years! Can we afford to lose the precious gift of Wild life of varied species on poles and do you think we are prepared to suffer the consequences?
Please help save our Mother Earth from the disastrous effects of global warming before it’s too late! It is our hope that your action, awareness & empowerment can make a difference & help stop global warming. Take Action now!
I believe it’s perfectly senseless to turn this already high-noise conversation into a shouting match. Some people will continue to repeat “peer review” and “IPCC” for the same reason that they go to church on Sunday.
It’s funny that people who never worked in academia are teaching the audience that “peer review process is strong”. Anyone who did participate in it knows full well that it isn’t.
I have personally seen Ph.D. theses approved by advisors and committees simply because the student had hung around grad school too long to be kicked out without a degree. And all this garbage, needless to say, gets published in the refereed literature. For your information, a referee isn’t required to check the math in detail. If the author dropped a coefficient somewhere in a derivation, it will usually go through just fine. When I once took pains to go into the details and found an error, I got an incredulous reply from the author: “It was part of my thesis and /a world-famous scientist/ approved it.” Well … duh! He had better things to do than to go into a political fight with the department. The editor wanted no part in this and killed the paper. Keep in mind that it’s a rare exception, though!
But this is nothing compared to the bugs in the computer code. Those are practically uncheckable, certainly not in the peer-review process. Even apart from it, the bugs live undisturbed for years and decades. A well known researcher told me once that while on postdoc he found a bug in GFDL model. His boss (a big name) simply declared that the bug didn’t exist and that was that for quite a while. It took about 10 years until someone else noticed and finally fixed it.
That’s climate. My friend who works in the area of solid state physics tells me that in his field over 50% of published peer-reviewed results are later proved wrong. Virtually anything can be published. Alan Sokal famously demonstrated it:
[Response: The Sokal hoax was in a social science journal, not a scientific journal and using that as an example of how scientific peer review doesn’t work is misleading. However, I do agree that crap gets published all the time – we discussed examples here – but it is not unusual for serious mistakes to be found during the peer review process and for terrible papers to sink without a trace. Your example is not isolated (ask any journal editor). Finally, as someone who runs a model development group, I can assure you that any bugs found in our code are tracked down and the impacts checked immediately and I would be astounded if that wasn’t true at GFDL. The idea that ‘big names’ go around squishing younger researchers’ corrections to code is not one that has any credibility, I’m afraid. -gavin]
re: 105. The “number of mid-latitude storms world-wide” and “storm characteristics” are not the same. And of course simply publishing is not the end-all. The entire point is the review within the scientific community by knowledgeable peers, not laymen. It really is not a difficult concept.
re: 110. A small minority of vocal laymen will continue to repeat, with no basis, that they know more than literally thousands of climate science researchers who have followed a strong and proven review process, for the same reason that they believe they *must* be right whatever the scientific evidence shows or whatever “church” they attend.
True, Sokal published his hoax in a social science journal. I just couldn’t resist bringing it up :-) I didn’t mean to say that the peer review process doesn’t work at all. The point was that it’s very much flawed, so only a layman could think of it as “strong”.
I’m not at all questioning professional standards in your group. But you are still developing. However, 15-20 years ago GFDL was the only freely distributed model; once released, it was effectively a gold standard. The incident in question didn’t occur in Princeton (I can’t imagine Kirk Bryan behaving like this) but elsewhere. I’m sure that you’re on a first-name basis with the junior figure in this episode, so you can pick up the phone and verify it in a matter of minutes. I cannot guarantee that he will go “on record” with this, though. [edit – email me directly if you like – gavin]
The new data are “direct aircraft observations … on the chemistry occurring downwind of convection … previously only the province of model analyses. .. quantitative measures that can be used to evaluate global climate and chemistry models.”
Re “My opinion of them is that they perhaps are not designed in calculating the total heat content of the Earth’s atmosphere, a flaw if that is not done.”
Use the specific heat capacity of moist or dry air, depending on how much water each layer of air can hold, and find the temperatures at each altitude from the NOAA et al. (1976) US Standard Atmosphere. Heat energy in a reservoir is temperature times heat capacity times mass. For mass of each layer take the height and density figures, figure a volume and a mean density to get the mass. The numerical answer is left as an exercise for the student.
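As a rough sketch of that recipe, the layer thicknesses, mean densities, and mean temperatures below are round illustrative numbers in the spirit of the 1976 US Standard Atmosphere, not its official tabulated values (and moisture is ignored, using the dry-air heat capacity throughout):

```python
CP_DRY_AIR = 1005.0   # specific heat of dry air, J/(kg K)
AREA = 5.1e14         # Earth's surface area, m^2

# (layer label, thickness in m, mean density kg/m^3, mean temperature K)
layers = [
    ("0-5 km",   5000.0, 1.00, 272.0),
    ("5-10 km",  5000.0, 0.60, 240.0),
    ("10-15 km", 5000.0, 0.25, 217.0),
]

total_heat = 0.0
for name, dz, rho, temp in layers:
    mass = rho * dz * AREA           # kg of air in the layer
    heat = mass * CP_DRY_AIR * temp  # J, measured from 0 K
    total_heat += heat
    print(f"{name}: {heat:.2e} J")

print(f"total (0-15 km): {total_heat:.2e} J")
```

The result is on the order of 1e24 J for the lowest 15 km, which at least conveys the scale of the reservoir; a careful version would use the full tabulated profile and a moisture-dependent heat capacity, as the comment above suggests.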
Re #95 Ike wrote “I don’t hear Alastair crying out about the failure of weather models… why not?”
I don’t cry out about the weather models because it is impossible to get the weather wrong! All you have to do is wait until just before it arrives and then predict it. Of course if you only wait until 24 hours ahead, then you get it right most of the time. If you get your model to predict it 96 hours ahead and it gets it wrong, then you blame chaos theory. I can’t win saying that the weather models are wrong, but they are.
The Met Office produce a Handbook for their weather forecasters, and it explains that the cloud base is not where the models predict. In fact, there is a paper with a coauthor who has posted here which explains that if you rely more on the measurements and less on the model, then the results are better!
Craven, Jeffrey P., Jewell, Ryan E. & Brooks, Harold E. (2002) “Comparison between Observed Convective Cloud-Base Heights and Lifting Condensation Level for Two Different Lifted Parcels” Weather and Forecasting 17 pp. 885-890.
Of course it is easier for the climate modellers. No one yet can tell whether they are right or wrong.
#116, That would be nice daily number to have, the total heat content of the Northern and Southern Hemisphere’s atmosphere and oceans. Only then will we have an idea about a true GW trend.
#117. Alastair, I agree that some models are practically useless, like probabilistic long-term temperature projections, which are worse than flipping a coin. But the models actually mimic huge systems quite well; they fail at micro-managing exact scenarios because their resolution is poor. A met station may record no upper inversion while 20 kilometers away there is one. This is not a small problem: the models are in effect set up with an incomplete pack of data, and they can’t “imagine” what is happening between station gaps.
Re #119 Wayne, the models cannot replicate entry into and exit from the Younger Dryas – hemispherical events. Yet a paleoclimatologist has told me that all that is needed is to use models with a higher resolution!
Alastair, my calculations show, by a small formula called EROAM (equivalent refraction one atmosphere method), a significant gradual yearly warming in the horizontal, not mainly in the vertical, where there is just 35 km of relatively dense atmosphere, compared to 25,000 kilometers of higher pressure air everywhere you look at the horizon. I don’t blame the models if they don’t have this information; they are in some ways blindsided by too much information in the vertical without upper-air verification from the horizontal. But that will change some day…