Translated by Angela Carosio
A few weeks ago I was at a meeting in Cambridge that discussed how, or whether, paleoclimate information can reduce the known uncertainties in future climate simulations.
The uncertainties in the response of many systems to increasing greenhouse gases are significant: the potential behaviour of the El Niño-Southern Oscillation (ENSO), or of the North Atlantic overturning circulation; the likely feedbacks on atmospheric composition (CO2, CH4, N2O, aerosols); the predictability of decadal climate change; the global climate sensitivity itself; and, perhaps most importantly, what will happen to the ice sheets and to regional rainfall in a warmer climate.
The reason paleoclimate information may be key in these cases is that all of these climate components have changed in the past. If we can understand how and why those past changes occurred, we can constrain our projections of change for the future. Unfortunately, the simple use of the record, going back to a point with conditions similar to those we expect in the future, does not work very well, since there are no good analogues for the perturbation we are now causing. Such a rapid rise in greenhouse gases, with the present configuration of the continents and vast amounts of polar ice, has never been seen before. More sophisticated approaches must therefore be developed, and this meeting was devoted to examining them.
The first observation one can make is very simple. If something happened in the past, then it is possible! Thus past changes in ENSO, the ice sheets and the carbon cycle, for example, clearly demonstrate that these systems really are sensitive to external changes. Assuming they cannot change in the future would therefore be naive. This is basic, but not truly useful from a practical point of view.
Every future projection relies on a model of some kind. The field is dominated by the large-scale coupled ocean-atmosphere global climate models (GCMs) discussed extensively in the latest report of the Intergovernmental Panel on Climate Change (IPCC), but simpler, more specialised or more conceptual models can also be used. One reason these other models remain useful is that the GCMs are not complete: they do not contain all the possible interactions that we know from the paleoclimate record and from modern observations can occur. Second, correlations seen in the records, for instance between carbon dioxide levels or dust amounts and Milankovitch forcing, imply that some mechanism connects them. Those mechanisms may be only imperfectly known, but the paleoclimate record highlights the need to quantify them so that the models become more complete.
Third, and probably most important, the paleoclimate record is useful for evaluating models. Every episode in climate history should, in principle, allow us to quantify how good our models are and how appropriate our hypotheses about past climate change are. One subtlety is vital to note, however: the models embody a great deal of data and many assumptions about how climate works, but for the modelled climate to change, a hypothesis is also needed, such as a change in the Earth's orbit, volcanic activity, changes in the sun, and so on. Comparing model simulations with observed data therefore becomes an evaluation of the two factors together. Even when the hypothesis is that the changes are intrinsic, a model simulation seeking the magnitude of intrinsic changes (possibly due to multiple steady states or the like) is still a test of both the model and the hypothesis. If the test fails, it shows that one element or the other (or both) is deficient, or that the data are incomplete or misinterpreted. If the test succeeds, we have a consistent explanation of the observed changes which may not be unique, but which is a good starting point.
But what is the relevance of these evaluations? What can a model's performance on the impacts of changes in the North Atlantic overturning circulation, or in the Earth's orbit, really offer for future projections? This is where most of the attention is directed. The key unknown is whether a model's skill on a paleoclimate question is correlated with the magnitude of its response in a future scenario. If it is not correlated, if, for example, the projections of the models that did well on the paleoclimate test span the same range as those of the models that did badly, then little has been gained. If, however, it can be shown that the models that succeed at, say, simulating mid-Holocene rainfall changes systematically give different projections, for example large changes in the Indian monsoon under rising greenhouse gas (GHG) concentrations, then there would be reason to weight the different model projections in arriving at a revised assessment. Likewise, if a model cannot match the rapid melting of the ice sheets during a deglaciation, its credibility in projecting future melt estimates should be reduced.
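The weighting idea sketched above can be put in simple numerical terms. A minimal illustration, with entirely invented model names, skill scores and projections, just to show the mechanics of a skill-weighted ensemble:

```python
# Hypothetical paleo-skill weighting of future projections.
# skills: how well each model reproduces a paleoclimate target (0..1, invented)
# projections: each model's future warming estimate in °C (invented)

skills = {"model_A": 0.9, "model_B": 0.5, "model_C": 0.1}
projections = {"model_A": 3.2, "model_B": 2.5, "model_C": 1.8}

# Plain ensemble mean: every model counts equally
unweighted = sum(projections.values()) / len(projections)

# Skill-weighted mean: models that match the paleo record count for more
total_skill = sum(skills.values())
weighted = sum(skills[m] * projections[m] for m in projections) / total_skill

print(f"unweighted: {unweighted:.2f} °C, skill-weighted: {weighted:.2f} °C")
```

With these made-up numbers the weighted mean is pulled toward the high-skill model's projection; the whole exercise only makes sense if paleo skill and future response are actually correlated, which is exactly the open question.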
Unfortunately, apart from a few coordinated experiments for the last glacial period and the mid-Holocene (e.g. PMIP, the Paleoclimate Modelling Intercomparison Project), run with models that do not necessarily overlap with those in the AR4 (Fourth Assessment Report) archive, no such database of results and evaluations exists. Various paleoclimate events have been examined with individual models, from the Little Ice Age to the Cretaceous, but this serves only as an advance scouting party, marking out a single track on the ground rather than the full roadmap. So we face two problems: we do not yet know which paleoclimate events would be most useful (although everyone has their own ideas), and we lack the database that would allow the paleo simulations to be matched with the future projections.
When mining the paleoclimate record for useful model evaluations, there are two kinds of questions: what happened in a specific period, and what the response is to a specific forcing or event. The first requires a full description of the different forcings in the period concerned; the second, a compilation of data spanning many time periods associated with one forcing. An example of the first approach would be the Last Glacial Maximum, where at a minimum the changes in the Earth's orbit, greenhouse gases, dust, ice sheets and vegetation should be included. The second class is exemplified by the search for the response to volcanic eruptions, made by compositing all the years following large eruptions. Similar approaches could be developed, for the first class, for the mid-Pliocene, the 8.2 kyr event, the Eemian (also known as the Riss-Würm, the last interglacial), the early Holocene, the deglaciation, the early Eocene, the PETM (Paleocene-Eocene Thermal Maximum), the Little Ice Age, etc.; and for the second class, for orbital forcing, solar forcing, Dansgaard-Oeschger events, Heinrich events, etc.
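The compositing of post-eruption years mentioned above (a superposed-epoch analysis) can be sketched in a few lines. The eruption years and the temperature series below are invented purely to show the mechanics:

```python
# Superposed-epoch ("compositing") sketch: average the years following
# large eruptions to isolate the mean volcanic response.
# All data are invented: a flat anomaly series with cooling dips after eruptions.
eruption_years = [1815, 1883, 1991]          # hypothetical "large eruption" list
temps = {year: 0.0 for year in range(1800, 2001)}
for e in eruption_years:
    for lag in (1, 2):
        temps[e + lag] -= 0.3 / lag          # toy cooling that decays with lag

def composite(series, events, lags=(0, 1, 2, 3)):
    """Mean anomaly at each lag after the listed event years."""
    return [sum(series[e + lag] for e in events) / len(events) for lag in lags]

print(composite(temps, eruption_years))  # cooling at lags 1-2, none at 0 and 3
```

Averaging over many events suppresses the unrelated year-to-year noise, which is exactly why the composite approach works better than inspecting any single eruption.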
But one element is still missing. In most cases our knowledge of the changes in these periods is fragmentary, spread across dozens to hundreds of papers, and subject to multiple interpretations. In short, it is a very complicated situation. The missing element is the work of pulling all that information together and producing a synthesis that can readily be compared with the models. The fact that such syntheses are only rarely produced underlines the difficulties involved. There are good examples: CLIMAP (Climate: Long-range Investigation, Mapping and Prediction) and its recent update MARGO (Multiproxy Approach for the Reconstruction of the Glacial Ocean surface) for ocean temperatures at the Last Glacial Maximum (LGM); the precipitation and vegetation databases for the mid-Holocene in PMIP; and notably the resolution of the temperature patterns of the last few hundred years from multiple proxies. Each of these has been used very successfully in model-data comparisons and has been very influential inside and outside the paleoclimate community.
It may seem odd that this kind of study is not done more often, but there are reasons. Fundamentally, the tools and techniques required to produce a good synthesis are not the same as those for making measurements or developing models. It could be described as a new kind of science (though in science it is not new at all), requiring, perhaps, a new kind of scientist: someone comfortable with the various sources of paleo data and aware of their problems, and yet also aware of what the modellers need and why they need it. Alternatively, it requires modellers who understand what the proxy data depend on and who can build the proxies into the models themselves, making model-data comparison more direct.
Should the paleo community, then, increase its emphasis on synthesis and allocate funds and positions accordingly? This is often a contentious question, since whenever the need for people to integrate existing information is discussed, some wonder whether the primacy of collecting new data is being threatened. This meeting was no exception. However, I am convinced that this debate is not the zero-sum game implied by that argument, quite the contrary. Synthesising the information from a highly technical field and making it useful to those outside the field is fundamental to increasing respect for the field of study, and thereby to increasing the research funding available in the long run. The lack of trained people who could earn the respect of the data gatherers while delivering a value-added product to the modellers remains, however, a serious obstacle.
Despite the problems and the undoubted challenges of taking paleo model-data comparison to the next level, it was encouraging to see these issues tackled head-on. The enthusiasm for sketching out grant proposals for real science was so inspiring that I should probably stop writing blog posts and go do some of it myself.
This condensed account of the meeting is strongly influenced by conversations and talks there, particularly with Peter Huybers, Paul Valdes, Eric Wolff and Sandy Harrison, among others.
Gusbobb (117), as a fellow skeptic whose skepticism lies, in part, in the same area, I’ll try to clarify some of your simpler questions. A CO2 molecule absorbs IR radiation only in discrete frequencies, though there are large numbers of lines within the primary CO2 absorption band around 15 microns (±2). These frequencies are based on the quantum energy levels of bond vibration, and are somewhat problematic à la quantum mechanics. This absorption does not increase the real temperature of CO2 molecules.
The re-emission of energy from vibrational energy will also be within the same frequency lines, though because of the Doppler effect it may appear to be a different frequency to a subsequent CO2 molecule. [A similar effect can apply to the initial CO2 molecule absorbing primary radiation.] Re-emission stems from the molecule’s tendency to return to its internal equilibrium à la equipartition. More likely, especially at lower, denser altitudes, the CO2 molecule will relax (lose) its absorbed vibration energy to another molecule’s translation energy (1/2mv^2) via collision. The collision is most likely with the predominant O2 or N2 molecules and will evidence a temperature rise in the collidee – produce atmospheric heating. Other stuff can also increase the atmosphere’s temperature.
O2 or N2 will do its common transferring energy back and forth through collision (including an occasional translation transfer to, say, CO2), spreading the wealth, as it were, per Maxwell. It might also relax its translation energy through emission – up or down, but this emission is of Planck’s blackbody nature, not the discrete IR absorption/emission limited to vibration and/or rotation energy levels/modes unique to CO2, methane, H2O, etc. (but which can also emit blackbody-like.) Escaping radiation comes from GHG re-emissions at high sparse altitudes, atmospheric blackbody radiation, or some that sneaks its way directly from the surface.
If I’ve erred here we’ll know soon enough.
Hank (124), I always wondered how you did an RC cite. Thanks! Now if you’ll explain how to include smiley faces, I’ll be happy…. though maybe not Gavin
Rod B #151:
Yep ;-) (made by just typing a text smiley inside blanks, and verifying with preview)
Only your last paragraph requires a further comment. You write
True, but. At Earth atmospheric densities N2, O2 etc. are practically transparent in the thermal IR, which means according to Kirchhoff-Bunsen that they emit very little. It’s really the greenhouse gases doing the job of both absorbing and emitting, but in local thermodynamic equilibrium with the non-greenhouse gases. Kirchhoff-Bunsen says that the emission intensity for a frequency is equal to the product of Planck’s curve (for that frequency and local temperature) and absorptivity of the air parcel in %.
When you look down at the atmosphere from above, you see what is piecewise a Planck black-body radiation curve. Not as a whole. The reason is that for some frequencies, e.g., within the 15µ absorption band, the radiation seen by the satellite comes from a level 10 km or more above ground, where it is cold; whereas at other frequencies (outside absorption bands) we are looking straight down to the ground which is much warmer. So the net result is a “chimera” of Planck curves emitted at different temperatures.
It is worth playing with Dave Archer’s software. It shows (as a 1-D model simulation) what gets radiated to space. If you remove everything but CO2, you’ll see how around wavenumber 700 cm-1 (i.e., 15µ) radiation comes from a level high up where temperatures are 220K (yellow Planck curve), i.e., the tropopause and above. Outside the band, radiation comes from the 297K level (a bit below the green Planck curve), which is the tropical ground temperature. Etcetera. So you look down to different levels depending on frequency. This is only a model, but real measurements from real satellites look like this too.
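The contrast between in-band and out-of-band emission above comes straight from the Planck function evaluated at the two temperatures. A short sketch (physical constants from CODATA; the 220K/297K levels are the ones quoted in the comment):

```python
import math

# Planck spectral radiance B(lambda, T), W·sr^-1·m^-3
H = 6.62607015e-34   # Planck constant, J·s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Planck spectral radiance at a given wavelength and temperature."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0
    return a / b

wl = 15e-6  # 15 µm, centre of the main CO2 absorption band

# Kirchhoff: emission = absorptivity × Planck(λ, T). In the strongly absorbing
# band the radiation reaching space comes from the cold upper troposphere;
# outside the band it comes from the warm ground.
cold = planck(wl, 220.0)   # tropopause level
warm = planck(wl, 297.0)   # tropical surface
print(f"B(15µm, 220K) = {cold:.3e}, B(15µm, 297K) = {warm:.3e}")
```

The warm-surface radiance at 15 µm is roughly three times the tropopause radiance, which is why the spectrum seen from a satellite shows a "bite" taken out of it across the CO2 band.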
Rod,
One nit. You continue to distinguish between “blackbody” and quantum radiation. There is no distinction. Quantum interactions with surrounding matter are simply the mechanism by which the radiation field comes into equilibrium with itself and with the surrounding matter. Of course the radiation field can only interact with the materials that have absorption bands that overlap its wavelengths. For CO2, that band is around 15 microns. No matter what you do, you won’t get CO2 to radiate outside this band (you can distort the band, but only so far), and diatomic gases such as N2 and O2 will only radiate as a result of a collisional interaction that alters their charge symmetry (a transient collision-induced dipole).
Re #151
Hi RodB,
You do smiley faces by typing : – ) without spaces :-) and sad faces by typing : – ( :-(
You wrote “This absorption does not increase the real temperature of CO2 molecules” which is not strictly correct. I am assuming that by “real temperature” you mean kinetic energy of the molecules. All gas molecules have this type of energy, and that is what is measured with mercury thermometers. The absorption of infrared radiation by greenhouse gases causes these molecules to be vibrationally excited, and much of this energy is lost to other air molecules through collisions, raising the kinetic energy (real temperature) of the air. The warmer air molecules then share their kinetic energy with the CO2 molecules, so raising the real temperature of the CO2.
CO2 does not emit Planck black body radiation. Only solids and liquids do that. But the CO2 vibration and vibration-rotation lines tend to merge and produce a band emission to space which can look like a segment of the Planck function.
HTH,
Cheers, Alastair.
Re #151
Rod B writes
“These frequencies are based on the quantum energy levels of bond vibration, and are somewhat problematic ala quantum mechanics.”
What is problematic about them?
Re #153
Martin,
You wrote:
“Kirchhoff-Bunsen says that the emission intensity for a frequency is equal to the product of Planck’s curve (for that frequency and local temperature) and absorptivity of the air parcel in %.”
Can you give me a reference for that fact? Kirchhoff died in 1887, and Bunsen in 1899 but Planck did not publish his function until 1900, although one was already thought to exist. As I understand it, it was not until 1915 that Einstein showed that gases obey Kirchhoff’s law.
Cheers, Alastair.
RE #134 & your claim that scientists are arrogant, acting as if they know it all.
Let me go over (again) my gripes with both science and denialism: science requires 95% confidence there’s a link before making a claim (I also demand that in the social science research I and my students do); and denialists require either 99% or 101% confidence.
Scientists have to avoid the false positive (making untrue claims) to preserve their reputation. But those concerned with living in the world (people, policy makers, environmentalists) should be concerned with avoiding false negatives (failing to address true problems).
I call this “THE MEDICAL MODEL” – we’d be disconcerted if our doc told us he/she was only 94% confident our lump was cancerous, and to come back in a year to see if it could get up to 95% so they could operate.
We buy insurance without much certainty a tornado or other harm will damage our house. The virtue is called prudence.
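The insurance analogy above reduces to a one-line expected-cost calculation. All the figures here are invented for illustration:

```python
# Toy expected-cost argument for acting under uncertainty (all numbers invented).
p_disaster = 0.02          # annual probability of a damaging event
loss = 300_000             # cost if it happens
premium = 1_500            # annual cost of insuring / mitigating

expected_annual_loss = p_disaster * loss   # what "doing nothing" costs on average
print(f"expected annual loss: {expected_annual_loss:.0f}, premium: {premium}")
# Paying the premium is prudent even though p_disaster is nowhere near 95%.
```

The point is that the rational threshold for action depends on the ratio of premium to expected loss, not on reaching a 95% confidence level that the disaster will happen.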
So too we should have rigorously started mitigating AGW at least by 1990 (5 years before the first scientific studies started reaching 95% confidence, or the .05 alpha significance level, in 1995) — especially since doing so actually saves us money and helps the economy (I found out to my amazement). Any other strategy is foolhardy, profligate, and immoral.
What we need is not so much more and more science (tho keep up the good work, all you scientists!), but more and more persons with strength of character and morality, persons who are not ashamed to stand up, do the right thing, and mitigate AGW.
Re:#151
Black (grey) air?
Martin Vermeer has answered you already and I’ll try to express it another way. We are now down to the level of quantum theory. Matter can only radiate by jumping between quantum levels and in this case, the transition which involves the emission of photons has to satisfy another condition. If you start with something symmetrical like an oxygen molecule and make it vibrate it will still be symmetrical. That means that the negative charge in the molecule cannot have moved relative to the positive. The symmetrical vibrating molecule will not experience any interaction with a vibrating electric field. An incoming photon cannot excite such a vibration and conversely such a vibration cannot decay by emission of a photon. But things are quite different for triatomic molecules like CO2 and H2O which have electrically asymmetrical vibrational states.
So can air never radiate infra-red at all? I am now speculating and opening myself to correction. Might it be possible (at a very low level) by breaking the symmetry in a collective way? The first possibility might be during an oxygen-nitrogen (or possibly an O2-O2) collision, when we have a different set of quantum states existing for a short time. Has this ever been considered? Tyndall never observed it, as far as I know, but that was a long time ago. It is thus likely to be far too small to affect the greenhouse.
A quite different question. A low level contrarian argument is based on the assertion that CO2 is only a tiny proportion of the air. The rough answer of course is that CO2 and H2O are 100% of what matters and that this argument diverts attention to the less significant non absorbing parts of the atmosphere. But what about a more careful thought experiment, what would happen to our greenhouse if the oxygen and nitrogen were removed? This would remove a heat sink and alter the convection and lapse rate. Is the answer obvious?
Re 158
Well said, Lynn Vincentnathan. But strength of character and morality are not sufficient, when there are such powerful forces which negate efforts to mitigate AGW. Many decades of indoctrination and propaganda have produced massive inertia. Maybe it takes a sort of social Kuhnian paradigm shift, requiring that the old thinking in the old (and powerful) heads has to die before anything really changes….e.g.
http://www.orionmagazine.org/index.php/articles/article/2962
Alastair #157:
Good question! According to Wikipedia, Kirchhoff discovered black-body radiation already in 1862.
Kirchhoff’s law can be proven from the principle of thermodynamic equilibrium. I’m sorry to use Wikipedia again, but I don’t own any statistical physics textbook; any good one should do.
(I didn’t manage to find anything on Einstein 1915 regarding Kirchhoff’s law. Did Einstein do experimental work? I know he worked theoretically on stimulated emission.)
What Max Planck did was find a closed analytical expression for this empirical curve (a surprisingly fractious problem!). And reluctantly conclude that a reasonable physical interpretation would be that the radiation comes in “packets” of energy proportional to the frequency… and the rest is history :-)
Back to the Present – it looks like a blob of unusually warm air from eastern Siberia is about (May 6th) to flow over the thinnest part of the Arctic ice. Will the reported “unusually thin ice” melt through a month or two early, or is the ice actually not quite so fragile?
I suspect somewhere between the two. See: NH Jetstream and Northern Sea ice
> http://www.orionmagazine.org/index.php/articles/article/2962
Excellent article.
Those with old fading eyes like mine, note, this link is “orion” like the constellation, not “onion” like the humor magazine.
Re #159 Geoff Wexler:
This was shortly mentioned in the comments to raypierre’s Venus article:
https://www.realclimate.org/index.php/archives/2008/03/venus-unveiled/#comment-82917
#125 Hank, didn’t think I used any math, or did I?
Lowell wrote (#133): “Methane concentrations in the atmosphere have stabilized and are probably falling now so all the doom and gloom discussion about Methane releases are not supported by the facts.”
That is incorrect. According to the latest data from NOAA, methane levels rose sharply in 2007 after a decade of stability:
Greenhouse Gases, Carbon Dioxide and Methane, Rise Sharply in 2007
ScienceDaily
Thursday 24 April 2008
Excerpt:
Unfortunately, “gloom and doom” discussion about methane is supported by the facts.
Martin and Alastair, I recommend the derivation of the Planck distribution and related material in Landau and Lifshitz, “Statistical Mechanics”–elegant and straightforward. Well, as straightforward as anything ever is in stat mech.
A brief social-science note on the end of this article:
All sciences benefit from skilled synthesizers, especially those who can make the core ideas of a field accessible to those outside the field. However, the current organization of higher education, and the way resources are allocated within it, provides the greatest reward to those who publish small pieces of new data in refereed journals over those who write the book-length pieces necessary for synthesis. Accomplishing a change of emphasis in paleoclimatology would require reforming the entire higher education enterprise — something that would be a good idea, but not likely to happen quickly enough to make a difference for this particular project.
It’s fun watching all those people picking over computer architecture vs. performance details I worked through for my PhD c. 1996 (John Mashey may even remember it: he was an examiner).
No one has asked me to help with performance problems since then yet a lot of the “big iron” problems of the early 90s have come back as commodity performance problems with multicore.
But anyway, back to the main question: I find it really encouraging that the paleo climate is starting to get more attention. Of course there are huge difficulties but anything that sheds more light on where we are going has to be good (except for the doubters, whose standard of proof will increase to 200% — Lynn Vincentnathan #158).
Re #134 Greg Smith:
Presumably you are similarly dismissive of the modelling done by earth scientists? For example:
Determining Chondritic Impactor Size from the Marine Osmium Isotope Record
François S. Paquay, Gregory E.
Science 11 April 2008: 214-218.
The difference in osmium concentrations and isotopes between seawater and asteroids allows reconstruction of impact occurrence and size, including for the Cretaceous.
Re #166 and others about methane levels…
One needs to look at the real figures… The “sharp” increase of methane levels in 2007 pushed methane levels from about 1790 to 1795 ppbv, not much beyond the year-by-year variability of methane levels in the past decade, which has been nearly stable.
See: http://www.esrl.noaa.gov/gmd/ccgg/iadv/ and look for CH4.
One interesting point is that the d13C content of methane is slightly increasing. This may point to a change in source (either more human-induced, like from rice paddies, or more fossil methane), but I have not (yet) seen figures for the d13C content of different methane sources. Unfortunately, d13C measurements in methane only started at the moment that methane levels stabilised.
About historical events: one can’t compare the current situation with the PETM event, where enormous amounts of methane were released, far more than natural oxidation could transform into CO2 and water. Better to compare with the previous interglacial, the Eemian.
The Eemian on average was up to 3°C warmer than today, during at least 5,000 years. This caused melting of a (large?) part of the Greenland ice sheet and temperatures in the Alaskan tundra at least 5°C warmer than today. If we may suppose that the higher temperatures were also present in the rest of the Arctic, then we have a clue as to what permafrost thawing can give in extra methane.
Methane levels in ice cores followed temperature trends during the whole Eemian (see here) and reached a maximum of about 700 ppbv at maximum temperature, from a level of 650 ppbv at a temperature equal to the current temperature. Today we have reached 1795 ppbv, largely due to human emissions, where land use change may be the largest contribution. Thus even if the temperature increases to Eemian levels, that would add not more than 50 ppbv to the current methane levels…
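The back-of-envelope scaling in the comment above can be written out explicitly (ppbv figures exactly as quoted in the comment):

```python
# Eemian methane scaling argument from the comment above (ppbv values as quoted).
eemian_peak = 700            # ppbv at Eemian maximum temperature
eemian_at_current_temp = 650 # ppbv at a temperature equal to today's
current = 1795               # ppbv today, largely from human emissions

natural_increment = eemian_peak - eemian_at_current_temp  # warming-driven rise
projected = current + natural_increment
print(f"warming to Eemian levels would add ~{natural_increment} ppbv, "
      f"giving roughly {projected} ppbv")
```

The implicit assumption, worth flagging, is that the temperature-to-methane response of the permafrost scales the same way today as it did in the Eemian, on top of a much larger anthropogenic baseline.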
Re posts 147 and 170
I don’t regularly post on any forums but do feel compelled to answer both of the above. Geologists and geophysicists use models constantly to try to evaluate what we can’t see or directly measure, that is, what is under the earth. We also have a pretty good appreciation of palaeoclimate both on a long-term (i.e. geological) time frame and also over a human scale. We need to understand the latter to make predictions of rock facies away from known data points such as a well bore or mine. Making predictions under uncertainty is our business, and the more successful of us are better at putting together the disparate pieces of often unreliable or fragmentary information required to make predictions. We also understand that no model, no matter how sophisticated, with wonderful visualisation etc., is accurate. This is why we use Monte Carlo simulation so much. Even the most sophisticated reservoir modelling software used to predict how, say, a petroleum reservoir will perform will only give an approximation of the real thing, and will be subject to the accuracy of the algorithms used and the quality of the input data. Such simulators, which until the last ten years required supercomputers to run them, may be modelling 30 to 40 variables and are nothing like trying to model the atmosphere.
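Monte Carlo propagation of input uncertainty, as described in the comment above, can be sketched in a few lines. The distributions and the volumetric formula below are illustrative only, not from any real reservoir study:

```python
import random

# Illustrative Monte Carlo for a volumetric estimate V = area * thickness * porosity.
# The input distributions are invented; the point is propagating uncertainty.
random.seed(42)

def one_realization():
    area = random.uniform(8.0, 12.0)        # km^2
    thickness = random.gauss(50.0, 10.0)    # m
    porosity = random.uniform(0.10, 0.25)   # fraction
    return area * thickness * porosity      # arbitrary volume units

samples = sorted(one_realization() for _ in range(10_000))
p10, p50, p90 = (samples[int(q * len(samples))] for q in (0.10, 0.50, 0.90))
print(f"P10={p10:.1f}  P50={p50:.1f}  P90={p90:.1f}")
```

Reporting a P10/P50/P90 range rather than a single number is the standard way this community expresses that the model output is an envelope, not a prediction.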
In addition, I have not read anywhere how climate modellers account for long-term (for humans, that is) variations in climate such as the PDO, which is making the news at present. Nor have I been able to find out how you account for random, but frequently happening, events such as volcanic eruptions. Both of these obviously change the climate. In addition, how do you estimate changes in the level of forestation? Don’t forget that North Africa kept much of the Roman Empire happily fed for over 400 years. I could look it up, but I doubt that there is a lot of wheat grown in Libya these days.
I guess I could also say that when people start saying things such as there is a “consensus” or the “science is settled”, I am unnerved, which explains my reference to Copernicus. We can never fully understand a complex system such as the earth, with millions of variables, and to say otherwise is arrogant. For example, electromagnetic theory in the 1890s seemed to physicists at the time to give a full explanation of the interaction between particles and electricity/magnetism, but the physicists were constrained by their inability to adequately measure very small-scale things, and were knocked into a cocked hat when Einstein came along. (As an aside, I wonder who peer reviewed him? “Peer review” is also a term I read frequently on blogs such as this.)
The point of all of this is that we never know enough about the physical world, something we have only been able to measure or image half adequately in the last twenty years or so.
So when a poor sod like Gus-bob asks a few, to you, dumb questions just remember to treat him with respect.
Re #134: Damn, does this mean I should chuck my book on the Physics of the Earth’s Interior? The blurb on the back says it’s for geophysicists and earth scientists, but it only has physics and maths (models) in it :-(
More seriously, in the modern sciences we can’t get very far before running smack into a model or the need for one: whether it is statistical only, mathematical, physical, or a more qualitative construction, a model is about the only thing that connects data to ideas about how that data came to be. Mathematical models based on the physical aspects of a system IMO push research towards understanding far more effectively than waving hands about and saying it is all too complicated.
#166 Secular Animist,
Further to what you post.
1)
NSIDC now state that the current state of the Arctic ice pack points to a likelihood of a new record minimum this year. http://nsidc.org/arcticseaicenews/index.html That’s despite the past behaviour of the ice pack indicating that a record minimum is not usually followed by another the next year. Maslowski’s projection of an ice-free Arctic summer by 2013 may indeed be optimistic, as it didn’t include last year’s events. We could see the transition to a seasonally ice-free Arctic happen as fast as figure 3 of Nghiem “Rapid reduction of Arctic perennial sea ice” (2007) implies:
http://seaice.apl.washington.edu/Papers/NghiemEtal2007_MYreduction.pdf 303kb pdf
2)
2007’s melt caused surface Arctic Ocean waters in the exposed region to warm by as much as 5°C.
http://www.sciencedaily.com/releases/2007/12/071212201236.htm
3)
And at least part of the 540 billion tonnes of methane (clathrate) on the Arctic ocean shelf floor may already be beginning to destabilise.
http://www.spiegel.de/international/world/0,1518,547976,00.html
QUOTE
Data from offshore drilling in the region, studied by experts at the Alfred Wegener Institute for Polar and Marine Research (AWI), also suggest that the situation has grown critical. AWI’s results show that permafrost in the flat shelf is perilously close to thawing. Three to 12 kilometers from the coast, the temperature of sea sediment was -1 to -1.5 degrees Celsius, just below freezing. Permafrost on land, though, was as cold as -12.4 degrees Celsius. “That’s a drastic difference and the best proof of a critical thermal status of the submarine permafrost,” said Shakhova.
Paul Overduin, a geophysicist at AWI, agreed. “She’s right,” he said. “Changes are far more likely to occur on the sea shelf than on land.”
ENDQUOTE
Would an increasing amount of open water in the Arctic summer cause an increasing amount of vertical mixing through wind/sea-surface induced transport/turbulence?
I suspect so, but I’m not qualified enough to be sure either way.
In 162 Ken Rushton writes: “Back to the Present – it looks like a blob of unusually warm air from eastern Siberia is about (May 6th) to flow over the thinnest part of the Arctic ice. Will the reported ‘unusually thin ice’ melt through a month or two early, or is the ice actually not quite so fragile?” I wonder if someone can clear up a mystery for me. I understand that “old” ice is thicker and melts more slowly than new ice, but I don’t think I understand annual ice. This is the approximately 9 million sq km of ice that melts and freezes every year, and so is, by definition, always less than one year old. It is my understanding that the thickness of annual ice is purely a function of how soon winter arrives and how cold it gets. In many parts of the Arctic this season, winter arrived early and was colder than average, so the annual ice in these parts is reported to be thicker than average. I have seen such reports from Hudson Bay and the Bering Strait. What is the situation with respect to annual ice in the Arctic at present?
Greg Smith, I will give you the benefit of the doubt that perhaps you are unfamiliar with Gusbob’s history. Suffice to say he came on here and hijacked more than one thread with off-topic diatribes on pure pseudoscience. Given that history, I would contend that he is treated with exceptional tolerance here.
As to your other contention that one must throw up one’s hands when confronted with a complicated system, I can only conclude that you have not done much modeling. Global climate models are dynamical models, and the parameters in the models are not adjustable parameters, but rather constrained by independent data. Such models are inherently more manageable than statistical models, where fit to the data determines parameter values and their error bars.
To say that we never know enough about the physical world is meaningless. Enough to do what? We do not understand everything about the solar system, and yet this does not stop us from landing space probes on planets and asteroids. Likewise, there is much we do not understand about climate, but what we do not understand does not invalidate our well constrained understanding of CO2 and other greenhouse gasses.
What is more, climate scientists understand the difference between climate, a purely long-term entity, and weather, under which we can file things like the variability of the PDO, etc., and which averages out over time. They also understand paleoclimate, which deals with ancient climates and is inherently on timescales much longer than a human life. Since you do not understand these things, and much else as well, it seems, I would contend that you are not in a position to comment competently on issues of climate science.
When I look at the plot (The Eemian in the Vostok Ice Core, which was referenced by Ferdinand Engelbeen in his comment #171), I do not see how he can claim that “Methane levels in ice cores followed temperature trends during the whole Eemian”.
I have a question about the sensitivity of our planet’s climate system.
Is the vegetation on Earth a buffer for climate change? Can I therefore say that our vegetation is important for the sensitivity?
As I understand it, you use examples of past climate change to project the future. In the past there was a functioning ecosystem, with a lot more biomass on the continents (at least in times with the same global temperature), so the ecosystem reacted to the warming process. For example, rainforests: through evaporation they create cloud cover, like a protection against heat. With hotter temperatures they started growing towards the poles, giving, maybe, more cloud cover for the continents and a slower heating up of the whole. (I think most kinds of forest do this, more or less.) Also, if I remember correctly, in much hotter times most of our continents were covered by rainforests and other kinds of forest, which I guess acted as a buffer for continental temperatures.
In our time we will have finished off the rainforests, maybe by 2040. More and more area will be needed for food production, while exhausted ground is left behind. With or without global warming, we will have finished off our ecosphere by 2100, with growing numbers and so on.
(Only war, or climate change, can stop this.)
If this is not stupid and makes sense, I come to my main question: is this principle in the IPCC report or not? A sinking sensitivity of the system towards 2100?
Sorry for interrupting.
re: 172 and volcanic effects. The effects of volcanic eruptions on climate are short-term compared with the long-term warming trend. For example, look at the effects after Pinatubo: there was short-term cooling, but the warming trend rapidly resumed. Furthermore, volcanic eruptions do not produce more CO2 than mankind does, despite the skeptics’ urban myth to the contrary.
Re 172:
Disrespect? With monumental patience, all of gusbobb’s points and questions have received detailed, relevant, lengthy and continuing responses. Disrespect would be to ignore him. The peevish and frustrated tone of some of the responses is no more than I would expect if I were to enter a geophysics forum and blithely overturn tectonic theory with the latest Hollow Earth ideas.
Re Greg Smith @ 172: “So when a poor sod like Gus-bob asks a few, to you, dumb questions just remember to treat him with respect.”
Greg, if you peruse gusbob’s earlier comments in the Galactic Glitch thread at
https://www.realclimate.org/index.php/archives/2008/03/a-galactic-glitch/langswitch_lang/sw
you’ll see that he is hardly the ‘poor sod’ that you assume, though to his credit he is now making an attempt to understand what people have been trying to tell him for quite some time now.
Looks like the ice has taken a recent nose dive below 2007, and it’s only May.
http://nsidc.org/data/seaice_index/images/daily_images/N_timeseries.png
Martin and Ray,
I am not looking for a derivation of Planck’s function. What I would like to know is the justification for using Planck’s function for black-body radiation to calculate the emissions of line radiation, a very different beast. The Fraunhofer lines are where solar radiation does NOT fit Planck’s curve.
Cheers, Alastair.
This is simply a restatement of the old denialist canard “if we can’t understand everything, we understand nothing”, used by anti-evolutionists, anti-climate-scientists, anti-everythingists.
If you’re really a scientist, Greg, surely you can do better than this.
I posted this on another thread this morning, but it seems to fit better here:
If you want to see something “chilling,” watch this Japanese video of polar satellite images showing the disintegration of multi-year ice during last winter.
http://www.homerdixon.com/download/arctic_flushing.html
From the video:
The image is a low-resolution reproduction of a sequence of satellite images of Arctic ice this past fall and winter. The sequence runs in a continuous loop from October 01, 2007, to March 15, 2008. A link to the high-resolution video file is also provided.
Note the stream of multi-year ice flowing out of the Arctic basin down the east coast of Greenland at one o’clock in the image. As of the middle of March, most of the basin, including the pole itself, appears to be covered only by seasonal ice.
Alastair, blackbody radiation is not a distinct type of radiation from line radiation. Atoms, molecules, and solids can only radiate between their energy levels, so “blackbody” radiation is composed of all such modes that are thermally excited. That is why you get a much better approximation of a blackbody spectrum from a solid than from a gas (more modes to add together and more thermal motion to smear out those modes).
The Planck distribution is simply the equilibrium energy density of a photon gas at temperature T. However, since photons are noninteracting, equilibrium can only be achieved by the photon gas interacting with surrounding matter. In effect, once the radiation field is in equilibrium with the matter, there will be as many excited modes decaying radiatively as there are photons exciting new modes, and the energy of the mode multiplied by the number of photons/phonons/collisions… capable of exciting the mode will give the energy density.
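As an illustrative aside (not part of Ray’s comment), the Planck distribution he describes is easy to evaluate numerically. The sketch below uses the standard Planck spectral radiance B(λ, T) = (2hc²/λ⁵)/(exp(hc/λkT) − 1); the 15-micron wavelength and Earth-like temperatures are example inputs chosen by the editor, not values from the thread:

```python
import math

# Physical constants (SI units)
h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temperature_k):
    """Planck spectral radiance B(lambda, T) in W / (m^2 sr m)."""
    a = 2.0 * h * c**2 / wavelength_m**5
    x = h * c / (wavelength_m * k * temperature_k)
    return a / math.expm1(x)  # expm1 is exp(x) - 1, stable for small x

# Compare radiance at 15 microns for two Earth-like temperatures:
# a warmer emitter radiates more at any given wavelength.
for t in (255.0, 288.0):
    print(t, planck_radiance(15e-6, t))
```

The point is only that, once matter and radiation are in equilibrium, the emitted intensity at each wavelength follows this single function of temperature, which is why a warmer emitting level radiates more.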
Re #177 Pat N,
The plot is a little confusing, but if you look at the start and end points of the warming and of the CH4 increase, there may be a (in geological terms) slight lag of the CH4 increase behind the temperature increase of less than 1,000 years. After the maximum temperature/CH4 levels ended, there is no visible lag of CH4 if you take the (older) temperature proxy of Petit et al. as the base; or CH4 leads the temperature drop by several thousand years over a long period, but lags again at the moment temperature reaches its minimum, if you take the (more recent) temperature proxy of Jouzel as the base.
It is physically possible that already starting ice sheets growth led to stronger reductions of methane release than can be deduced from the temperature curve…
Anyway, it doesn’t look as though methane release from permafrost and/or clathrates will be a problem for this century. A temperature increase of 3°C will not give more than 100 ppbv more methane (I made an error in #171: the CH4 scale in the graph needs to be doubled…). This leads to an increase of retained warmth (without feedbacks) of about 0.06 W/m2, not really impressive…
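An editorial sanity check on the figure above: the widely used simplified expression for CH4 forcing from Myhre et al. (1998) gives a number of the same small order. The sketch below omits the CH4/N2O band-overlap correction term, and the ~1775 ppbv baseline is an assumed round number, so it confirms only the order of magnitude:

```python
import math

def ch4_forcing_simple(m_ppb, m0_ppb):
    """Simplified CH4 radiative forcing (Myhre et al. 1998), W/m2.
    The CH4/N2O band-overlap correction term is omitted here."""
    return 0.036 * (math.sqrt(m_ppb) - math.sqrt(m0_ppb))

# A 100 ppbv rise from an assumed ~1775 ppbv baseline:
print(ch4_forcing_simple(1875.0, 1775.0))  # ~0.04 W/m2
```

Either way, an extra 100 ppbv of CH4 corresponds to a few hundredths of a W/m2, consistent with the “not really impressive” conclusion.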
#171, Ferdinand Engelbeen,
From the graphs of yours:
“This points to a low influence of CO2 on temperature.”
Does it really?
What I see is a process being initiated by insolation maxima for “Spring” (April May June), as suggested by Hansen et al in “Climate change and trace gases”, fig 2b, table 1, and accompanying text. http://pubs.giss.nasa.gov/abstracts/2007/Hansen_etal_2.html
CO2 and methane track closely until greater areas of the surface are exposed by the receding ice sheets and made available to biological activity at 13k yr BP. That’s when methane races ahead of CO2; it rises in tandem with ice-sheet recession.
Likewise, your statement about 113k yr BP seems flawed to me, because again you’re seeing the methane level drop as more physical area is covered by ice, thus reducing its unit-area biological output.
CH4’s atmospheric residence time is much shorter than that of CO2 (for which the ocean/atmosphere exchange must be considered), so I’d expect it to react more rapidly to drops in temperature.
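The residence-time contrast can be made concrete with a toy first-order decay sketch; the ~10-year CH4 lifetime used here is an assumed round value, and CO2 deliberately cannot be treated this way, since its removal involves several ocean/atmosphere timescales:

```python
import math

def fraction_remaining(t_years, lifetime_years):
    """Fraction of an instantaneous pulse left after t years,
    assuming simple first-order (exponential) removal."""
    return math.exp(-t_years / lifetime_years)

# CH4 with an assumed ~10-year lifetime: a pulse is essentially
# gone within a few decades.
print(fraction_remaining(50.0, 10.0))  # ~0.0067, i.e. under 1% left
```

A single lifetime like this is only defensible for CH4; a CO2 pulse decays on a mix of timescales from years to millennia, which is why the ocean/atmosphere partitioning matters.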
As for the Eemian/PETM as an analogue, I’m not so sure that the past offers much of a guide given the multiplicity of human impacts. We’ll find out soon enough whether the reduction of Southern Ocean CO2 uptake, the expansion of the Hadley Cells, the ongoing transition of the Arctic to a seasonally ice-free state, are part of a wider pattern of changes ahead of time, or if they’re outliers/”noise”.
From where I’m sitting it really looks like we’re much deeper into the briar patch than we can yet assert to a scientifically robust degree.
Greg Smith wrote: “Nor have [I] been able to find out how you account for random, but frequently happening, events such as volcanic eruptions.”
I’m very surprised by this statement, since James Hansen’s earliest circulation model from the 1980s included a random volcanic eruption and the model prediction matched the short-term aerosol cooling effect of the 1991 Pinatubo eruption fairly closely. This has been pretty widely discussed as an example of the model’s ability. One would have to not look very thoroughly to not know this.
Hi,
I was wondering if there is anywhere on this site where I can find a response to the following report:
http://epw.senate.gov/public/index.cfm?FuseAction=Minority.SenateReport#report
I would like to make it clear that this report does not in any way undermine my view that we must strive to reverse the effects of climate change that most experts agree are occurring. I would also like to add that I think it is amazing that the media require a very high burden of proof that the climate change fears are justified. Let’s face it: if most experts in nuclear physics were to warn us of a 10% chance of meltdown in a nuclear plant, everyone would clamour to have it shut down. Yet the dangers of climate change are almost infinitely greater. Nonetheless, if anyone can point me to a well-informed response to the document I’ve just mentioned, I would be most grateful.
Thanks and best wishes,
Brian.
#183 Alastair,
Oh, but it does! What you see within the lines is the Planck curve of a level higher up in the Solar atmosphere, where temperatures are lower. Higher opacity (stronger absorption) means that the photosphere — the fuzzy visible surface from which light escapes to space — is at a higher level within these lines than in the continuum.
Also in the Solar atmosphere, like in the Earth one, there is a negative temperature gradient with height. Only, we’re talking about visible light here, not infrared. The negative gradient can also be observed in the form of “limb darkening”.
On a conceptual level, there is no difference between black-body radiation and line radiation. It’s all Planck; just the absorptivity == emissivity varies.
Re 176 Ray Ladbury
Ray, you have not answered my questions. Perhaps you or others in this forum could point me towards a site or papers that explain exactly and concisely what variables are in your “dynamical models”, what constraints are placed on the data entered into these models, and who decides which sets of variables are relevant, which are not, and why. In addition, could you also please explain the algorithms, with their inherent assumptions and error bars, which have been used to drive the models whose outcomes are being used to drive fundamental changes to the world’s economy. I, for one, would like to know how you understand they work before being dismissed as incompetent to talk about climate science. As I said in my initial post, I am neither a sceptic nor a warmer, but I am a scientist and want to make scientific assessments based on actual data and not rhetoric. I too am a fellow passenger on this space ship GIGO.
Pete wrote:
“Looks like the ice has taken a recent nose dive below 2007, and it’s only May.”
Pete, I expect you saw a temporary hiccup in the charting software output, which the site warns can happen. Click “Data Note” on the home page:
http://www.nsidc.org/arcticseaicenews/disclaimer1.html
re #187: ’tis true, it’s halfway between record and nominal.
Ray Ladbury writes:
I think CO2 actually does have other absorption lines, e.g. at 4.7 microns; it’s just that 14.99 microns is the real big one.
Geoff Wexler writes:
The greenhouse effect would be a bit less due to the lower total pressure.
Greg Smith writes:
Agriculture is Libya’s second largest economic sector after oil.
ANY scientist in ANY field would be familiar with “the scientific consensus” and would realize that that and peer review are how modern science works.
Nobody is saying we can fully understand it. We can understand it well enough to draw a number of conclusions. That has been true ever since Torricelli and others established that higher altitudes were generally colder and that pressure fell with altitude back in the 1600s.
Einstein’s peer reviewers were the editors of the peer-reviewed journal, Annalen der Physik, which he published in.
Greg, I found this website about climate models to be helpful:
http://tinyurl.com/5mdu9y
Does anybody have a link to a debunking of the “400 scientists” list? I think I saw this covered on Deltoid and a few other science blogs, but I don’t remember the details. I think the main point was that most of the people on the list are not climatologists. Ross McKitrick, if I’m not mistaken, is a mining engineer.
Eric (skeptic) – if you’re still on this page, I’ll post some information on clouds for you. It shows that reduction in albedo from 1985 to 1998 contributed many times more to global warming than greenhouse gases. Albedo has been increasing again in the last few years.