A few weeks ago I was at a meeting in Cambridge that discussed how (or whether) paleo-climate information can reduce the known uncertainties in future climate simulations.
The uncertainties in the impacts of rising greenhouse gases on multiple systems are significant: the potential impact on ENSO or the overturning circulation in the North Atlantic, probable feedbacks on atmospheric composition (CO2, CH4, N2O, aerosols), the predictability of decadal climate change, global climate sensitivity itself, and perhaps most importantly, what will happen to ice sheets and regional rainfall in a warming climate.
The reason why paleo-climate information may be key in these cases is that all of these climate components have changed in the past. If we can understand why and how those changes occurred, that might inform our projections of changes in the future. Unfortunately, the simplest use of the record – just going back to a point that had similar conditions to what we expect for the future – doesn’t work very well, because there are no good analogs for the perturbations we are making. The world has never before seen such a rapid rise in greenhouse gases with the present-day configuration of the continents and with large amounts of polar ice. So more sophisticated approaches must be developed, and this meeting was devoted to examining them.
The first point that can be made is a simple one. If something happened in the past, that means it’s possible! Thus evidence for past climate changes in ENSO, ice sheets and the carbon cycle (for instance) demonstrates quite clearly that these systems are indeed sensitive to external changes. Therefore, assuming that they can’t change in the future would be foolish. This is basic, but not really useful in a practical sense.
All future projections rely on models of some sort. Dominant in the climate issue are the large-scale ocean-atmosphere GCMs that were discussed extensively in the latest IPCC report, but other kinds of simpler, more specialised or more conceptual models can also be used. The reason those other models are still useful is that the GCMs are not complete. That is, they do not contain all the possible interactions that we know, from the paleo record and modern observations, can occur. This is a second point – interactions seen in the record, say between carbon dioxide levels or dust amounts and Milankovitch forcing, imply that there are mechanisms that connect them. Those mechanisms may be only imperfectly known, but the paleo-record does highlight the need to quantify these mechanisms for models to be more complete.
The third point, and possibly the most important, is that the paleo-record is useful for model evaluation. All episodes in climate history (in principle) should allow us to quantify how good the models are and how appropriate our hypotheses for climate change in the past are. It’s vital to note the connection though – models embody much data and assumptions about how climate works, but for their climate to change you need a hypothesis – like a change in the Earth’s orbit, or volcanic activity, or solar changes etc. Comparing model simulations to observational data is then a test of the two factors together. Even if the hypothesis is that a change is due to intrinsic variability, a simulation of a model to look for the magnitude of intrinsic changes (possibly due to multiple steady states or similar) is still a test both of the model and the hypothesis. If the test fails, it shows that one or other element (or both) must be lacking, or that the data may be incomplete or mis-interpreted. If it passes, then we have a self-consistent explanation of the observed change that may, however, not be unique (but it’s a good start!).
But what is the relevance of these tests? What can a successful model of the impacts of a change in the North Atlantic overturning circulation or a shift in the Earth’s orbit really do for future projections? This is where most of the attention is being directed. The key unknown is whether the skill of a model on a paleo-climate question is correlated to the magnitude of change in a scenario. If there is no correlation – i.e. if the projections of the models that do well on the paleo-climate test span the same range as those of the models that do badly – then nothing much has been gained. If, however, one could show that the models that did best, for instance at mid-Holocene rainfall changes, systematically gave a different projection, for instance of greater changes in the Indian Monsoon under increasing GHGs, then we would have reason to weight the different model projections to come up with a revised assessment. Similarly, if an ice sheet model can’t match the rapid melt seen during the deglaciation, then its credibility in projecting future melt rates would/should be lessened.
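To make the weighting idea concrete, here is a minimal sketch in Python. Everything in it is invented: the paleo skill scores, the projections, and the simple inverse-error weighting are placeholders for illustration, not a method advocated at the meeting.

```python
import numpy as np

# Hypothetical paleo skill scores: RMS error of each model against a
# (made-up) mid-Holocene rainfall reconstruction. Lower is better.
paleo_rms_error = np.array([0.8, 1.5, 0.6, 2.1, 1.1])

# Hypothetical future projections from the same five models, e.g.
# percent change in Indian monsoon rainfall under a GHG scenario.
future_projection = np.array([12.0, 4.0, 15.0, 3.0, 9.0])

# One simple (and debatable) choice: weight each model by the inverse
# square of its paleo error, then normalise the weights.
weights = 1.0 / paleo_rms_error**2
weights /= weights.sum()

unweighted_mean = future_projection.mean()
weighted_mean = np.sum(weights * future_projection)

print(f"Unweighted multi-model mean: {unweighted_mean:.1f}%")
print(f"Paleo-skill-weighted mean:   {weighted_mean:.1f}%")
# If the two means differ noticeably, the paleo test is telling us
# something about the spread of future projections; if not, it isn't.
```

The point of the exercise is only the comparison at the end: a paleo test matters for projections exactly to the extent that weighting by it shifts the multi-model answer.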
Unfortunately, apart from a few coordinated experiments for the last glacial period and the mid-Holocene (i.e. PMIP) with models that don’t necessarily overlap with those in the AR4 archive, this database of model results and tests just doesn’t exist. Of course, individual models have been used to look at many different paleo-climate events, ranging from the Little Ice Age to the Cretaceous, but this serves mainly as an advance scouting party to determine the lay of the land rather than a full road map. Thus we are faced with two problems – we do not yet know which paleo-climate events are likely to be most useful (though everyone has their ideas), and we do not have the databases that would allow us to match the paleo simulations with the future projections.
In looking at the paleo record for useful model tests, there are two classes of problems: what happened at a specific time, or what the response is to a specific forcing or event. The first requires a full description of the different forcings at one time, the second a collection of data over many time periods associated with one forcing. An example of the first approach would be the last glacial maximum where the changes in orbit, greenhouse gases, dust, ice sheets and vegetation (at least) all need to be included. The second class is typified by looking for the response to volcanoes by lumping together all the years after big eruptions. Similar approaches could be developed in the first class for the mid-Pliocene, the 8.2 kyr event, the Eemian (last inter-glacial), early Holocene, the deglaciation, the early Eocene, the PETM, the Little Ice Age etc. and for the second class, orbital forcing, solar forcing, Dansgaard-Oeschger events, Heinrich events etc.
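The second class of test is essentially a composite (superposed-epoch) analysis. A minimal sketch of the idea, using a synthetic temperature series and a handful of well-known eruption years purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic annual temperature anomalies for the years 1000-1999
# (reddish noise standing in for a real reconstruction).
years = np.arange(1000, 2000)
temps = np.cumsum(rng.normal(0, 0.05, years.size)) * 0.1 + rng.normal(0, 0.1, years.size)

# A few well-known large eruption years, used only to illustrate the method.
eruption_years = [1258, 1453, 1600, 1815, 1883, 1991]

# Stack the 5 years following each eruption and average across events,
# expressing each window as an anomaly relative to the pre-eruption year.
window = 5
composite = np.zeros(window)
for y in eruption_years:
    i = np.searchsorted(years, y)
    composite += temps[i:i + window] - temps[i - 1]
composite /= len(eruption_years)

for lag, value in enumerate(composite):
    print(f"year +{lag}: {value:+.2f} K relative to pre-eruption")
```

With real data in place of the synthetic series, the same compositing done on observations and on model output gives a direct test of the modelled response to that one forcing.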
But there is still one element lacking. For most of these cases, our knowledge of changes at these times is fragmentary, spread over dozens to hundreds of papers and subject to multiple interpretations. In short, it’s a mess. The missing element is the work required to pull all of that together and produce a synthesis that can be easily compared to the models. That this synthesis is only rarely done underlines the difficulties involved. To be sure, there are good examples – CLIMAP (and its recent update, MARGO) for the LGM ocean temperatures, the vegetation and precipitation databases for the mid-Holocene at PMIP, the spatially resolved temperature patterns over the last few hundred years from multiple proxies, etc. Each of these has been used very successfully in model-data comparisons and has been hugely influential inside and outside the paleo-community.
It may seem odd that this kind of study is not undertaken more often, but there are reasons. Most fundamentally, it is because the tools and techniques required for doing good synthesis work are not the same as those for making measurements or for developing models. It could in fact be described as a new kind of science (though in essence it is not new at all) requiring, perhaps, a new kind of scientist: one who is at ease in dealing with the disparate sources of paleo-data and aware of their problems, and yet conscious of what is needed (and why) by modellers. Or, alternatively, modellers who understand what the proxy data depend on and who can build that into the models themselves, making for more direct model-data comparisons.
Should the paleo-community therefore increase the emphasis on synthesis and allocate more funds and positions accordingly? This is often a contentious issue since whenever people discuss the need for work to be done to integrate existing information, some will question whether the primacy of new data gathering is being threatened. This meeting was no exception. However, I am convinced that this debate isn’t the zero sum game implied by the argument. On the contrary, synthesising the information from a highly technical field and making it useful for others outside is a fundamental part of increasing respect for the field as a whole and actually increases the size of the pot available in the long term. Yet the lack of appropriately skilled people who can gain the respect of the data gatherers and deliver the ‘value added’ products to the modellers remains a serious obstacle.
Despite the problems and the undoubted challenges in bringing paleo-data/model comparisons up to a new level, it was heartening to see these issues tackled head on. The desire to turn throwaway lines in grant applications into real science was actually quite inspiring – so much so that I should probably stop writing blog posts and get on with it.
The above condensed version of the meeting is heavily influenced by conversations and talks there, particularly with Peter Huybers, Paul Valdes, Eric Wolff and Sandy Harrison among others.
> angular momentum … varies depending on
> what it is related to.
Rod, can you take a bucket of water, tie a rope onto it, start swinging it so it’s circling around you, then relate it to something else so the water falls out?
Rod,
Temperature predates atomic interpretations of thermodynamics by hundreds of years. It was known to be a valid concept for gases (where degrees of freedom are largely kinetic), liquids (where things are still largely kinetic, but motion is more restricted) and solids (where motion is mostly vibrational). And the temperature was measured in the same way: insert a “thermometer” and allow it to come into thermal equilibrium with the material you want the temperature for, and read how much the material in the thermometer has expanded. Alternatively, you can measure how much the resistance of a thermistor changes. The thing is that it doesn’t matter whether the energy is kinetic, rotational, vibrational, electronic, etc. What matters is that the thermometer and the medium will exchange energy until they are at the same temperature – that is, until they come into equilibrium. Note that the concept of equilibrium is critical here. Two bodies in contact and in equilibrium will have the same temperature. Likewise, if a system is not in equilibrium – e.g. if equipartition does not apply – then you’ll get ambiguous answers for its temperature.
I have emphasized that temperature is defined as the partial derivative of energy wrt entropy, holding all other variables (e.g. volume, numbers of particles, etc.) constant. This definition actually predates the kinetic interpretation of temperature by several decades. In fact, the only reason the kinetic interpretation was introduced was because the physics of billiard balls is easier to interpret than that of particles with internal degrees of freedom. Where you are running into trouble is in trying to apply a kinetic interpretation of temperature to a single particle/molecule. This quite simply is not valid. Statistical mechanics only works in the limit of large numbers of particles – and that includes any definition of temperature. This is one reason why nanoparticles are such a hot topic: how small does the particle have to get before you start seeing significant departures from normal thermodynamic behavior? The other frontier in stat mech is nonequilibrium stat mech. This is very difficult. Fortunately, planetary atmospheres are sufficiently close to equilibrium that the normal rules of stat mech apply.
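For reference, the definition Ray is using, together with the kinetic relation that only holds as an average over a large equilibrium ensemble (ideal monatomic gas shown):

```latex
% Thermodynamic definition of temperature (volume and particle number held constant):
T = \left(\frac{\partial U}{\partial S}\right)_{V,N}
% Kinetic relation, valid only as an ensemble average for an ideal monatomic gas in equilibrium:
\left\langle \tfrac{1}{2} m v^{2} \right\rangle = \tfrac{3}{2} k_{B} T
```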
Ray, Hank re angular momentum: First, Hank, we’re discussing a mass with linear velocity and momentum and not physically tied to a disc-blob or its axis but with angular momentum about that axis. Getting angular momentum from a rotating (about an axis) mass relative to another different axis, ala the swinging bucket of water, is beyond my comprehension, so I need to go one step at a time.
I think we concluded and agreed 1) the bullet has angular momentum ( L ) about its target (a fixed rotatable disc). 2) That L is constant through time and distance from where the bullet was shot (100m from the disc blob say) all the way to the disc. 3) When the bullet strikes the disc (off center) and its applied torque spins the disc, the combined L_subT of the spinning disc with its embedded bullet will be the same as the bullet’s initial L. 4) if the bullet is mis-aimed and misses the target disc, it still has L relative to the disc. And, 5) the mis-aimed L is slightly different from the original L by virtue of a slightly different angle between P and R. The latter two might be in contention….
I contend: The bullet does not have to actually strike (and end up with a physical connection with) its rotatable target to have L about it. It can miss it big. If it can miss the original target and still have L about it, it ought to be able to also miss another different fixed rotatable target significantly distant from the first target (just to make it easier to draw), though in the same fixed unmoving coordinate system. The bullet would then have L_2 about the second target at the same instant, and from the same position, at which it has L_1 about the first target, wouldn’t it? This L about the second “target” is numerically different from the first. This tells me that a single bullet with specific, explicit and inherent linear momentum can have two different numerical angular momenta depending on the relative locations of the two targets. What is wrong with this? Can’t have L about target #2? Then how can it have L about target #1, at least if the aim is a bit off?
Finally, if it has two Ls, it’s a trivial jump to 3, 4, 20, 87, 1,000,000, etc.
Ray (502), I don’t think we are going to resolve this, but let me ask about just one point for now. You said “… Likewise, if a system is not in equilibrium–e.g. if equipartition does not apply–then you’ll get ambiguous answers for its temperature….”
I thought equipartition was an intramolecular (single-molecule) property. If equipartition is off kilter, why would one get an ambiguous temperature, as (you claim — though as you know, I disagree) temperature for a single molecule is a no-op and doesn’t exist? Or did I misread your statement?
ps to my #503: Angular momentum characteristics are actually more obvious than the complicated stuff I got bogged down in. My bad. In a forceless constant frame/coordinate system containing bodies and a fixed observer, a body’s numerical linear momentum never varies no matter what. A body’s numerical (and vector) angular momentum can vary infinitely depending on what axis of rotation the observer chooses — my linearly moving bullet relative to any axis of all those other bodies in the example, or, say, a single cylinder rotating about 1) its centerline, or 2) an axis through its cross-section, or 3) an axis anywhere.
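Rod’s point here is standard mechanics: angular momentum is defined relative to a chosen reference point, so the same particle has a different L about each choice. A minimal numerical check, with made-up numbers for the bullet and the two reference points:

```python
import numpy as np

m = 0.01                               # bullet mass, kg (made up)
v = np.array([400.0, 0.0, 0.0])        # bullet velocity, m/s
r_bullet = np.array([0.0, 0.0, 0.0])   # bullet position at some instant

p = m * v  # linear momentum: the same regardless of any reference point

# Two different reference points ("targets"/axes), chosen arbitrarily.
target_1 = np.array([100.0, 2.0, 0.0])
target_2 = np.array([100.0, 30.0, 0.0])

# Angular momentum about each point: L = (r_bullet - r_target) x p
L1 = np.cross(r_bullet - target_1, p)
L2 = np.cross(r_bullet - target_2, p)

print("L about target 1:", L1)
print("L about target 2:", L2)
# Both are conserved in time (no torque about either point in a force-free
# frame), but they are numerically different, exactly as Rod argues.
```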
Different thought: can I assume the angular momentum of a photon, as small as it is, has to appear only in the rotation of a molecule after absorption; and likewise has to be taken from rotation when a photon is emitted? Does this somehow (or noticeably) alter the rotational energy pickup of a photon’s energy?
Re #505
Rod,
In classical physics, linear momentum (as opposed to angular momentum) is m * v, i.e. the product of mass and velocity. As Einstein showed, photons have no mass but they do have momentum, which is equal to h / wavelength, where h is Planck’s constant.
Thus, when photons collide with molecules, momentum is conserved, and the molecules do recoil.
If two mono-atomic argon molecules collide, then the result is similar to that of billiard balls. However, if two tri-atomic carbon dioxide molecules collide, their shape makes the result much more complicated. For instance, if they meet head on oxygen to oxygen, then the result will be a stretching oscillation, but if they meet head on oxygen atom to middle carbon atom, the result will be a bending oscillation. If it is a glancing blow, the result will be a rotation, but in general it will be a combination of all three.
However, with CO2 at only a few hundred parts per million of the atmosphere, CO2–CO2 collisions are unlikely. CO2 molecules are much more likely to collide with diatomic molecules such as O2 and N2. The geometry of the collisions with these molecules will also affect the resulting excitation of the CO2 molecule. It is all very complicated. It is all a matter of quantum mechanics :-(
HTH,
Cheers, Alastair.
Alastair, thanks for waking me up. I once knew photons have linear but not angular momentum; must be the senility setting in :-P . This means the photon momentum manifests in translational movement of the molecule, while the photon energy appears in vibrational or rotational molecular energy, except for a teeny amount that has to go into translation to cover the increased momentum. True?
Thanks also for the geometry of collisions explanation.
Re #507
Hi Rod,
The photon has very little momentum and so has very little direct effect on the translation (kinetic temperature) of the molecule. What it does have lots of is electromagnetic energy, and this goes into the dipole vibrations of the greenhouse gas molecule. It is only molecular vibrations and rotations that change the electric dipole moment which can absorb photons.
For instance, CO2 is a linear molecule, and rotation about that line as an axis does not produce an oscillating field, but because H2O is bent, its rotation is IR active. Rotation of the CO2 molecule about the central carbon atom also produces a field, so CO2 has two IR-active rotation modes – actually one, doubly degenerate.
I better not add more as I am almost out of my depth :-)
Cheers, Alastair.
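To put rough numbers on Alastair’s point, here is a back-of-envelope sketch assuming absorption of a 15 μm photon by a single CO2 molecule (the temperature and constants are just nominal values):

```python
h = 6.626e-34      # Planck's constant, J s
c = 2.998e8        # speed of light, m/s
kB = 1.381e-23     # Boltzmann's constant, J/K

wavelength = 15e-6           # 15 micron CO2 band
m_co2 = 44 * 1.661e-27       # mass of one CO2 molecule, kg
T = 288.0                    # a nominal air temperature, K

p_photon = h / wavelength            # photon momentum
E_photon = h * c / wavelength        # photon energy

recoil_speed = p_photon / m_co2                  # translation picked up from the momentum kick
thermal_speed = (3 * kB * T / m_co2) ** 0.5      # typical thermal speed for comparison

print(f"photon energy: {E_photon:.2e} J")
print(f"recoil speed:  {recoil_speed:.2e} m/s")
print(f"thermal speed: {thermal_speed:.2e} m/s")
# The recoil speed is well under a millimetre per second against thermal
# speeds of a few hundred m/s, so the momentum kick is negligible, while
# the photon energy (about 0.08 eV) matches a CO2 vibrational quantum,
# which is Alastair's point.
```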
Typically, rotational bands are in the microwave, vibrational bands in IR
Rod, remember that the photon has momentum h/lambda and energy hc/lambda. c is a pretty big number. The photon will also have angular momentum, but since it will not interact much beyond its wavelength (small perpendicular distance), the angular momentum isn’t particularly relevant.
As to equipartition–it does not apply to single molecules, but really to large assemblies of molecules, and if you want to get technical, to ensembles (many repetitions) of such assemblies. I think somewhere in my collection of technical books, I have a history of statistical mechanics if you are interested. I’m visiting relatives for my niece’s graduation right now, so I don’t have the title, but it might provide you with a little bit of background on why physicists define things as they do. Energy (or kinetic energy) and temperature are not equivalent concepts. Energy is extensive. Temperature is intensive. Temperature tells us about the flow of energy. It is only when you are dealing with equilibrium systems that you can come up with unambiguous relations. Expecting those to hold for all systems is asking for trouble.
So the massless photon which might or might not even exist has both angular and linear momentum. Who would’ve thunk it! ;-)
So the free market global economy, which defeated world communism, has hit both Peak Oil and the climate change catastrophe. Who would’ve thunk it! :-(
Free market global economy is efficient, not necessarily far sighted!
So we are about to discover!
Ray, since neither one will budge for technical/physics rationale, let me try philosophical: Why can a massless and possibly imaginary entity like a photon easily be ascribed physical characteristics like momentum, angular momentum, velocity, energy, “characteristic” temperature (à la Planck), and polarization, but individual molecules (real entities with real mass and structure, you know), you seem to argue, cannot?
Ray, Alastair, et al: A follow-up curious/interest question, re a photon’s angular momentum (L): What axis is its angular momentum about? Would it be like the equivalent of a single rotating sphere with L about its own axis of spin? Second, is it correct to say that an absorbed photon’s L is conserved and has to manifest itself only in molecular rotation? And if about the/a molecule’s central axis of rotation, is that consistent with the photon’s axis? And, since incredibly small, would the added rotational energy come anywhere near the allowable rotational energy levels (though it would seem to fall within the narrow band(s) of the levels..?? — is there a band around the zeroth level?)? Or is the photon’s L more of a virtual, imaginary, or characteristic quality (nonetheless with effects) and not actually “real” (like electron spin??)? Or does anybody really care?
Rod, relativistically, a photon has momentum because it has energy. Look at the relationship of momentum and energy relativistically – E=hc/lambda, p=h/lambda. And, no, a single photon does not have a temperature.
Also, you are getting wrapped around the axle wrt coordinate transformation. It doesn’t matter where you put the axis. The physics (what happens when two bodies interact) won’t change. In your example of the photon and the molecule, the photon won’t interact unless it passes within a wavelength of the molecule–so the change in angular momentum will be within the quantum mechanical uncertainty of the excited state. Again, which axis you take is not the interesting part–the physics is invariant.
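For reference, the standard relativistic energy-momentum relation behind Ray’s statement that a massless photon still carries momentum:

```latex
% General relativistic energy-momentum relation:
E^{2} = (pc)^{2} + (mc^{2})^{2}
% For a photon, m = 0, so:
E = pc \quad\Rightarrow\quad p = \frac{E}{c} = \frac{h\nu}{c} = \frac{h}{\lambda}
```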
Ray says, “…so the change in angular momentum will be within the quantum mechanical uncertainty of the excited state.”
That makes sense. So does your comment re L’s axis. Thanks.
To the main theme, read: “Tipping elements in the Earth’s climate system” (Timothy M. Lenton, Hermann Held, Elmar Kriegler, Jim W. Hall, Wolfgang Lucht, Stefan Rahmstorf, and Hans Joachim Schellnhuber): “Society may be lulled into a false sense of security by smooth projections of global change. Our synthesis of present knowledge suggests that a variety of tipping elements could reach their critical point within this century under anthropogenic climate change. The greatest threats are tipping the Arctic sea-ice and the Greenland ice sheet, and at least five other elements could surprise us by exhibiting a nearby tipping point. This knowledge should influence climate policy, but a full assessment of policy relevance would require that, for each potential tipping element, we answer the following questions: Mitigation: Can we stay clear of ρcrit? Adaptation: Can F̂ be tolerated?”
Seems to me that society “may be lulled into a false sense of security” not as much by “smooth projections” (which frankly most powerholders don’t even bother to understand beyond the point of politically correct gobbledegook, insofar as they think that’ll help them catch some extra voters) as by a very modern human tendency to automatically ignore what doesn’t fit into the megalomaniac expectations stemming from the dogmatic and scholastic exercise of economic “science” by the real priesthood of our times ex media cathedra.
Re #1: Certainly mankind will never be able to do the job of “Laplace’s demon”:
“In the history of science, Laplace’s demon is a hypothetical “demon” envisioned in 1814 by Pierre-Simon Laplace such that if it knew the precise location and momentum of every atom in the universe then it could use Newton’s laws to reveal the entire course of cosmic events, past and future.”
http://en.wikipedia.org/wiki/Laplace's_demon
But that doesn’t mean we can’t say anything about climate change past and present. There are lots of possible positions between total determinism and total uncertainty/scepticism.
As for “the free market global economy” (#512): around 50 pct. of the global economy is internal trade/transactions within fewer than a hundred transnational corporations. Not very much “free market” there, except in the media mythologies, but certainly lots of oligopoly/monopoly capitalism and centralistic planning (without democracy, as in the Soviet Union).
> megalomaniac expectations stemming from the
> dogmatic and scholastic exercise of economic
> “science” by the real priesthood of our times
> ex media cathedra.
This deserves some sort of writing award, I think.
What Hank Roberts said in #519.
Hank,
I think he meant “conventional wisdom.” If so he is right :-(
Cheers, Alastair.
Dear Hank and David,
He must have been practising for the Bulwer-Lytton Contest.
Not exactly on topic, but relating to much of the above discussion:
I have found your site very informative despite the technical nature of much of the discussion, even for someone like me with no science since high school (I’m not counting math, economics and statistics).
I would like to commend you for your efforts in dealing with the distractions of the so-called “skeptics”.
While you will clearly never convince them (there comes a point when the distinction between ignorance and wilful ignorance can no longer be overlooked), your patient response to their faux-naif questions and repetitive red-herring-dragging is not to no avail.
It has been less than a month since I started trying to educate myself about climate change.
In this short space of time I have reached the point where even I can spot almost all the inconsistencies and fudges in posts from the gusbobbs of this world, without having to wait for your expert replies. I am sure there are many other interested non-experts in my position.
While dealing with such distractions has no doubt interfered with this site’s function as a clearing house for genuine ideas exchanged between experts in your field, it has been beneficial in highlighting the paucity of many of the wrong-headed arguments being bandied around.
If nothing else, the quality of one point of view can often be judged by the quality of the arguments ranged against it.
Thank you.