Imagine for a moment if Microsoft had 24 competitors around the world, each building their own version of Microsoft Word. Imagine further that every few years, they all agreed to run their software through the same set of very demanding tests of what a word processor ought to be able to do in a large variety of different conditions. And imagine that all these competing companies agreed that all the results from these tests would be freely available on the web, for anyone to see.
Then, people who want to use a word processor can explore the data and decide for themselves which one best serves their purpose. People who have concerns about the reliability of word processors can analyze the strengths and weaknesses of each company’s software. Then think about what such a process would do to the reliability of word processors. Wouldn’t that be a great world to live in?
Well, that’s what climate modellers do, through a series of model inter-comparison projects. There are around 25 major climate modelling labs around the world developing fully integrated global climate models, and hundreds of smaller labs building specialized models of specific components of the earth system. The fully integrated models are compared in detail every few years through the Coupled Model Intercomparison Projects. And there are many other model inter-comparison projects for various specialist communities within climate science.
Have a look at how this process works, via this short paper on the planning process for CMIP6.
Prokaryotes, thanks for the reference. I know that this is off topic, but in the spirit of your post I recommend OpenOffice for competition with Microsoft Office. It is free and completely open source. I have been using it for several years.
Which projection for atmospheric CO2 concentrations would you recommend? I’m looking for a projection that has been roughly accurate so far. Currently I’m using RCP 8.5, but that scenario’s assumptions (e.g. later-century coal use) might be too high, even with no direct efforts to reduce CO2 emissions.
The few other projections I have looked at, including RCP 6, haven’t been that accurate so far, and e.g. have population estimates that are too low.
The projection is for the third chart found at the link below. (The page is an attempt at a succinct explanation of global warming for a private discussion — suggestions are welcome).
Hopefully, this is the place to make my problem your problem. Here’s the problem:
It’s an article from Climatewire reprinted in Scientific American: http://www.scientificamerican.com/article/pollution-sours-pacific-ocean-more-than-expected/
“Pollution Sours Pacific Ocean More than Expected”
It’s explained fully in the article: dissolved CO2 in tropical Pacific waters is rising (in places) at a rate 50% higher than the atmospheric rate (3.3 ppm/yr versus 2 ppm/yr in air). The data set is from seven buoys with measurements starting in 1998. ENSO and decadal signals have started to emerge, and JISAO has determined that the upwelling water, with combined anthropogenic and natural CO2 in the mix, left the surface about 10 years ago.
The problem, and there’s a wide variety of SWAGs available, is trying to visualize mechanisms by which that water returns upward, warms, and reaches the surface carrying an extra charge of CO2.
Whatever comes out in the form of research or the next round of studies, it’s a puzzle that hints at an unsuspected reservoir of acidified ocean water.
3: “Wait, isn’t using all of them more accurate than using any of them?”
Is using Newton’s laws and relativity to solve the same problem and averaging the results more accurate than using them separately?
Well… No it isn’t, is it?
Comment by Vendicar Decarian — 29 Mar 2014 @ 3:04 AM
Can anyone answer the following?
What is the typical simulated time interval dt over which global climate models are integrated? Is it on the order of hours, minutes, or seconds?
Do the models wait for the slowest cell to compute before moving on to the next iteration (strong coupling), or are the cells just computed based on the best available inputs from neighboring cells at the moment of the calculation (weak coupling)?
Comment by Vendicar Decarian — 29 Mar 2014 @ 3:20 AM
El Niño is said to be forming. If it turns out to be a large one like 1998’s, what would it mean for us all? With predictions/projections now sitting on a baseline 0.8°C warmer, how would it differ from the 1998 event?
First, thanks for splitting this thread out from the clutter. I’d given up on the open forum for this month, and was resigned to waiting for a clean start in April – though now there is a risk of losing the conversation as the month shifts again…
After a winter with Arctic ice trends that have fallen to previous record lows, a shift into El Niño mode, with release of much of that stored heat, would have all the more drama in impact. As an itinerant observer here, I’m in no position for credible predictions of the climate science, but I can imagine that combination leading to one of those jagged blips that resets the climate escalator, and becomes the new cherry to pick as baseline for the next inexplicable hiatus, that proves we all are nuts to think it’s getting hot in this pot of water (on the stove, fire set at ‘high’).
For what it’s worth, I’m conflicted between climate “confusionists” vs “confusioners”. Derivations of ‘denial’ all beg for argument, while ‘confusion’, at least on the surface, implies some hope for sorting things out. My little niche of science is in human behavior, where it is not uncommon to find that the words we use can have real power.
Hi. Let me say first that I am not a scientist, although I do have a 1970 B.Sc. in physics. I’m a moderately regular visitor here, and am very interested in climate, the atmosphere, energy and much more.
To that end, I’ve just finished some videos that compare Global Forecast System surface wind and total cloud water model animations with EUMETSAT cloud imaging animation for the months of November and December 2013. I made the model animations using the visualizations on Cameron Beccario’s earth site.
The most up to date consensus from global climate models predicts warming in the Northern Hemisphere (NH) high latitudes to middle latitudes during boreal winter. However, recent trends in observed NH winter surface temperatures diverge from these projections. For the last two decades, large-scale cooling trends have existed instead across large stretches of eastern North America and northern Eurasia. We argue that this unforeseen trend is probably not due to internal variability alone. Instead, evidence suggests that summer and autumn warming trends are concurrent with increases in high-latitude moisture and an increase in Eurasian snow cover, which dynamically induces large-scale wintertime cooling. Understanding this counterintuitive response to radiative warming of the climate system has the potential for improving climate predictions at seasonal and longer timescales.
Comment by Pete Dunkelberg — 29 Mar 2014 @ 5:39 PM
Surprisingly, tamino doesn’t appear to have said much about Pielke on 538 — and he is precisely the kind of data analyst we might hope Silver would be interested in hiring.
Comment by Pete Dunkelberg — 30 Mar 2014 @ 9:17 AM
Is this getting interesting? Click through the previous years for comparison. Starting with at least the last 45 days of 2013, the overall Arctic temperature has stayed above the long-term average for months. This looks like a record.
What boosts Arctic temperatures during the long polar night? Putting a lid on it. Under clear dry Arctic winter skies our earthly energy flows back into space unimpeded. Given cloud cover the Arctic temperature soars, sometimes above 260 K.
Comment by Pete Dunkelberg — 30 Mar 2014 @ 9:59 AM
Sorry Pete, I can’t find it now by searching;
maybe I dreamed it, or the indexing hasn’t caught up.
In passing though, that search happened to turn up this reply to a question (that I asked both here and there) that I think is worth more attention from the physicists:
BillD | March 13, 2014 at 4:48 pm |
Hank—as an ecologist, I have to say that changes in seasonal timing (phenology), changes in populations, and the impacts of changes in thermal stratification and ice cover (in lakes and oceans) are very strong signals, stronger than what most ecologists (including me) would have expected from the temperature record. Sometimes I get a bit irritated when physicists suggest that temperature records are the only evidence. Evidence for rapid climate change from changes in ecology is very strong and, of course, completely independent of the temperature records.
I know Jim Bouldin can comment on that, maybe over at his blog, but — physicists! phenology!
The most sensitive instrument ever evolved for detecting changes in physical systems is: living systems. As Terry Pratchett wrote:
It is often said that before you die your life passes before your eyes. It is in fact true. It’s called living.
There are “Bioblitz” events that try to find everything visibly living in an area, given enough enthusiastic amateur eyes, just as a census. Do that repeatedly and the changes in the world will leap out at you.
I’d like to follow up pete best’s and Phil Mattheis’ post with a humble suggestion that there be a main post on all things El Nino. This is likely to be the biggest climate story over the next two years, and it would be nice to be up to date on the latest in prediction, causes, consequences, and connection with GW.
As for Minnesota, I recently saw the snarky headline “Evidence of Liquid Water on the Surface of Minnesota!” We are indeed starting to thaw out after a winter that was long by recent standards, but standard by longer comparisons. 60F predicted for today!
BeezelyBub wrote: “… extinction deniers on the left … progressives seem to magically think … A lot of urban people think that … Renewable energy is a huge business run by mega corporations like General Electric, the same folks who brought us Fukushima … Solar manufacturing plants produce 500 tons of hazardous sludge each per year …”
So you set up a Diogenes-free thread, and along comes another boorish ranting troll spewing interminably verbose incoherent drivel full of attacks on cartoon strawmen (“extinction deniers”, “progressives”, “urban people” — oh my!), inanely contentious non sequiturs (renewable energy = big business = Fukushima) and ludicrous falsehoods.
The moderators have my sympathy.
[Response: We are trying to keep this thread tedium free as well – that one slipped by. Thanks – gavin]
Comment by SecularAnimist — 30 Mar 2014 @ 12:07 PM
Re: #449 old March UV thread; Within the context of discussions unbounded by known physics occurring upon policy threads, wherein I spend much time, this peek at contemporary contrarians (who seem so constrained), was for me, quite helpful. Thank you for flagging it, MARodger.
I have remarked on the possibility that reduced expansion due to deeper ocean warming is masking increasing contributions to sea level rise from ice melt. I see that the usual suspects are working on it.
Unfortunately they only use 0–700 m data for the period 1994–2006, and Argo down to 1500 m thereafter. The estimated error in the thermosteric component is quite large (1.4 mm/yr).
The paper is more concerned with removing shorter-term (ENSO and other) fluctuations from sea level rise measurements; in brief, they find around 3.2 mm/yr over the last two decades from satellite altimetry.
> isn’t using all of [the models] more accurate
> than using any of them?”
Serious question, despite VD’s snark; I think I recall this being discussed a while back in comparing models, that the combined result is closer to matching reality. Anyone help me check that recollection?
Here is my big picture question: Given that the earth was beginning a cooling phase until AGW, heading toward an ice age in roughly 1500 years, at what point will the diminishing sunlight intensity overcome the insulation provided by the excess CO2? Beyond that point will the cooling by lowered solar inputs correct for the anomaly quickly or will it just delay the onset by a couple hundred years?
I have remarked on the possibility that reduced expansion due to deeper ocean warming is masking increasing contributions to sea level rise from ice melt.
I’ve wondered about that, and your comment impelled me to investigate. I had understood that water isn’t compressible, so I thought the same increment of heat should cause the same increment of expansion regardless of depth. According to the USGS:
But, squeeze hard enough and water will compress—shrink in size and become more dense … but not by very much. Envision the water a mile deep in the ocean. At that depth, the weight of the water above, pushing downwards, is about 150 times normal atmospheric pressure (Ask the Van). Even with this much pressure, water only compresses less than one percent.
Seems like water compression would be a small component of the SLR noise, no?
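The USGS figure is easy to check against the textbook bulk modulus of water. A quick back-of-envelope sketch (the modulus value is a standard approximation, not from the article):

```python
# Back-of-envelope check of the USGS compression figure quoted above.
# Assumptions (not from the article): bulk modulus of water K ~ 2.2 GPa,
# pressure at ~1 mile depth ~ 150 atm above the surface.

K = 2.2e9            # bulk modulus of water, Pa (approximate)
atm = 101325.0       # Pa per atmosphere
dp = 150 * atm       # pressure increase at ~1 mile depth, Pa

# Fractional volume change: dV/V = dp / K
compression = dp / K
print(f"fractional compression at 150 atm: {compression:.3%}")
```

That comes out to roughly 0.7%, consistent with the “less than one percent” in the quote.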
Lambert and Boer (2001) show that for the CMIP1 ensemble of simulations of current climate, the multi-model ensemble means of temperature, pressure, and precipitation are generally closer to the observed distributions, as measured by mean squared differences, correlations, and variance ratios, than are the results of any particular model.
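The error cancellation Lambert and Boer describe is easy to demonstrate with synthetic data. A rough sketch (the “models” below are just noisy, biased copies of a made-up “truth”, not CMIP output):

```python
import numpy as np

rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0, 2 * np.pi, 100))   # stand-in "observed" field

# 20 synthetic "models": truth plus an independent constant bias and noise
models = np.array([truth + rng.normal(0, 0.1) + rng.normal(0, 0.3, truth.size)
                   for _ in range(20)])

rmse = lambda x: np.sqrt(np.mean((x - truth) ** 2))
individual = [rmse(m) for m in models]
ensemble = rmse(models.mean(axis=0))

print(f"best single model RMSE: {min(individual):.3f}")
print(f"ensemble-mean RMSE:     {ensemble:.3f}")
```

Because the biases and noise are independent across models, averaging shrinks them by roughly the square root of the ensemble size, so the multi-model mean beats even the best single model.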
Brian, 1) sunlight intensity will not diminish. That’s not how it works. Instead it works like this, sort of. Summer sunlight in the northern hemisphere (NH) diminishes and increases in the southern hemisphere. Further cooling (if we were doing that) would come from, among other things, an expanding summer NH snowline, hence greater albedo for the planet. (The SH has much less land area, especially since Antarctica does not count for this purpose. It stays icy anyway). Changes in CO2 also occur. A colder ocean absorbs it for one thing.
NH summer snowline is receding due to warming from CO2. No ice age is coming, or at least not for 125 k years. This was probably settled by around 1900 or not much later.
Comment by Pete Dunkelberg — 30 Mar 2014 @ 4:47 PM
@29 Brian Foster – “heading toward an ice age in roughly 1500 years” There’s no expected Ice Age in ‘roughly 1500 years’. The Holocene inter-glacial stability was already an anomaly before civilization added to the mix. It didn’t reach a ‘bishop’s peak’ and fall back – it flattened out about 7,500 years ago.
“at what point will the diminishing sunlight intensity overcome the insulation provided by the excess CO2?” It won’t. The 6°C swing has been shifted upwards by almost 2°C (so far).
The last glaciation period, post-Eemian, started with approximately 400 years of ‘sudden’ cooling (possibly volcanically driven). It was followed by stuttering colder and drier and icier conditions, with reverses to warmer intervals, that lasted about 40,000 years. GHG levels are now higher than they were at the start of the current glacial period, the Quaternary, some 2.6 million years ago. Long before any cooling that could produce a serious glaciation period, the concentrated Greenhouse Effect will serve up Hothouse Earth and Garbage Dump Earth.
I doubt that the CMIP group is doing it wrong. Perhaps the clarification you need is by now so obvious to the CMIP group and to Steve that it went unsaid.
Steve is reasonably well acquainted with the subject and may know what he is talking about.
“Well, that’s what climate modellers do, through a series of model inter-comparison projects.” he says and links to a paper about the next round of doing this. Check it out.
So maybe your objection is muddled. First, distinguish between: averaging all models (when the purpose of CMIP is to pick out the best one(s), or the best for a particular purpose); averaging multiple runs of the same model with different initial conditions; and using model A to study process a when A happens to be quite good at a and model B is not.
Comment by Pete Dunkelberg — 30 Mar 2014 @ 5:59 PM
Thanks Rick Brown for that link ;)
plus Tweet from Mike Mann
Michael E. Mann @MichaelEMann
Disinformation campaign to fool public into thinking new #IPCC report downgrades risk (it does opposite) is something to behold. #Immoral
Comment by Pete Dunkelberg — 30 Mar 2014 @ 6:10 PM
the hypothetical economic costs associated with climate events can be reduced by calculating the probability of the event across the ensemble, rather than using a deterministic prediction from an individual ensemble member.
That’s the reference I was recalling.
Same point; picking one model to affirm the answer you want is less accurate than using all the available information to get closer to what’s real.
For example, I analyzed the tropical atmospheric temperature change in 102 of the latest climate model simulations covering the past 35 years. The temperature of this region is a key target-variable because it is tied directly to the response to extra greenhouse gases in models. If greenhouse gases are warming the Earth, this is the first place to look.
In a rather disconcerting result, I found all 102 model runs overshot the actual temperature change on average by a factor of three. Not only does this tell us we don’t have a good grasp on the way climate varies, but the fact all simulations overcooked the atmosphere means there is probably a warm bias built into the basic theory – the same theory that we’ve been told is “settled science.” I don’t know about you, but to me, being off by a factor of three doesn’t qualify as “settled.”
Anyone seen this result? Any response? Gavin? This should be right in your wheelhouse…
[Response: There are a number of points worth making. First, the ‘settled science’ trope is just a rhetorical strawman (see my 2009(!) comment on same). Second, his ‘for example’ is another rhetorical flourish to imply that many other aspects show the same thing – this is simply not true. There are countless metrics that one could look at – and at best we would expect some to be higher, some lower and a lot about right (which we do). Only pointing to one metric that is lower than observed is cherry-picking. But all of that is somewhat eclipsed by the fact that Christy’s figure is not properly published (and so it is not really clear what he is plotting), and has already been called out for inconsistent smoothing, selective baselining and the lack of error bars on the observations (which are large). A clearer view of how tropospheric trends in the models match the observations is seen in Santer et al (2012), in particular fig 3. – gavin]
| What is the typical simulated time interval dt, over which global climate
| models are integrated. Is it on the order of hours, minutes, or seconds?
It typically varies across the model components. E.g., in the EC-Earth model we had 10 minutes in the atmosphere, 20 minutes in the ocean, and 1 hour in the sea-ice model.
| Do the models wait for the slowest cell to compute before moving on to the next iteration
| (strong coupling), or are the cells just computed based on the best available inputs from
| neighboring cells at the moment of the calculation (weak coupling)?
I am not aware of any CMIP5-class models that do this “weak coupling” method. Various load-balancing methods are used to average-out the workload per node to avoid slow cells delaying the overall model.
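For anyone curious how components with different timesteps can be advanced between exchanges, here is a toy sketch. It is not EC-Earth code; the 1-hour coupling interval and the advance function are assumptions for illustration, with the per-component timesteps borrowed from the EC-Earth example above:

```python
# Illustrative sketch (not EC-Earth code) of multi-rate coupling:
# each component sub-cycles with its own timestep between coupling
# exchanges, using the fluxes fixed at the start of the window.

COUPLE_DT = 3600                      # coupling interval, seconds (assumed)
STEPS = {"atmosphere": 600, "ocean": 1200, "sea_ice": 3600}

def advance(component, dt, fluxes):
    """Stand-in for one component timestep; a real model updates state here."""
    return f"{component} stepped {dt}s with fluxes {fluxes}"

def run_coupling_window(fluxes):
    log = []
    for component, dt in STEPS.items():
        # Sub-cycle this component up to the next coupling time.
        for _ in range(COUPLE_DT // dt):
            log.append(advance(component, dt, fluxes))
    return log

log = run_coupling_window(fluxes={"heat": 1.0})
print(len(log))   # 6 atmosphere + 3 ocean + 1 sea-ice steps
```

In a real coupled model the components run concurrently on separate nodes and a coupler interpolates and exchanges the fluxes; the sub-cycling ratio is the part this sketch is meant to show.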
Comment by Alastair McKinstry — 31 Mar 2014 @ 9:51 AM
Climate sceptics successfully deleted the German Wikipedia article on the scientific consensus on climate change. There is currently a discussion/attempt to revert the deletion (German).
The URL is fine, but does not the table show an increasing coefficient of thermal expansion with depth?
Take the numbers for 5ºC @ 3.5% salinity.
I assume the physics works like that because, while the water compresses with pressure/depth, it still requires the same wiggle room to contain the extra thermal energy.
The graph at this link shows the region of the temperature/pressure graph where ‘thermal expansivity’ increases with pressure for pure non-salty water. Salinity is however a significant factor.
Actually most of the warming is due to reduced sea ice in the Bering Sea and at the Atlantic ice edge, not due to clouds. The warming over the pack itself looks fairly normal; there are two reasons for this: 1) disruption of the climatological atmospheric inversion, 2) thinner sea ice allowing greater heat flux through than over most of the climatology mean period.
The top right quarter shows the strongest warming; this is due to reduced ice cover around Svalbard. Because there was typically always extensive ice in that sector, the baseline average is of the order of −20°C, with the ice insulating the atmosphere from the ocean. At present the ice edge in the Atlantic is further north than average, so open ocean is keeping temperatures much warmer than average.
The temperature is not constant from surface to depth. At depth (around 400 bar at ~4,000 m) the temperature might be −2°C while it is 30°C at the surface. The salinities also change, but I think 35 parts per thousand is a good place to start.
#37 – Dave Erickson – Christy and Spencer are promoting a new graph of satellite vs model results. Spencer presented the graph with his written testimony at a hearing on 18 July last year before the Senate Environment and Public Works Committee. See Figure 2 [HERE].
Notice that Spencer uses tropical mid-troposphere satellite data, which I take to mean their TMT product. Christy’s latest version adds more model runs and averages the model runs into a single curve, then reduces the Y-axis scale a bit to make the model results look larger. Trouble is, the TMT includes a cooling influence from the stratosphere, a long-known problem which was the reason they developed their TLT back in 1992. They also show radiosonde data, which may have been adjusted by simulating TMT radiances at the TOA. If so, we can only guess whether they applied the same “adjustment” to each model run, since they provide no references for their efforts. I suspect they didn’t, which would thus make the figure appear to show a great difference between the model results and the data. They also do some other funny things, such as adjusting the curves so that the trend lines cross at 1979, which forces the model results upward more than the TMT and sonde data sets, given the apparently greater trend in the model results. Spencer claims to have applied a 5-year running average to the TMT and sonde data, then shows 34 data points in the graph. But, curiously, a proper centered moving average must reject 4 years of data out of the 34 years available from 1979 through 2012.
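The moving-average point is easy to verify numerically. A quick sketch (random data standing in for the real series):

```python
import numpy as np

# A centered 5-year running mean cannot be computed for the first two and
# last two years, so 34 annual values yield only 30 smoothed points.
years = np.arange(1979, 2013)            # 1979..2012 inclusive: 34 values
data = np.random.default_rng(1).normal(size=years.size)

smoothed = np.convolve(data, np.ones(5) / 5, mode="valid")
print(years.size, smoothed.size)   # 34 30
```

Showing 34 points after claiming a 5-year running average is therefore self-contradictory unless some padding or one-sided scheme was used.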
I think that this is all a repeat of their claim that there is a “missing warming” in the upper tropical troposphere, a claim which has seen considerable debate, as discussed in Santer et al. (2012).
Chris @ 42, thanks mucho for that analysis! So the unusually high base line for Arctic temps (perhaps starting sometime last November?) is the warmth near Svalbard. The high jumps in Arctic temperature, which occur nearly every winter, must still be due to cloudiness? And last year’s cold spell around day 50 would indicate very clear skies?
Comment by Pete Dunkelberg — 31 Mar 2014 @ 9:32 PM
That is a very helpful observation. I was thinking the deep ocean has such a small temperature gradient that it would have no impact. But…
Digging into the numbers you linked to @38:
Surface:    23,  51, 114, 207
2000 m:     80, 105, 157
4000 m:    132, 152, 195
6000 m:    177, 194, 230
8000 m:     --, 231, 246
10000 m:    --, 276, 287
At depth, say 2,000 m to 6,000 m, where there are still large ocean volumes, the temperature gradient (as per this Wiki graph) is, what, 1°C per 1000 m? If so, the coefficient of thermal expansion will still be increasing with depth, but by a significantly smaller amount.
Importantly though, through the top 2000m where there is by far the biggest temperature gradient, the coefficient does reduce with depth because of that temperature gradient – dropping possibly by 20% or so. And of course, that is the very bit of the ocean presently seeing the big increases in OHC.
So with that caveat, I would sign up to the coefficient of thermal expansion in the oceans reducing with depth.
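That pressure dependence can be illustrated with a toy equation of state in which a thermobaric term makes the expansion coefficient grow with pressure. All coefficients below are illustrative only; a real calculation should use the TEOS-10 seawater routines:

```python
# Toy equation of state with a thermobaric term (alpha grows with pressure).
# Every coefficient here is illustrative, not a real seawater fit.

RHO0, T0 = 1027.0, 10.0   # reference density (kg/m^3) and temperature (degC)
BETA_T = 1.7e-4           # thermal expansion at the reference point, 1/degC
GAMMA = 1.1e-8            # thermobaric coefficient, 1/Pa (illustrative)
CS2 = 2.2e6               # compressibility proxy, (sound speed)^2, m^2/s^2

def rho(T, p):
    """Simplified density: linear in T, with expansion enhanced by pressure."""
    return RHO0 * (1.0 - BETA_T * (1.0 + GAMMA * p) * (T - T0) + p / (RHO0 * CS2))

def alpha(T, p, dT=0.01):
    """Thermal expansion alpha = -(1/rho) d(rho)/dT, by finite difference."""
    return -(rho(T + dT, p) - rho(T - dT, p)) / (2 * dT * rho(T, p))

surface = alpha(5.0, 0.0)
deep = alpha(5.0, 4.0e7)   # ~4000 m depth, ~400 bar = 4e7 Pa
print(f"alpha at surface: {surface:.2e}, at ~4000 m: {deep:.2e}")
```

The sketch only captures the pressure effect at fixed temperature; as noted above, the real ocean’s falling temperature with depth pulls the coefficient the other way through the thermocline.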
I’ve argued here, in line with some cited research that in the Siberian region of the sea ice pack thinning of ice is warming the atmosphere in winter.
Cold snaps should be expected during high pressure due to the development of surface based inversions driven by infra-red emission through clear sky. Day 50 was in a period of strong high pressure across the Arctic Ocean, the centre of action being poleward of Beaufort.
Yes, clouds do back-radiate infra-red and this will cause warming, and cloud levels (particularly low clouds) are increasing over winter (of the order of 1% per decade). But I haven’t read that this is a substantial factor in winter warming – prepared to be educated on that. The low-level signature of warming seems to me to indicate thinner ice and more leads releasing heat. As I show in the above-linked blog post, the anomaly of warming is deeper in the atmosphere than the corresponding inversion, so it does not seem to be solely due to the inversion being absent.
Could someone perchance have an off the top hint towards a trail to a citation for the idea attributed to MIT’s RD Lindzen, that the Manabe coefficient for water vapor amplification is more than halved by aggressive collisions with excited nitrogen? Apparently, heretofore under-scrutinized spectral characteristics of rarely achieved states of post-collision H2O, wherein the normal bond angle is eliminated via a temporary status of quantum-syzygy, deprives the molecule of its dipole moment, eliminating a total of six rotational bands, and thereby restricts certain of the small windows in the near IR, thus dramatically reducing Clausius-Clapeyron feedback.
“As the late John Sawhill…my friend, once said: “A society is defined not only by what it creates, but by what it refuses to destroy.” (48:28)
“The complexity of its [the living world’s] layered structure and the billion-year history of its construction lie beyond anything that can be unraveled or copied by us at this time. A warehouse of parallel zettabyte computers could not simulate it. The starting conditions might be guessed. Of that we can never be sure. But the magnitude of the events and all the lives of the species reverberating back and forth across the levels of biological organization, from macromolecules to ecosystems, that create natural selection through an all-but-infinite maze of possibilities, is beyond the comprehension of our still mostly paleolithic brain…” –Wilson citing his new book [April ’14 ] (48:40)
Wilson’s thoroughly coevolutionary perspective is not unlike that of Antonio Damasio, in neuroscience.
Comment by Pete Dunkelberg — 2 Apr 2014 @ 12:16 PM
Uh, yeah, quoting from the Nature link Prokaryotes posted:
… a fresh picture of how the extinction occurred. Leading up to it, large quantities of organic matter accumulated in ocean sediments. It was “a pile of food sitting there”, says study co-author Gregory Fournier, an evolutionary biologist at MIT, but nothing was in a position to eat it.
That soon changed. The oceans were also home to single-celled microbes known as archaea. Some, known as Methanosarcina, consume carbon compounds and release methane. But the microbes did not have a way to process acetate, one of the key compounds that made up the sediment reserves. That is, until they captured two acetate-processing genes from a bacterium. By comparing the genomes of 50 different living organisms, Fournier dated that gene transfer to 250 million years ago, right around the time of the mass extinction.
Long long ago (back before electronic calculators had square root buttons) I heard a biologist talking about how he had wasted a handful of years in his research trying to do mass selection on microbes to find something capable of digesting DDT. Yeah, this was back in Rachel Carson’s day, if that helps.
He took samples from farm dirt and yard dirt and city dirt, grew whatever he could get to grow in various culture media, in lots and lots of Petri dishes in various containers.
He challenged them with DDT, killed off 99.99 percent, washed out the plate, and let what survived grow out again.
Over and over.
And — lo and behold — he found after a few years he had some beastie that indeed did eat DDT and thrive.
You’d think that was a great discovery.
But he was a biologist. So he didn’t leap into commercial production.
Instead he started chopping, dicing, and grinding other persistent stable organic materials and feeding those to his newly selected microbe, to see what _else_ it was going to be able to eat besides DDT.
Turned out it would happily eat polyvinyl chloride, PVC.
Yeah. Telephone wire insulation. It would munch right along any piece of PVC, riddling the material till it fell apart.
He autoclaved all his samples, sterilized the lab, and — warned the rest of us wide-eyed young biology students about the risks of pushing selection really hard, if we didn’t know what we would be selecting for, or selecting against.
He was smarter than the average hominid.
Now, we’re doing the same experiment he did, in the oceans.
Plastic soup, microbes, persistent organic chemicals, heat, aeration, pH change.
This year, NOAA issued its first forecast on 6 March, estimating a 50% chance that El Niño will develop this summer. But that early projection, and others from weather agencies and research institutions around the world, comes with lots of uncertainty. Fickle tropical winds in spring can easily quash a brewing El Niño — or strengthen it.
Researchers say that real progress in forecasting has come from systematically comparing the outputs of groups of models, with each simulation run under a range of possible climate conditions. “Combining these various predictions — doing some crowd-sourcing, if you will — tends to lead to more reliable predictions,” says Gabriel Vecchi, a climate modeller at NOAA’s Geophysical Fluid Dynamics Laboratory in Princeton, New Jersey. Averaging the results of several different forecasting models tends to cancel out flaws in an individual program, he says….
… Since 1998, the eastern Pacific has been in a cold phase that is associated with La Niña-like conditions, but every 15–30 years, as part of a cycle known as the Pacific Decadal Oscillation, that trend flips. Kevin Trenberth, a climate scientist at the US National Center for Atmospheric Research in Boulder, Colorado, has theorized that a major El Niño could help to push the ocean back into a warm phase, which studies have linked to more frequent El Niños and more rapid global warming (see Nature 505, 276–278; 2014).
But all of that depends on what happens as warm water washes across the Pacific in the next couple of months. “The system is primed,” says Trenberth. “Will it wimp out or really take off?”
After at least 30,000 years trapped in time, a frozen virus has thawed from a deep layer of Siberian permafrost. This is no ordinary virus — it is the biggest virus ever found on Earth, and thought to be part of a new type of “giant virus” that can be larger than many bacteria or even cells, and is visible in a normal microscope. With climate change warming the planet and taking the “perma” out of permafrost in many polar regions, scientists may begin to unearth more previously undiscovered viruses …
Pangaea: This configuration severely decreased the extent of shallow aquatic environments, the most productive part of the seas, and exposed formerly isolated organisms of the rich continental shelves to competition from invaders. Pangaea’s formation would also have altered both oceanic circulation and atmospheric weather patterns, creating seasonal monsoons near the coasts and an arid climate in the vast continental interior.
Research indicates that recovery did not begin until the start of the mid-Triassic, 4 to 6 million years after the extinction
Interestingly, plants are relatively immune to mass extinction, with the impact of all the major mass extinctions “negligible” at a family level
It has been suggested that new, more aggressive fungi, insects and vertebrates evolved, and killed vast numbers of trees.
Clive Palmer is to decide future of Australian carbon tax
Lateline presenter Tony Jones: Clive Palmer, can I ask you a very basic question? Do you believe the consensus scientific view set out in the latest IPCC report that climate change impacts due to global warming will have especially serious impacts on Australia?
Clive Palmer: No, I don’t believe that’s so. There’s been global warming for a long time. I mean, all of Ireland was covered by ice at one time. There were no human inhabitants in Ireland.
That’s how the world has been going over millions and billions of years and Ross Garnaut knows that’s true, so I think that’s part of the natural cycle.
When asked who he would take advice from in the field of climate change, Mr Palmer deflected the issue.
“Well, I can get a group of scientists together, Tony, and pay them whatever I want to and come up with any solution. That’s what’s been happening all over the world on a whole range of things,” he said.
Mr Palmer said scientists should be focusing on the 97 per cent of carbon dioxide that comes from nature.
“It’s not logical. If we say it’s – 97 per cent comes from nature and we don’t even bother examining how we can reduce carbon in nature, just in industry, it’s not a proper balance,” he said.
“I mean, if we say we want to reduce it by 1 per cent, which I think is the target globally, to do that, why can’t we take some from nature, some from industry, or maybe all from nature?”
Over the last 40 years, there have been only two major El Niño events (1982-1983 and 1997-1998).
Buoys deployed in the tropical Pacific Ocean, mostly beginning during the late 1980s and early 1990s, help scientists track how El Niño events are progressing. Since most of these buoys were not yet in place in 1982, we only have detailed data from one major El Niño event to analyze….
… Our tiny sample says only that 100% of 1 similar event did so. We have never observed a similar sized wave that was not followed by a major El Niño event, but we don’t know what fraction of a large number of similar events would have similar outcomes.
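The point about a sample of one can be made quantitative with a small Bayesian sketch (my own illustration, not part of the original comment; the uniform prior is an assumption). With k successes in n trials and a uniform Beta(1, 1) prior, the posterior for the true frequency is Beta(k+1, n−k+1). For k = n = 1 that is Beta(2, 1), whose CDF is simply p², so the credible bounds have a closed form:

```python
# Posterior for the true "follow-on" frequency after observing 1 event
# out of 1 trial, under a uniform prior: Beta(2, 1), CDF(p) = p**2.
import math

def beta21_quantile(q):
    """Quantile of Beta(2, 1): solve p**2 = q for p."""
    return math.sqrt(q)

lower = beta21_quantile(0.05)    # lower end of a 90% credible interval
upper = beta21_quantile(0.95)    # upper end
median = beta21_quantile(0.50)

print(f"90% credible interval: ({lower:.2f}, {upper:.2f}); median {median:.2f}")
# -> 90% credible interval: (0.22, 0.97); median 0.71
```

So even though 100% of the (one) observed similar event was followed by a major El Niño, the data alone are consistent with a true follow-on frequency anywhere from roughly 22% to 97% — which is exactly why the comment leans on physical reasoning rather than the historical record.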
Another way to anticipate possible outcomes following a big Kelvin wave event is to consult computer models that attempt to simulate the physics of the event. However, none of our models seem to simulate well the development and evolution of strong Kelvin wave events. My own assessment of the model statistics suggests that these models tend to underpredict the amplification rates of strong El Niño events early in their lifetimes, when such high-amplitude Kelvin wave events occur.
So, although I do not claim to be certain that a major El Niño event will develop this year, I think the probability of a major event is substantially higher than that suggested by NOAA’s Climate Prediction Center (their most recent forecast suggests a 50% chance of El Niño developing at all, while I suggest the chances are probably closer to 80% for a major El Niño). My reason for asserting higher confidence is that the wind patterns presently observed, which favor strong El Niño growth, are likely to continue: these wind signals are not completely independent of the progress toward El Niño, so the present progress toward El Niño favors their continued occurrence.
Trouble is that in our lifetimes, we are not likely to ever experience a large enough number of major Kelvin wave events similar to the present event and that of March 1997 to be able to objectively assess through the historical record which of us is right, whether or not this potential event becomes major.
“… the article categorizes the behaviour of identifiable individuals within the context of psychopathological characteristics.”
The journal didn’t disagree with the diagnoses
The editors decided to retract after deciding that people who signed names to blog posts deserved anonymity when their writing was characterized.
Wait, what? Hm.
I guess writing by pseuds, sock puppets, and anonymice can’t be tracked — can’t know who’s who — so those writers lack character (no way to be sure a collection of blog comments are actually by the same person using a pseud).
Hm, maybe using tracking info that’s not public — IP addresses? — plus the sort of textual analysis used to identify anonymous authors of other works — would be acceptable to the journal.
Truly terrifying juxtaposition of ideas in the comments here.
Hank Roberts #65.
He autoclaved all his samples, sterilized the lab, and warned the rest of us wide-eyed young biology students about the risks of pushing selection really hard if we didn’t know what we would be selecting for, or selecting against.
He was smarter than the average hominid.
Indeed he was.
Now we have the Lateline quote from (the very, very wealthy) Clive Palmer in John Byatt #74.
Well, I can get a group of scientists together, Tony, and pay them whatever I want to and come up with any solution. That’s what’s been happening all over the world on a whole range of things.
I will damn well GUARANTEE you that if push really comes to shove, politics as well as business attitudes will ensure that any “solution” worked out by the technologists to ameliorate AGW will NOT be checked for side-effects before being applied.
“The editors decided to retract after deciding that people who signed names to blog posts deserved anonymity when their writing was characterized.”
Well, no, their first official statement was that they retracted due to fear of lawsuit.
When they got significant pushback from the research community for caving in to pressure despite their statement that the work was fine (including on ethical grounds), an editor wrote a *blog* post (not an official statement, thus far at least) claiming it was for the ethical reasons now being trumpeted.
So, we don’t really know.
Either the first, official statement was an outright lie, as it specifically said no ethical issues had been identified, or the “clarification” is one person’s opinion. For the reputation of the journal one would hope it’s the latter case …