I’m also attending EMS2011, and the underground ride from the hotel to the meeting feels much shorter now that I have this magazine. The great thing about it is the bridge it builds from science to the public; we need more of this!
Thanks for the heads up on this new magazine.
Typo department: the name of the comparable magazine you mention is “National Geographic” and the link is http://www.nationalgeographic.com/
Comment by Pete Dunkelberg — 16 Sep 2011 @ 7:13 AM
“weather motives” should be “weather motifs”
Comment by Brian Brademeyer — 16 Sep 2011 @ 8:11 AM
I have not posted on RealClimate for a while now. I’m a layman who reads a lot about climate change on the net. I went to Harvard, 1982, and Boston College Law School, 1987. Although this is off topic, I just wanted to say that my reading indicates that the Earth has an excellent chance of reaching 1000 ppm CO2 by the year 2150. This CO2 concentration may remain for thousands of years. Even with everything we are doing to combat this, it still seems likely.
Mark J. Fiore email@example.com
Forgive me if this has already been discussed at RC, but last week, Nature’s editors, building on a Nature news article, called for more explicit linking of specific weather events to climate change. When, for example, the Wall Street Journal online columnist James Taranto repeats his same old joke about Al Gore delivering a global warming speech on an outlier of a really cold day, hardeeharhar, scientists have usually answered: Weather isn’t climate. The editors, however, say that the time has come to begin trying to establish the weather-event/climate-change linkage where possible. The news article even quotes Gavin Schmidt. To me this seems like a major shift. Is it? Will RC engage it? (The editorial appears at http://www.nature.com/nature/journal/v477/n7363/full/477131b.html and the article at http://www.nature.com/news/2011/110907/full/477148a.html.) Thanks.
Comment by Steven T. Corneliussen — 16 Sep 2011 @ 9:34 AM
As a Brit, I should know all about this magazine.
But, it’s the first time I have heard of the mag and the club.
[Response: That is my impression. I’m subscribing to the magazine in Norway, and encountered no problems. In fact, when registering and giving your address, you can choose from a long list of nations. -rasmus]
Please forgive this old question: About weather and climate – it seems that an oversimplified statistical standard is creeping in: Is there some sort of standard that it takes 30 years of weather to define climate?
At a recent conference I asked a media weatherperson about climate – asking whether any one weather event was climate based, I learned that he thought climate was defined by the last 30 years of weather.
Should there be some sort of rule here? Can we devise a simplified rule that can respect the data trends?
I found some wonderful discussions here
[Response: Climate can be regarded as ‘weather statistics’, of which the occurrence of record-breaking events is naturally one subject. The longer the series of observations, the lower the chance of seeing a new record-breaking event, given that the process is iid. Even so, they will still happen from time to time, and if there are ‘too many’ record events, then this is improbable given an iid process. You can have many parallel (contemporary) observations, and the chance of seeing at least one new record amongst a volume of observables may remain high even after some time. But it’s easy to calculate the probability of seeing a number of new records from the binomial law if you know the true degrees of freedom. -rasmus] http://www.realclimate.org/index.php/archives/2005/08/on-record-high-temperatures/#comment-3644
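Rasmus’s point about records in an iid series can be checked numerically: the k-th observation sets a new record with probability 1/k, so the expected record count over n observations is the harmonic number. Here is a minimal sketch (my own illustration, not from the comment above):

```python
import random

def expected_records(n):
    # Under an iid process the k-th observation is a new record with
    # probability 1/k, so the expected count is the harmonic number H_n.
    return sum(1.0 / k for k in range(1, n + 1))

def simulated_records(n, trials=20000, seed=42):
    # Monte Carlo check: average number of records over many iid series.
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        best = float("-inf")
        for _ in range(n):
            x = rng.random()
            if x > best:
                best = x
                total += 1
    return total / trials

print(expected_records(100))   # ~5.19 records expected in a century of iid data
print(simulated_records(100))  # simulation lands close to the analytic value
```

Note how slowly the count grows: a century of stationary data yields only about five records, which is why a run of “too many” new records is evidence against stationarity.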
Discussing the weather events and then noting the ‘great storm of 1703′ is a curious way to show climate impact on weather.
So far the many assertions that bad weather events are due to climate change have seemed more a demonstration that climate change is a non-science. The only thing more insulting to science is the attempt to show the plight of polar bear cubs to put a warm and fuzzy face on science.
Ultimately, there is a connection that will become apparent, but it will not be convincing if something worse happened 50 years ago, 200 years ago etc.
I welcome the “note” of the great storm of 1703 and any interesting event, linked or not linked to climate change, be it… it’s going way too far that denialist no-brainer propaganda should influence what is selected to educate seriously interested people.
Jim, surely information about what has and hasn’t happened before is the core of what the climate/weather distinction is all about. The “great storm of 1703″ and many other noteworthy events are all interesting because they were outliers in our day to day experience of weather/climate.
When what was once an outlier in weather records becomes common, or at least more frequent, we notice it personally, and scientifically it signals a change …. in climate.
We need to repeat (and repeat and repeat) some simple statements which use the term “climate change” to strengthen its meaning. e.g.
Weather causes floods but climate change makes them more frequent.
Weather creates hurricanes. Climate change destroys some but makes the rest worse.
Weather causes droughts but climate change makes them worse.
We now use “climate change” to mean more than the juxtaposition of “climate” and “change”. So we should. It’s so much better than “AGW”. Let’s make sure it doesn’t go the way of the term “sustainable”, which now means almost anything.
@14 and 10:
Weather causes floods (cyclones/droughts/etc) but climate change makes them more (less) frequent, and most of us here know that. And climate is defined in terms of a 30-year span, and again most of us here know that.
But if climate begins changing really quickly, then at some point the thirty year span will actually be too *long* to be a useful average (just one obvious example: Arctic sea ice. Over the last thirty years, its typical summer extent is not at all what we expect over the next five or ten.)
How do we then define ‘climate’?
“NOAA’s National Climatic Data Center (NCDC) released the 1981-2010 Normals on July 1, 2011. Climate Normals are the latest three-decade averages of climatological variables, including temperature and precipitation. This new product replaces the 1971-2000 Normals product. Additional Normals products; such as frost/freeze dates, growing degree days, population-weighted heating and cooling degree days, and climate division and gridded normals; will be provided in a supplemental release by the end of 2011.”
The climate normals are defined so that people have common numbers against which to compare weather events. Yes, weather extremes happen and shorter averaging periods may be useful, but then one should always say ‘the rain (a weather attribute) averaged over the past five years is outside the statistical confidence level of the weather averaged over 30 years (climate)’ or something similar; it’s easier just to say ‘the weather is showing signs of climate change.’
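The comparison described here, a short recent average held up against a 30-year normal, fits in a few lines of code. The function below is an illustrative sketch of that idea, not NOAA’s procedure:

```python
import statistics

def recent_vs_normal(series, baseline_years=30, recent_years=5):
    """Compare the mean of the most recent `recent_years` annual values
    against the preceding `baseline_years` 'normal'.  Returns the normal
    and a rough z-score for the recent mean (illustrative only; real
    climate series need autocorrelation handled properly)."""
    baseline = series[-(baseline_years + recent_years):-recent_years]
    recent = series[-recent_years:]
    normal = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    z = (statistics.mean(recent) - normal) / (sd / recent_years ** 0.5)
    return normal, z

# A synthetic series with a steady 0.01/yr trend: the recent 5-year mean
# sits several standard errors above the old 30-year normal.
series = [0.01 * year for year in range(40)]
normal, z = recent_vs_normal(series)
```

Even this toy example shows the tension in the comment: under a steady trend the recent average drifts out of the old normal’s confidence band almost by construction.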
Off topic: recently saw graffiti that said ‘F–k the World’.
You may if you wish choose to define the term “climate” to clarify its use in specially restricted environments but, like most terms in any natural language, it is a word with a meaning that we learn from its use and as its use changes our understanding of its meaning changes.
“Definition” itself has several meanings. The case of the term “climate” meaning something like “climate defined as a 30 year span of certain measurements” is an interesting one. You are already wanting to change this special and restricted meaning to something more useful – even in our limited sphere of climate scientists and climate nerds.
If we wish to influence the wider public, I suggest we use the meaning that they are using and this may change too (e.g. does climate include earth tremors?). The struggle for some sort of “climate truth” will also include a battle for the meaning of words – and out there, any “definitions” laid down by scientists and nerds count for little. The word “climate” dates back many centuries and predates our efforts.
We should get out there and make truthful and meaningful statements on our topic that use meanings as they are currently understood.
I am not a particular fan of Wittgenstein but consider what Wikipedia says about his view of definition
Wittgenstein’s point is not that it is impossible to define “game”, but that we don’t have a definition, and we don’t need one, because even without the definition, we use the word successfully. Everybody understands what we mean when we talk about playing a game, and we can even clearly identify and correct inaccurate uses of the word, all without reference to any definition that consists of necessary and sufficient conditions for the application of the concept of a game.
So thirty years is the standard for calculating normals, but what, then, is climate?
Should it not be accepted that the timeframe is indeed arbitrary, and that climate is a function of weather and resulting environmental change (regional or global) against time?
I think defining ‘climate’ obviously frames the discussion, allowing the climate of the 20th or 21st century to be considered separately or together.
1. The thirty year moving average moves too slowly. It is harmfully inadequate in a rapidly changing climate.
2. You need “climate system”, a separate concept. The climate system includes things like total insolation, earth’s albedo, total energy in the environment including oceans, arrangement of the continents, various incompletely known biological factors, polar cold reservoirs and so forth. Along with the laws of physics and chemistry, the climate system is the cause of weather.
3. Then read Initial value vs. boundary value problems i.e. weather vs climate.
4. Then read this paper “Was There a Basis for Anticipating the 2010 Russian Heat Wave?” Find it online and learn that the current models still say “Sorry, bad luck.” http://pakistanisforpeace.wordpress.com/2011/09/16/pakistans-floods-deja-vu-all-over-again/
Comment by Pete Dunkelberg — 17 Sep 2011 @ 6:00 AM
I believe the question was why 30 years… This means that if you are using statistics to define or analyze the data set, you need to apply the rule sets associated with the tool. In the absence of a random sample of a large population, in this case the application is an examination of variation in a series of data points. If the intent is to do a year-over-year analysis, you would build 365 tables and plug in the daily temperature values to determine if a trend is indicated in the data.
If the intent was to indicate one of the degrees of freedom (i.e. inertia to warm or cool), normally you would take the daily low and high of the same day (to measure the warming trend); on the other hand, if you wanted to determine a trend in the loss of heat, normally you would take the daily high and the next morning’s low to determine the loss of the daily warmth.
If all other things were stable, that gives you an indication of change. If, however, you have a change in the specific humidity, this can complicate the analysis. Hence many factors need to be associated with any analysis (i.e. cloud density and height, barometric pressure, …). Together these changes help to provide a clear picture of the weather, which is predicted by the expectations for the climate of the locality you are analyzing.
I will stop here as many here are already aware of these issues. It is just that when we have new participants asking about climate factor analysis, a brief explanation helps.
Just a note. In response to “how do we then define climate?” you provided a link to the NCDC ‘Normals’ page for the US. They are more for weather shows on TV than for climate use. In fact, on item 10 there they write, “Normals were not designed to be metrics of climate change,” and continue to write elsewhere, for example, about the fact that a spline fit was used in the prior decade’s product for developing daily values while the new product will use daily measurements, instead. It would be hard to derive much on such shifting sands and a product not designed for climate work at the outset. They recommend one possible alternative, the USHCN series.
I forgot to mention that a decadal time gate is not a requisite. Meaning you can apply a statistical series analysis to any sequential data set containing a minimum of 30 data points. (The requirement for the 30 data points is that, in the absence of a statistical median value, it takes roughly 30% of a population to have a level of confidence that the resulting median is the statistical midpoint of the total population’s possible values.) With an approximate midpoint you can begin to use values of probability with a data set as small as 10 points, or by applying the rule of squares to even smaller data sets. However, in both cases your conclusions can only apply to the data set population.
A non-series random analysis is best for defining a population’s midpoint; then you can analyze any series against that midpoint. However, again, this does not define climate, only the change in relation to the data set. Hence you could say that the last five years are warmer than the prior 25 years, on average, and predict the probability that future values may be similar, providing all other variations contributing to the data were stable.
At issue with examining only one variable is that there can be complications. Hence, by itself one weather variable does not a climate make. RealClimate and the experts here are well aware of this and, I believe, bend over backwards to ensure you have the opportunity to explore this. So let us continue this discussion within the parameters of weather, as was the intent of this thread. (My apologies to the mods if I have gone too far.)
The “30 years” is a rough overall statement. The actual number of years (for annual observations) needed to say a trend is likely there depends on how variable the data is year to year. This is explained — at a high school level, with actual numbers that you can work on yourself — over at Robert Grumbine’s site. There are a lot of opinions; many people have several.
From his “Results” thread, this should encourage anyone interested in learning how this is decided — for any particular data set — and why there’s no one general answer, and why “30 years” is just a general idea.
“… In brief (in a journal paper, this would be the ‘abstract’):
You need 20-30 years of data to define a climate trend in global mean temperature
Forward and backward trends are markedly different
Therefore, to discuss climate trends in global mean temperature, you need to use 20-30 years of data centered on the date of interest.
As with any abstract, it’s too brief to show you why any of these are true, just some simple declarations. Now, if you trust me absolutely (which I don’t recommend — and if I’m talking science, you don’t need to), you can stop and move on to some other reading. But let’s take a look at the whys. As before, I’m putting the data and programs on my personal web site and you can run the analysis yourself, and modify the programs to work on different assumptions, methods, data sets.
Let’s consider the first point — how long it takes to determine a climate trend in global mean temperature. …”
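Grumbine’s “it depends on the variability” point is easy to reproduce with a small Monte Carlo: add white noise of a chosen size to a fixed trend and ask how many years of data it takes before an ordinary least-squares slope is reliably distinguishable from zero. The trend and noise magnitudes below are round illustrative numbers of my own, not his:

```python
import random

def years_to_detect(trend=0.017, noise_sd=0.1, z_crit=2.0,
                    trials=500, max_years=60, seed=1):
    """Monte Carlo sketch: for each record length n, what fraction of
    synthetic series (fixed trend + white noise) yields an OLS slope more
    than z_crit standard errors from zero?  Returns the first n at which
    that fraction reaches 90%.  All parameter values are assumptions."""
    rng = random.Random(seed)
    for n in range(5, max_years):
        hits = 0
        for _ in range(trials):
            xs = list(range(n))
            ys = [trend * x + rng.gauss(0, noise_sd) for x in xs]
            # ordinary least-squares slope and its standard error
            xbar = sum(xs) / n
            ybar = sum(ys) / n
            sxx = sum((x - xbar) ** 2 for x in xs)
            slope = sum((x - xbar) * (y - ybar)
                        for x, y in zip(xs, ys)) / sxx
            resid = [y - ybar - slope * (x - xbar) for x, y in zip(xs, ys)]
            se = (sum(r * r for r in resid) / ((n - 2) * sxx)) ** 0.5
            if abs(slope) > z_crit * se:
                hits += 1
        if hits / trials >= 0.9:
            return n
    return max_years
```

With a ~0.017/yr trend against ~0.1 interannual noise, detection takes on the order of 15-20 years; halve the noise or double the trend and the answer shrinks, which is exactly why “30 years” is a rule of thumb rather than a law.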
Still hoping someone will engage what Nature’s editors are advocating: that climate scientists begin formally and publicly trying to link extreme weather events, where possible, to climate change. (See 7 above.) Is that a good idea? Thanks.
Comment by Steven T. Corneliussen — 17 Sep 2011 @ 8:54 PM
Eli Rabett @26
Climate is the jar of marbles out of which weather is picked
If I’m not too depressed I will be tracking down top UK politicians (and other relevant people) after their summer breaks to get some sort of message across. It’s depressing work on a limited budget of time, energy and money.
So I go up to my target of the day and say “Climate is the jar of marbles out of which weather is picked”. So the government minister says “Take a card from my departmental assistant. Write to us. We definitely want to know more.”
P.S. Anybody want to comment that the “energy” in my budget is not simply measured in joules?
Excuse this private message but if Chris Huhne, the UK Secretary of State for Energy and Climate Change, is reading this – and he told me he does read RealClimate – then it’s about time he replied to the notes following our previous meetings. See The Department of Energy Security, http://www.brusselsblog.co.uk/?p=306
Thanks, everyone, for informative and thought-provoking replies.
Pete Dunkelberg got to the crux of my concern when he said, “The thirty year moving average moves too slowly. It is harmfully inadequate in a rapidly changing climate.”
Eli Rabett gave me something I’m basically kicking against: “Climate is the jar of marbles out of which weather is picked.” It’s another way of saying “Climate is what you expect, weather is what you get.” Neither of them is true any more when climate is changing too fast for a thirty-year average to be usefully predictive.
Others of you gave me technical reasons why thirty years is the shortest statistically reliable time frame. They were the bits of the puzzle I didn’t have.
I’m the layman round here (though not quite a newbie – I’ve been following RealClimate pretty consistently for a few years) so I don’t have to deal with the problem which, I think, is still lurking there: what to do when the thirty-year averages *hide* the current state of affairs, rather than showing it, because the first fifteen of those years are so far from the ‘normal’ of the last fifteen?
@29: “what to do when the thirty-year averages *hide* the current state of affairs, rather than showing it, because the first fifteen of those years are so far from the ‘normal’ of the last fifteen?”
Use charts like Figure 2 in “The Unnoticed Melt.” That one shows, with simple clarity, the internal trend within a 30-year period. So perhaps still use the 30-year periods but start displaying them with highlighted min/max values to gain a bit of perspective beyond the averages.
@6, Mark J. Fiore, re 1000 ppm, it’s hard to see how we can get to 1000 ppm when one considers the fossil fuel reserves and more importantly, the extraction rates the reserves can support. Conventional oil and gas will be well past their peaks within a decade or two, conventional coal no more than a few decades behind. Whilst unconventionals do contain a lot of carbon, there’s no evidence currently that those resources can support the fast production rates we’re used to at a price we can afford.
I’d say it’s more likely for fossil fuel production rates (and the associated CO2 emissions) to peak within a few decades and for concentrations to top out in the 500-600 ppm range.
I wonder if anyone has ‘composed’ ENSO? Then we could subtract Vivaldi and the ‘ENSO composer’ to reveal the trend. . . wait, no, we’d need a ‘volcanic’ composer. And I’m leaving that aside, or puns will ensue.
Thanks Jon Kirwan, your pointer was better than mine, and explained better.
(“… In response to “how do we then define climate?” … the NCDC ‘Normals’ … are more for weather shows on TV … “Normals were not designed to be metrics of climate change,”…. They recommend one possible alternative, the USHCN series.)
Kevin, Vivaldi is what is constant, dependable, predictable; why he speaks to us now as three centuries ago. What trees, flowers, birds have through the ages learned to expect. The canvas of our sagas, the tuning fork of life. All the met offices of the world cannot predict the weather ten days from now; but little Antonio in kindergarten tells you what coming winter will be like, and how then spring will come, and then summer, as it always has.
Seen at farmers’ market – a stall selling sheep meat under the banner “local and sustainable”. Sheep are enormous contributors to climate change, worse than cattle (http://nobeef.org.uk). This shows how easy it is to bend the meaning of words. Let’s give up the word “sustainable”. It has been debased.
Watch out for “climate” and “weather”. Definitions won’t help. Usage could.
I don’t know if this helps but when using statistics in process control, a sample size of thirty is usually taken as a minimum requirement. The main reason behind this seems to be that when the number in the sample approaches 30 most distributions, t and binomial in particular, approximate to the normal distribution. Beyond that it’s really little more than a rule of thumb.
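The rule of thumb can be seen directly by simulation: draw repeated samples from a strongly skewed population (exponential, population skewness 2) and watch the distribution of sample means lose its skew as the sample size approaches 30. A rough sketch, with sample sizes chosen for illustration:

```python
import random
import statistics

def mean_skewness(sample_size, trials=20000, seed=7):
    # Estimate the skewness of the distribution of sample means drawn
    # from an exponential population.  By the central limit theorem the
    # theoretical skewness falls off as 2 / sqrt(sample_size).
    rng = random.Random(seed)
    means = [statistics.fmean(rng.expovariate(1.0)
                              for _ in range(sample_size))
             for _ in range(trials)]
    m = statistics.fmean(means)
    s = statistics.pstdev(means)
    return statistics.fmean(((x - m) / s) ** 3 for x in means)
```

Samples of 5 retain most of the population's skew, while samples of 30 are already close to symmetric, which is the practical content of the n ≈ 30 rule.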
Do not confuse added carbon with recycled carbon in the current biosphere. It matters little whether the carbon decays in the soil or is rapidly converted to CO2 and methane. The result is similar; it is just that the rate of conversion is slightly accelerated. Going on about ruminants or bio-mass conversion does little wrt GW. However, as to bio-mass or char mixed into the soil, it offers a high amount of phosphates and helps in the retention of soil moisture, both of which enhance plant growth.
Last time I did the math, I found that all known coal, petroleum, natural gas, tar sands and oil shale reserves added up to an increase to about 1300 ppmv, assuming about the same proportion winds up in the oceans.
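For anyone wanting to redo that arithmetic: roughly 2.13 GtC of carbon in the atmosphere corresponds to 1 ppmv of CO2, and historically about half of emissions have stayed airborne (the rest taken up by oceans and land). The function below is a back-of-envelope sketch using those assumed constants, not a carbon-cycle model:

```python
def ppm_rise(emitted_gtc, airborne_fraction=0.5):
    # ~2.13 GtC of atmospheric carbon per 1 ppmv of CO2 (assumed constant
    # conversion); `airborne_fraction` is the assumed share of emissions
    # that remains in the atmosphere rather than entering ocean/land sinks.
    GTC_PER_PPM = 2.13
    return emitted_gtc * airborne_fraction / GTC_PER_PPM

# e.g. burning ~4000 GtC of reserves at a 50% airborne fraction adds
# roughly 940 ppm on top of today's concentration
```

The estimate is very sensitive to both the reserve figure and the airborne fraction, which is the crux of the disagreement in the comments above.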
We probably have a short term climate problem. We should use every tool in the box to slow global warming to inhibit triggering of serious climate feedbacks. We must worry about this short term, hoping that something will turn up for the medium term. It may be the only hope we’ve got.
Turning grass into methane via sheep and cattle gives a tremendous boost to warming. Over twenty years methane has over one hundred times the Global Warming Potential of carbon dioxide. (See “Beef’s footprint 25 times its own weight”, http://www.nobeef.co.uk/wordpress/?p=31 – Sheep meat is worse than beef.) Ok, the methane degrades to carbon dioxide, which may be re-absorbed as soil carbon but in the intervening years the Earth has warmed, possibly triggering feedbacks.
You mention turning grass into biochar. This can fix carbon for hundreds or thousands of years.
Instead of farming sheep we should farm biochar and foods less damaging than sheep.
For Chris Vernon @33, two words for you:
as in those in permafrost and in the shallow shelves of the Arctic Ocean.
If their melt begins to accelerate at some point between 390 ppmv and your hypothetical ‘ceiling’ of 500-600 ppmv, then 1000 ppmv becomes possible regardless of fossil fuel production.
Sad to say, everything decays, even dead grass and leaves. The sugars in the cellulose are attacked by insects, worms, fungi, molds, and bacteria, resulting in CO2 and methane. The normal decay in the temperate zone may run into a few months. Animal spoor usually breaks down in 1/3rd the time. As to gases generated, the breakdown by lower species is likely greater, only not as concentrated.
As to bio-char, it depends; get it wet and it likely will break down over about a year. (With bio-char the sugars are caramelized by the heat, as well as oils and resins released along with methanol (methyl alcohol, if you will); the remains are generally silica, phosphates, sulfites, magnesium…, in short, usually taken up rapidly by plant and microbe life, if left on the surface.) Turn it under about a foot and it is slower, by a factor of 10, before it would be processed, due to a lack of oxygenation/reduction and water erosion. However, it usually will break down in well-irrigated/rainy regions in a few years. Take the same sample and bury it at 10 feet and it may remain for several decades, except for regions with water tables less than 10 feet.
The point is that how you process and encapsulate the dead vegetation determines the effectiveness of sequestration. (I.e.: pump treated effluent into lined and tailed-out former mines and you may achieve several millennia of sequestration…)
Increasing growing bio-mass, both plant and animal, does nearly the same thing, just that the turnover is greater. With things with a multi-decadal life you can remove a large portion of carbon from the atmosphere.
Geoff, I hear what you say, but I want to leap to the defense of sheep for a moment. If your cattle are grain-fed, then you can get far more food for your emissions by eating the grain directly. However, sheep thrive on hill country which would struggle to produce much food in other ways – with a useful byproduct of wool. Kangaroos or rabbits might be better, but there are some unsolved problems in either farming or economic processing.
Phil @ 49, you’re right about sheep but the same goes for cattle in some kinds of country – northern Australia, for instance, where grazing is successful but cropping impossible. Ditto goats in the Middle East, for that matter.
The annual cycle of ice extent is typically within days if not hours of all recent years, with some excursions in absolute extent at the very top and bottom of the curve. One would expect that with the rather linear growth of GHG over time there would be a similar linear drift in extent. Can you help us lay persons understand why it is the ice extent record is rather more convulsive than the GHG growth record?
Your point is what…? As I stated, the recycle period is dependent on treatment or processing and encapsulation. The difference between fast combustion or pyrolysis/gasification and a bio-digester is the energy (read: combusted fossil fuel) you must input to the system.
With a bio-digester or sewage treatment facility, the system can be designed to be self-powering, with the balance of the carbon and water being applied to desert soils to both increase soil moisture and carbon content for thousands of years, or to reduce desertification. By the same token they can be pumped into old salt mines or even abandoned petroleum wells.
The point being that how we process and treat wastes today, with little modification, can enhance carbon sequestration. It is the disposal process that has more to do with the cycle time. Gasification or anoxic heating does increase the decomposition time; but at what cost? Placing carbon-bearing wastes even 1 foot below the surface, removing most oxygen, does extend the period, as does placing them on a dry or arid surface, and at about a 100th the cost (pumping sludge via low-pressure wind- or solar-driven pumps would work).
The truth is that whether bio-mass is processed via waste treatment, natural surface decay, consumption/decay or even simple diverse sequestration techniques, it really does not matter. It really is not rocket science; it is more of a choice and a change in government and individual policies/attitude.
As John Mason at UK Weather World pointed out to me roughly 4 years ago; It is not GHGs that are directly causing the warming events we are seeing, it is the influence they have on natural variation and/or weather patterns.
Sad to say, one of the most difficult issues I still face in regard to AGW or ACC is a disagreement with the conclusive agreement that CO2 fossil fuel emissions are the primary driver of changes in natural variation. Not to say they are not a participant; but, IMHO, a partner with other processes which, combined, appear to be shifting the past atmospheric equilibrium to a new plateau.
Given the total carbon variations, whether fossil emissions or bio-sequestration, the period between 1800 and 1950 appears, to me, on the surface, to be a wash. Even during that period it was apparent that the Earth had been emerging from a cooler cycle. The problem I have is in the identification of the changes that preceded significant fossil fuel emissions and were influencing the emergence from the apparent LIA.
Note: This is a just a personal observation or comment and not a denial of known scientific fact.
“However, biochar production, especially in large amounts, takes a lot of energy. Burning it can also release other greenhouse gases, such as carbon monoxide and methane, into the air. At the same time it will also destroy precious plant nutrients such as nitrogen,” he says.
“Our challenge was to come up with a method that will allow us to tie carbon added through manures and composts to the soil while keeping their fertiliser value. So we tried co-composting manures and composts with compounds such as iron oxide, aluminium oxide and allophane clay, and spreading this mixture on agricultural soils.
Geoff says: As to bio-char, it depends; get it wet and it likely will break down over about a year. (With bio-char the sugars are caramelized by the heat, as well as oils and resins released along with methanol (methyl alcohol, if you will); the remains are generally silica, phosphates, sulfites, magnesium…, in short, usually taken up rapidly by plant and microbe life, if left on the surface.) Turn it under about a foot and it is slower, by a factor of 10, before it would be processed, due to a lack of oxygenation/reduction and water erosion. However, it usually will break down in well-irrigated/rainy regions in a few years. Take the same sample and bury it at 10 feet and it may remain for several decades, except for regions with water tables less than 10 feet.
Geoff, what you say flies in the face of everything I’ve read about biochar. If you’ve got some references I’d like to hear about them.
What the rest of the world says about biochar is:
1) It’s a combination of elemental carbon with variable pore size and ash.
2) Nutrients in the ash are largely water soluble
3) The carbon is fixed for an indefinite period; soil organisms don’t consume it
4) The char helps retain nutrients. Of particular interest is the fixing of nitrogen compounds
The summary is that charring crop residue annually should fix carbon for the long term, raise soil carbon levels, and lower fertilizer requirements.
Taken as a whole, biochar offers one of few ways of reducing atmospheric carbon and increasing agricultural output.
Sorry, Geoff was not responsible for the quote; I was. Before we go terribly far down the wrong road you really must understand organic or bio-mass pyrolysis. First, you may want to understand the difference between fast and slow. Second, you may want to understand the difference between wet and dry. Third, you may want to research the differences between the various temperature bands as they relate to the output, i.e.: syngas, bio-oils, charcoal. The issue being the amount of energy you put in for the desired output. I have seen estimates that up to 15% of the potential reserve is required to convert the raw materials to a usable product. Experience has shown that up to 85% is required in the case of wet bio-mass. The less input, the more that can be used elsewhere.
Bacteria, fungi, and molds do not usually reduce biochar as it is basically sanitized. As to the processing of it, at each temperature band it undergoes a level of transformation; understanding what happens in each band is critical in defining what is contained in the various byproducts. Feedstock, heat, water, and processing can result in a number of useful items, of which biochar is but one. Place pyrolyzed bio-mass away from water and it can last as a form of charcoal for thousands of years, yet it requires the greatest amount of energy to get there. If we can use low energy to provide a useful product, the better off we would be in the scheme of things.
Sorry, biochar is not rocket science; it is simply a byproduct of pyrolysis, as was employed in the production of charcoal. Most of those who have taken an 8th-grade course in physics and ecology would have at least run the experiment of wood strips reduced in a test tube, with collection of the residuals via ice-bath collection points, and would be aware of the process and the residuals.
The point being that biochar or charcoal production has set stages of transformation related to the conversion temperature applied. The greater the energy levels, the higher the isolation of pure carbon, which runs between 85-95%. The rest is generally “tars”, or a combination of caramelized sugars, incompletely converted resins, potash (potassium) and other common chemicals that precipitate out of biomass.
Being that the science is terribly old, I doubt you will find many papers on the subject; it would be like trying to find a citation in regard to the photonic-electron energy state in relation to CO2’s exposure to various em radiation. Unless it is a new twist, it will likely be best supported by quotes from textbooks.
In Science Daily there is a reference to an earthworm experiment; apparently someone forgot to balance the soil moisture if the intent was to integrate it into the soil. The other reference was to the use of charcoal or biochar to reduce the impact of bio-nitrate concentrations (apparently in preparation for the introduction of a new pet product related to how to have a fenced-in dog and a healthy lawn…).
As to the last line in the quote, there is a mistake: I was certain that I had typed that if you buried the biochar at 10 feet it would likely remain undissolved for several millennia, not decades. Much of the issue as regards charcoal’s integration is the breakdown of the cellulose structure in bio-mass, unless oxidized. As to other forms of bio-mass, such as partially digested material, either as manure or paper or kitchen wastes, liquefaction is likely the better application, along with the extraction of bio-oils via wet pyrolysis. Dry pyrolysis is generally well applied to industry or farm-field-dried crop wastes, where it can be returned to the soil.
The problem with charcoal is trying to wet it; unless emulsified in the presence of a detergent (as with oil), the tendency is for charcoal not to go into solution or absorb water. Over hundreds of years of natural sequestration, or mixed in with boggy/wet organics, the charcoal will eventually get wetted and begin to be reduced by the dissolved oxygen in the water. Then the controlling factors are wetted area and dissolved O2 concentrations (boggy waters tend to be anoxic and clay soils tend to be drying). As to sources: not opinion, just empirical…
The ‘Ask for Evidence’ campaign has been launched by Sense About Science, a charity that aims to help people make sense of science in the public domain. The charity sets out to respond to the misrepresentation of science and scientific evidence on issues that matter to society, and to work with scientists and civic groups to increase understanding of the insights of scientific reasoning.
The public domain is awash with scientific and medical information in advertisements, product websites, advice columns, campaign statements, celebrity fads and policy announcements. Even where there is regulation, claims that are not based on good evidence keep reappearing. Sense About Science believes that the only way to address this is to enable the public to ask questions about the evidence behind such claims for themselves….
Ask for evidence launch
Why we should ask for evidence …
… Confident assertions are often made in the evidence-free zone ….
… Science allows us to evaluate claims – to sort the wheat from the chaff. The more we question and rely on evidence, the more we will know, the better informed we will be and the better our decisions are likely to be….
… “There is much misleading information on the web ….”
… “You should always be asking ‘where is the evidence?’”
… “difficult to distinguish personal opinion from scientific evidence – it is important to search out the underpinning evidence.”
“… next time somebody tells you that something is true, why not say to them: ‘What kind of evidence is there for that?’ And if they can’t give you a good answer, I hope you’ll think very carefully before you believe a word they say.”
— end excerpts —