There has been a bit of excitement and confusion this week about a new paper in Nature Geoscience, claiming that we can still limit global warming to below 1.5 °C above preindustrial temperatures whilst emitting another ~800 gigatons of carbon dioxide. That’s much more than previously thought, so how come? And while that sounds like very welcome good news, is it true? Here are the key points.
Emissions budgets – a very useful concept
First of all – what the heck is an “emissions budget” for CO2? Behind this concept is the fact that the amount of global warming reached before temperatures stabilise depends (to good approximation) on the cumulative emissions of CO2, i.e. the grand total that humanity has emitted. That is because any additional amount of CO2 in the atmosphere will remain there for a very long time (to the extent that our emissions this century will likely prevent the next Ice Age, due to begin 50,000 years from now). That is quite different from many atmospheric pollutants that we are used to, for example smog. When you put filters on dirty power stations, the smog will disappear. If you only do this ten years later, you just have to put up with the smog for a further ten years before it goes away. Not so with CO2 and global warming. If you keep emitting CO2 for another ten years, CO2 levels in the atmosphere will rise further for another ten years, and then stay higher for centuries to come. Limiting global warming to a given level (like 1.5 °C) will require more and more rapid (and thus costly) emissions reductions with every year of delay, and will simply become unattainable at some point.
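The contrast between a short-lived pollutant and CO2 can be sketched numerically. This is a rough illustration only: the smog lifetime is an assumed round number, and the multi-exponential CO2 response uses impulse-response coefficients of the kind fitted by carbon cycle modellers, here taken as illustrative rather than authoritative.

```python
import math

def smog_remaining(years, lifetime_years=0.02):
    """Fraction of a smog pulse left after `years` (pure exponential decay,
    assumed lifetime of roughly one week)."""
    return math.exp(-years / lifetime_years)

def co2_remaining(years):
    """Approximate airborne fraction of a CO2 emission pulse, using an
    illustrative multi-exponential impulse-response fit (coefficients assumed)."""
    a   = [0.2173, 0.2240, 0.2824, 0.2763]   # pulse fractions (sum to 1)
    tau = [math.inf, 394.4, 36.54, 4.304]    # e-folding times in years
    return sum(ai * (1.0 if math.isinf(t) else math.exp(-years / t))
               for ai, t in zip(a, tau))

for y in (1, 10, 100, 1000):
    print(f"{y:5d} yr: smog {smog_remaining(y):.2e}, CO2 {co2_remaining(y):.2f}")
```

The point of the sketch: a year after emissions stop, essentially none of the smog pulse is left, while even a millennium later a fifth or so of a CO2 pulse is still airborne.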
It’s like having a limited amount of cake. If we eat it all in the morning, we won’t have any left in the afternoon. The debate about the size of the emissions budget is like a debate about how much cake we have left, and how long we can keep eating cake before it’s gone. Thus, the concept of an emissions budget is very useful to get the message across that the amount of CO2 that we can still emit in total (not per year) is limited if we want to stabilise global temperature at a given level, so any delay in reducing emissions can be detrimental – especially if we cross tipping points in the climate system, e.g. trigger the complete loss of the Greenland Ice Sheet. Understanding this fact is critical, even if the exact size of the budget is not known.
But of course the question arises: how large is this budget? There is not one simple answer to this, because it depends on the choice of warming limit, on what happens with climate drivers other than CO2 (other greenhouse gases, aerosols), and (given there are uncertainties) on the probability with which you want to stay below the chosen warming limit. Hence, depending on assumptions made, different groups of scientists will estimate different budget sizes.
Computing the budget
The standard approach to computing the remaining carbon budget is:
(1) Take a bunch of climate and carbon cycle models, start them from preindustrial conditions and find out after what amount of cumulative CO2 emissions they reach 1.5 °C (or 2 °C, or whatever limit you want).
(2) Estimate from historic fossil fuel use and deforestation data how much humanity has already emitted.
The difference between those two numbers is our remaining budget. But there are some problems with this. The first is that you’re taking the difference between two large and uncertain numbers, which is not a very robust approach. Millar et al. fixed this problem by starting the budget calculation in 2015, to directly determine the remaining budget up to 1.5 °C. This is good – in fact I suggested doing just that to my colleague Malte Meinshausen back in March. Two further problems will become apparent below, when we discuss the results of Millar et al.
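A quick numerical sketch of that robustness problem. The ~545 GtC cumulative-emissions figure appears later in the post; the total budget and both uncertainties here are purely illustrative assumptions, chosen only to show how the difference of two large, uncertain numbers behaves.

```python
import math

total_budget = 550.0    # GtC from preindustrial to 1.5 °C (assumed)
total_sigma  = 75.0     # its 1-sigma uncertainty (assumed)
emitted       = 545.0   # GtC already emitted (approximate)
emitted_sigma = 55.0    # its 1-sigma uncertainty (assumed)

remaining = total_budget - emitted
# Independent uncertainties add in quadrature:
remaining_sigma = math.hypot(total_sigma, emitted_sigma)

print(f"remaining budget: {remaining:.0f} ± {remaining_sigma:.0f} GtC")
```

With numbers like these the uncertainty dwarfs the estimate itself, which is exactly why starting the budget calculation from observed 2015 conditions is the more robust approach.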
So what did Millar and colleagues do?
A lot of people were asking this, since actually it was difficult to see right away why they got such a surprisingly large emissions budget for 1.5 °C. And indeed there is not one simple catch-all explanation. Several assumptions combined made the budget so big.
The temperature in 2015
To compute a budget from 2015 to “1.5 °C above preindustrial”, you first need to know at what temperature level above preindustrial 2015 was. And you have to remove short-term variability, because the Paris target applies to mean climate. Millar et al. concluded that 2015 was 0.93 °C above preindustrial. That’s a first point of criticism, because this estimate (as Millar confirmed to me by email) is entirely based on the Hadley Center temperature data, which notoriously have a huge data gap in the Arctic. (Here at RealClimate we were actually the first to discuss this problem, back in 2008.) As the Arctic has warmed far more than the global mean, this leads to an underestimate of global warming up to 2015, by 0.06 °C when compared to the Cowtan&Way data or by 0.17 °C when compared to the Berkeley Earth data, as Zeke Hausfather shows in detail over at Carbon Brief.
Figure: Difference between modeled and observed warming in 2015, with respect to the 1861-1880 average. Observational data have had short-term variability removed per the Otto et al 2015 approach used in Millar et al 2017. Both RCP4.5 CMIP5 multimodel mean surface air temperatures (via KNMI) and blended surface air/ocean temperatures (via Cowtan et al 2015) are shown – the latter provide the proper “apples-to-apples” comparison. Chart by Carbon Brief.
As a matter of fact, as Hausfather shows in a second graph, HadCRUT4 is the outlier data set here, and given the Arctic data gap we’re pretty sure it is not the best data set. So, while the large budget of Millar et al. is based on the idea that we have 0.6 °C to go until 1.5 °C, if you believe (with good reason) that the Berkeley data are more accurate we only have 0.4 °C to go. That immediately cuts the budget of Millar et al. from 242 GtC to 152 GtC (their Table 2). [A note on units: you need to always check whether budgets are given in billion tons of carbon (GtC) or billion tons of carbon dioxide. 1 GtC = 3.7 GtCO2, so those 242 GtC are the same as 887 GtCO2.] Gavin managed to make this point in a tweet:
Headline claim from carbon budget paper that warming is 0.9ºC from pre-I is unsupported. Using globally complete estimates ~1.2ºC (in 2015) pic.twitter.com/B4iImGzeDE
— Gavin Schmidt (@ClimateOfGavin) September 20, 2017
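The unit bookkeeping mentioned above can be written down in two lines. The conversion factor is just the molar mass ratio of CO2 to carbon; the 242 and 152 GtC figures are the ones quoted from Millar et al.’s Table 2.

```python
C_TO_CO2 = 44.01 / 12.011    # molar mass ratio CO2/C ≈ 3.664 (rounded to 3.7 in the text)

def gtc_to_gtco2(gtc):
    """Convert a budget in billion tonnes of carbon to billion tonnes of CO2."""
    return gtc * C_TO_CO2

print(round(gtc_to_gtco2(242)))   # 887 GtCO2 (the Millar et al. headline figure)
print(round(gtc_to_gtco2(152)))   # 557 GtCO2 (using the Berkeley Earth warming estimate)
```

Always checking which unit a headline budget is quoted in avoids a factor-of-nearly-four misreading.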
Add to that the question of what years define the “preindustrial” baseline. Millar et al. use the period 1861-80. For example, Mike has argued that the period AD 1400-1800 would be a more appropriate preindustrial baseline (Schurer et al. 2017). That would add 0.2 °C to the anthropogenic warming that has already occurred, leaving us with just 0.2 °C and almost no budget to go until 1.5 °C. So in summary, the assumption by Millar et al. that we still have 0.6 °C to go up to 1.5 °C is at the extreme high end of how you might estimate that remaining temperature leeway, and that is one key reason why their budget is large. The second main reason follows.
To exceed or to avoid…
Here is another problem with the budget calculation: the model scenarios used for this actually exceed 1.5 °C warming, and the 1.5 °C budget is taken as the amount emitted by the time the 1.5 °C line is crossed. Now if you stop emitting immediately at this point, global temperature will of course rise further – partly from the sheer thermal inertia of the oceans, but also because if you close down all coal power stations etc., aerosol pollution in the atmosphere, which has a sizeable cooling effect, will go way down while CO2 stays high. So with this kind of scenario you will not limit global warming to 1.5 °C. This is called a “threshold exceedance budget” or TEB – Glen Peters has a nice explainer on that (see his Fig. 3). All the headline budget numbers of Millar et al., shown in their Tables 1 and 2, are TEBs. What we need to know, though, are “threshold avoidance budgets”, or TABs, if we want to stay below 1.5 °C.
Millar et al also used a second method to compute budgets, shown in their Figure 3. However, as Millar told me in an email, these “simple model budgets are neither TEBs nor TABs (the 66 percentile line clearly exceeds 1.5 °C in Figure 3a), they are instead net budgets between the start of 2015 and the end of 2099.” What they are is budgets that cause temperature to exceed 1.5 °C in mid-century, but then global temperature goes back down to 1.5 °C in the year 2100!
In summary, both approaches used by Millar compute budgets that do not actually keep global warming to 1.5 °C.
How some media (usual suspects in fact) misreported
We’ve seen a bizarre (well, if you know the climate denialist scene, not so bizarre) misreporting about Millar et al., focusing on the claim that climate models have supposedly overestimated global warming. Carbon Brief and Climate Feedback both have good pieces up debunking this claim, so I won’t delve into it much. Let me just mention one key aspect that has been misunderstood. Millar et al. wrote the confusing sentence: “in the mean CMIP5 response cumulative emissions do not reach 545GtC until after 2020, by which time the CMIP5 ensemble-mean human-induced warming is over 0.3 °C warmer than the central estimate for human-induced warming to 2015”. As has been noted by others, this compares model temperatures after 2020 to an observation-based temperature in 2015, and of course the latter is lower – partly because it is based on HadCRUT4 data as discussed above, but equally because different points in time are being compared: the model figure refers to the point when 545 GtC is reached. But the standard CMIP5 climate models used here are not actually driven by emissions at all, but by atmospheric CO2 concentrations. For the historic period, these are taken from observed data. So the fact that 545 GtC is reached too late doesn’t even refer to the usual climate model scenarios; it refers to estimates by carbon cycle models, which are run in an attempt to derive the emissions that would have led to the observed time evolution of CO2 concentration.
Does it all matter?
We still live in a world on a path to 3 or 4 °C global warming, waiting to finally turn the tide of rising emissions. At this point, debating whether we have 0.2 °C more or less to go until we reach 1.5 °C is an academic discussion at best, a distraction at worst. The big issue is that we need to see falling emissions globally very very soon if we even want to stay well below 2 °C. That was agreed as the weaker goal in Paris in a consensus by 195 nations. It is high time that everyone backs this up with actions, not just words.
Technical p.s. A couple of less important technical points. The estimate of 0.93 °C above 1861-80 used by Millar et al. is an estimate of the human-caused warming. I don’t know whether the Paris agreement specifies to limit human-caused warming, or just warming, to 1.5 °C – but in practice it does not matter, since the human-caused warming component is almost exactly 100 % of the observed warming. Using the same procedure as Millar yields 0.94 °C for total observed climate warming by 2015, according to Hausfather.
However, updating the statistical model used to derive the 0.93 °C anthropogenic warming to include data up to 2016 gives an anthropogenic warming of 0.96 °C in 2015.
Statement by Millar and coauthors pushing back against media misreporting. Quote: “We find that, to likely meet the Paris goal, emission reductions would need to begin immediately and reach zero in less than 40 years’ time.”
118 Responses to "Is there really still a chance for staying below 1.5 °C global warming?"
Steven Mosher says
if there was an upvote button, I would upvote
Jai Mitchell says
It is also very crucial to include the most definitive estimates of additional carbon cycle feedbacks that have already been locked in due to current (and future) warming. Crowther 2016 showed that the midpoint projections of CO2 emissions from warming soils in a 2 °C warmer world are close to 300 GtC within 50 years, with considerably more afterwards. For a 3 °C world, extrapolation of his results shows a likely 400 GtC released over 50 years after 3 °C is reached.
These warming regimes will also drive rapid emissions from tropical peat, through tropical rainforest drying, and through heat stresses on boreal forests. We are already witnessing these events, and the CO2 emissions from the 1997-1998 El Nino impacts on Indonesian forests were equal to those produced by the entire United States during that year.
Clearly, if we want to honestly communicate our current condition, and how to avoid the worst of what is to come, we must include these feedbacks as well as utilize ‘Avoid’ budgets of carbon emissions going forward, NOT adjust how we measure things so that it ‘fits’ the policy.
In the history of human existence, we do not give up when faced with a potential threat that we understand to be present. We must be made aware of our current hazard so that we can collectively press our elected leaders for a World War II-scale total societal mobilization to reduce all carbon emissions, and to develop and implement atmospheric carbon sequestration through industrial and regenerative agricultural means, on a global scale!
Larry Edwards says
It is worth considering that at this point our salvation lies in putting pressure on the politics rather than in increasingly nuanced climate science discussions. As poignantly noted in RealClimate in 2009 (“Hit the Brakes Hard”), the time is late, and even more so now. With “1-in-150” of humanity affected by flooding in just one recent month (and not counting the later impact of Hurricane Maria), the matter before us really is how hard can we hit the brakes and how fast politically can we hit them.
You ask above, “But of course the question arises: how large is this budget?” But is this really the right question? Clearly, at least in my belief, we are over-budget already. This leads me to think that we need to leave behind us the slippery slope of carbon budgets and choices between 1.5 or 2 °C — the slippery slope established in recent decades that has led us directly into now already realized danger.
We need a lively debate on whether to leave these kinds of climate science efforts behind, and focus instead on socio-political measures to “hit the brakes hard” — whether those be carbon taxes, carbon rationing (probably the most direct and effective means), or whatever. Reality has eclipsed climate science — present reality is more compelling than the science has proved to be, and arguably for our salvation we must build from that instead.
My two cents. I hope we can discuss this here; my perception of reality compels me to raise this.
Larry Edwards says
With interruptions I overlooked Stefan’s closing paragraph: “Does it all matter?” Glad to see this question posed — the most important paragraph in his article, I think. It raises further questions: what should we really be doing in the sciences (broadly, not just climate and physics) and in society, at this late point?
Thanks for those valuable explanations. Even amateur followers of the debate like me can get the essence of that.
Once again the Daily Mail gets hysterical, and jumps to all sorts of silly conclusions, doesn’t understand what it reads, and also misses the important points.
My understanding is that temperatures are still a little under what climate models generally predict on the whole, as shown even in data sets other than HadCRUT. But according to this research in Nature below, the world’s oceans have absorbed more energy than expected and effectively delayed temperatures a bit, but this process may be coming to an end. We could see a sustained increase in temperatures later this century in a sort of rebound effect.
So climate modelling may not be perfect in the timing, but the end result would still be the same level of temperatures around 2100. We therefore don’t have any room to think emissions can continue at present levels, and the amount of carbon left to burn would not be as high as 800 gigatons.
[Response: Look at Hausfather’s graph of warming by 2015, relative to 1880-1900 baseline, in the CMIP5 model mean relative to five leading global temperature data sets. Only when compared to HadCRUT do the models overestimate the warming, against all four other data sets the models underestimate the warming. Remember the light grey bars are the proper apples-to-apples comparison. -stefan]
Alastair McDonald says
When are people going to realize that we have already passed a tipping point where the Arctic sea ice, Greenland ice sheet, and the West Antarctic ice sheet are all going to melt away? The level of CO2 is causing melting, and unless we reduce that level those ice sheets will disappear.
That means that global albedo will fall, and the Earth will warm by well above 2 °C.
Gavin is issuing a challenge next week:
at the Royal Society meeting on hyperthermals. I can only hope it is a call on all of us to accept the disastrous state we have got into by trusting in democracy.
Doesn’t it seem a bit like cherry picking to point to strong El Nino years like 2015 and 2016, say ‘oh look, these years barely bring us back to the multimodel mean’, and then use that to suggest that there is no discrepancy between models and observations? At least try to remove El Nino from the temperature trend before comparing it to the multimodel mean.
[Response: Indeed. That is what was done. -Stefan]
Barton Paul Levenson says
AM 6: I can only hope it is a call on all of us to accept the disastrous state we have got into by trusting in democracy.
BPL: It wasn’t democracy that elected Trump. That election was pretty heavily interfered with.
Dan Miller says
#2 Jai: In addition, we should add feedbacks from permafrost and shallow clathrates in the Arctic Sea. Other likely feedbacks not included in climate models are forest diebacks and reductions in aerosols (mentioned in the post). In general, climate predictions are “best case” analyses because climate scientists (actually all scientists) do not include things that they are not sure about. This is fine for most science, but not for existential threats. Imagine if the military fought wars this way!
#3 Larry: The best way to rapidly reduce emissions is a Fee and Dividend carbon policy, where a rising price on carbon is paid by fossil fuel companies and 100% of the revenues collected are given back to every legal resident. It’s the only pricing policy that will get the fee to $100/ton and beyond. At $100/ton, Direct Air Capture becomes commercially viable. See my TEDx for more:
David Appell says
“If you keep emitting CO2 for another ten years, CO2 levels in the atmosphere will increase further for another ten years.”
Is this a general rule of thumb? Is it true that if you keep emitting CO2 for another X years, CO2 levels in the atmosphere will increase further for another X years, for any X?
[Response: No – and I think you know it. That was short-hand for saying that if you delay emissions reductions by ten years, you not only delay solving the climate problem by ten years but you inevitably end up with a higher CO2 level that won’t go away. -Stefan]
Kevin McKinney says
#7, -1=e^ipi says:
Oddly enough, Stefan, the author of this post, has considerable experience with that:
Mr. Know It All says
HO LEE COW!
0.93 C, 0.94 C, 0.96 C.
Mighty accurate models y’all are using. Amazing!
Reminds me of the guy at work who would write the dimensions on his drawings exactly as his calculator indicated – to 8 digits right of the decimal point! :)
Rudy Sovinee says
I’ll start with what hasn’t been mentioned: fuel subsidies. We must ADD IN the ending of subsidies – which promote consumption and primarily benefit the rich – details in my post
Stop the subsidies and put a cost on carbon, applied at the well or mine (including leakages). On this, #9 Dan is right: the Fee & Dividend option is a persistent and granular means of getting people to look to reduce carbon.
When I saw the title, I wondered what level of capitulation was going to be promoted so as to continue this destructive addiction a little longer, digging our graves yet deeper. First, there is latent warming in the system from the CO2 present, which is far from equilibrium in balancing absorption against radiant heat to space. Then too, #2 Jai is right: there are already visible signs of amplifying feedbacks.
Overall, this paper in Geoscience is a dangerous distraction of calculating how close to the cliff we (might) still be able to go.
8 BPL, you will hear me cheering from all the way across the Pacific the day you become far less myopic, not so narrowly constrained, and basically far less biased and more open-minded about reality.
America’s habit of blaming everyone else in the world for their own failures and the eternal internalised rationalising away of the high degree of entrenched cultural paranoia is simply unsustainable these days. In my opinion.
Instead, humanity really needs the more intelligent science orientated ethical people in America to realise that the USA is in many respects so utterly disconnected from the real world they may as well be living on Pluto sometimes. :-)
9 Dan Miller, now 3 years later in 2017, how’s that working out for you now? What’s changed? :-)
BPL said interfered with. I assume you mean Russians?
Bull, if so.
The election was Trump’s as soon as Bernie got railroaded out of the nomination.
Even if there was Russian messing around, it wouldn’t have beaten Bernie’s double-digit margin.
Americans are not very bright these days.
Russell Seitz says
Which decade is he betting on to justify his view?
Sapient Fridge says
“you need to always check whether budgets are given in billion tons of carbon (GtC) on billion tons of carbon dioxide”
Typo of “or”? Looks like it should be “or billion tons of carbon dioxide”?
[Response: Thanks, fixed.]
Adam Lea says
Regarding the possibility that human emission-related warming will prevent the next ice age, I read somewhere that technically we are still in an ice age – the interglacial part of it – and that an ice age is defined as when the Earth has permanent snow and ice at the poles. Is this correct?
[Response: That is a question about words – I’m not even sure how “ice age” is usually defined in English. In German we distinguish “Eiszeitalter” and “Eiszeit”, the former consisting of a whole sequence of glacials and interglacials whereas the latter is a glacial. What I was saying, technically, is that the Holocene interglacial would normally last another 50,000 years until the next glacial sets in, based on the Milankovitch cycles which cause these glacial cycles. But in 50,000 years CO2 levels may well still be high enough, after our fossil fuel gluttony, that glaciation is prevented. -Stefan]
Mark Roest says
We need to do a ‘full court press’ to switch from fossil fuels and nuclear (which also releases heat into water on a large scale, and supports nuclear military programs), using existing and near-commercialization battery, solar and wind technology, and design systems to cost-effectively convert the existing 600-million-plus vehicle fleet rather than waiting for replacement.
To minimize the continued use of fossil fuels, we also need to find another source of chemical feed-stocks. A good one is mechanically thinning excess fuel in forests that were subjected to fire suppression for 100 years of bad US Forest Service policy, and chipping non-lumber-quality wood to feed into a pyrolysis refinery that captures the vapors and makes a liquid fuel / feed-stock plus biochar. Bury the biochar in the garden and increase its resilience to drought and nutrient shortfalls for perhaps a thousand years, as it also sequesters carbon for that length of time. All Power Labs in Berkeley, CA makes a pallet-size refinery (20kW if used as fuel to power an included engine and generator) and one in a 20-foot container (100 kW). It’s well-sealed, and cost-effective enough that with old pricing and free fuel, they said it was just 2 cents a kWh levelized cost.
Ed Davies says
I think it’s always worth putting these sorts of numbers (billions of things for the world/country/etc) in a per capita context for better understanding.
887 GtCO₂ is the emissions from burning 285.6 Gt of kerosene (e.g., aviation fuel, home heating oil or the like, approximating it as C₁₂H₂₆, and not dramatically different for diesel or petrol/gasoline). Sharing equitably amongst the approx. 7 Gpeeps currently on the planet, that’s about 40 tonnes each. For everything: food production, heating, transport, wars… For the rest of your life. And your share of your descendants’ lives, yea, unto the seventh generation.
So maybe even if the budget available does turn out to be as high as this paper indicates people need to think carefully about what to do with their shares.
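The arithmetic in the comment above can be checked with a short script. The molar masses are standard; the C₁₂H₂₆ fuel approximation and the 7-billion-person split are the commenter’s own assumptions.

```python
M_C, M_H, M_O = 12.011, 1.008, 15.999    # standard atomic masses (g/mol)

m_fuel = 12 * M_C + 26 * M_H    # one mole of C12H26 (kerosene proxy)
m_co2  = 12 * (M_C + 2 * M_O)   # the 12 moles of CO2 it burns to

co2_per_fuel = m_co2 / m_fuel          # tonnes of CO2 per tonne of fuel
fuel_gt      = 887 / co2_per_fuel      # Gt of kerosene for 887 GtCO2
per_capita_t = fuel_gt * 1e9 / 7e9     # tonnes per person, ~7 billion people

print(f"{co2_per_fuel:.2f} t CO2/t fuel, {fuel_gt:.0f} Gt fuel, {per_capita_t:.0f} t each")
```

The numbers come out at roughly 286 Gt of fuel and about 41 tonnes per person, matching the comment.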
Robert W says
Dear prof. Rahmstorf,
Excellent writing, as usual. I have one question: you write that using Berkeley Earth data reduces the budget by ~40%, which is significant. But regarding their budget definition, if the TAB definition were used instead, can you estimate how much their budget would be reduced then?
When it comes to the budget definitions, my understanding is that they of course should be aligned to be able to do a proper comparison. Is that right?
Keep up the good work!
[Response: I can’t estimate that; it is complicated and depends e.g. on what happens with aerosol load of the atmosphere when you reduce CO2 emissions, how the carbon cycle responds, ocean thermal inertia. You really need to run the models with scenarios that actually stay below 1.5 °C and then diagnose the emissions. I’m sure papers that do that properly will appear. -Stefan]
Roger Albin says
Excellent post – thank you.
Alastair McDonald – We’re in this situation partly because of insufficient democracy in the USA. What if the US were an ordinary majoritarian democracy instead of having an electoral college system that skews voting impacts? Al Gore instead of Bush II; Hillary Clinton instead of Trump. Not to mention the ways in which the wealthy exercise outsize influence on American elections.
Jai Mitchell says
Permafrost emissions are included in the Crowther 2016 analysis, and current studies do not indicate a major clathrate release at temperatures below +4.0 °C above preindustrial on any major level. After that the result is “unknown”. Forest dieback is a major result of a potential shift to a perpetually positive IPO, which is represented in some models. The methodology of this re-estimate of carbon budgets relies on recent cooling that was caused by shifts in SO2 emissions from the western hemisphere to the eastern in the 2000s. See these two papers: http://iopscience.iop.org/article/10.1088/1748-9326/aa5dd8 and (especially!) http://www.nature.com/nclimate/journal/v6/n10/full/nclimate3058.html
Thank you, Stefan. Pretty concise and clear. I appreciate the comments above about the various tipping points that are already getting pushed. Albedo, clathrates, CO2 emissions from warmed permafrost, etc. Stefan hits the nail on the head with “does it matter?” It does not. We have a cliff ahead, we need to hit the brakes hard. The growthers all seem intent on making sure we use only enough brakes to stop with our toes over the precipice. This seems prudent to growthers. I wonder how many growthers have actually walked up to a precipice and stopped with their toes over the edge?
Hit the brakes. Hit the brakes hard. Now.
September 10 – 16, 2017 403.70 ppm
1 Year Ago
September 10 – 16, 2016 401.17 ppm
10 Years Ago
September 10 – 16, 2005 380.59 ppm
Res ipsa loquitur.
Jai Mitchell says
Technical P.S. to my #10 above and the P.S. in the article:
WRT human contribution to observed warming circa 2014 Gavin Schmidt indicates the human contribution is likely 110% of observed due to the cooling effect of anthropogenic SO2 emissions (and other). see: https://www.realclimate.org/index.php/archives/2014/08/ipcc-attribution-statements-redux-a-response-to-judith-curry/
I note that the Millar paper adopts some methodologies used in the climate denier sphere to base projections on ECS using past observed temperatures (this was very common during the faux ‘pause’).
For David Appell: Humans currently produce about 10 GtC (~37 GtCO2) per year, while ocean and terrestrial processes remove about 5 GtC. If we stabilize our CO2 output now, we would get roughly a 2 ppmv increase in CO2 for each year that we continue emitting. In order for the atmospheric CO2 level to stabilize, we need to cut emissions roughly in half.
[Response: Initially, yes. But then the CO2 sinks start to saturate, they stop taking up so much, and you need to cut the emissions further. -Stefan]
...and Then There's Physics says
I’m not even sure that that is correct. To stabilise emissions would require continually decreasing emissions. If we simply cut emissions in half, atmospheric concentrations will continue to rise (although, an instantaneous cut in half may cause them to drop initially).
...and Then There's Physics says
Correction to the above comment – “to stabilise *concentrations* would require continually decreasing emissions”.
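The exchange above can be illustrated with a toy box model in which sinks remove a fixed flux that slowly weakens over time, a crude stand-in for sink saturation (real sinks respond to the concentration itself; every number here is illustrative).

```python
def run(years, emissions_gtc, sink_gtc=5.0, sink_decay=0.0):
    """March atmospheric CO2 forward; 2.13 GtC ≈ 1 ppm. Sinks remove a fixed
    flux that weakens by `sink_decay` GtC/yr each year (assumed saturation)."""
    ppm = 405.0
    for _ in range(years):
        ppm += (emissions_gtc - sink_gtc) / 2.13
        sink_gtc = max(0.0, sink_gtc - sink_decay)
    return ppm

print(round(run(50, 10.0), 1))                  # unchanged emissions: CO2 keeps rising
print(round(run(50, 5.0), 1))                   # halved emissions, stable sinks: flat
print(round(run(50, 5.0, sink_decay=0.03), 1))  # halved emissions, saturating sinks: rises again
```

With stable sinks, halving emissions holds concentrations flat at first; once the sinks weaken, concentrations creep up again without further cuts, which is the point of Stefan’s response.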
Mark Goldes says
Breakthrough energy technology, can replace fossil fuels fast. 24/7 fuel-free engines will soon power generators of all sizes that need no fuel. A converted Ford engine proved the concept. A second engine has been converted and will be ready for validation by an independent laboratory.
Inexpensive engines are being designed. They can be made of polymers (plastics) since there is no combustion.
The science expands the Second Law of Thermodynamics. Therefore, few believe such engines are possible. A White Paper is available. The work reflects 27 years of effort. See aesopinstitute.org to learn more.
Imagine power generation at every scale operating 24/7 without fuel – homes, buildings & industry.
Cars, trucks, boats, ships and aircraft, having unlimited range, with no need for fuel.
Similar engines will self-power refrigeration, heating and air-conditioning.
Trolls are certain such work reflects fraud and dishonesty, making efforts to find urgently needed small financial support a nightmare. A bold soul can accelerate this potentially life saving effort.
Substantial capital for commercialization world-wide is pending.
Political realities indicate only unusually rapid development of this (and parallel) revolutionary science and technology can improve the odds for human survival on this dangerously warming planet.
Action now can greatly speed this urgent effort!
Jonathan Richards says
Scientists love to discuss how many angels can dance on the head of a pin. They can compute the brand of pin, its circumference, its density, and even the size of the angels. But what they can’t seem to grasp is that the entire discussion is irrelevant from the onset.
What they’re doing may seem important, but it only adds fuel to the fire. And waste increasingly valuable (irreplaceable) time.
We’re in for the fight of our lives (really) and that is the message and urgency that needs to be conveyed. All this useless discussion about “budget” simply enables more of the very same destructive practices (burning carbon) as before.
It also entirely misses the chaos and suffering and destruction already caused by climate change – with more to come. In effect, the “budget” arguments tend to reduce the observable reality into a quasi-spreadsheet of assumptions, negating the real costs already being experienced in lives and dollars.
The entire argument is IRRELEVANT to put too-fine a point on it and always has been. It is another red-herring being used by interest groups to further another absolutely useless debate which we all know as “is climate change real” and “are humans causing this?”.
These are distractions. The reality is we do not “have a budget” and never did. To claim that we do is to say that “we can safely kill xx number of humans, and acidify the world’s oceans to xx levels” while also falsely claiming that surface temperatures can be stabilized for civilization because “we will stop emitting more carbon”.
We have not stopped emitting carbon, and probably can’t. Any “budget” is in effect, too much carbon as the evidence actually shows. We’ve already blown past the “no effect on the planetary environment or climate” as the evidence also shows. There is no “budget” and never was.
Enabling the budget arguments simply enables more delay, more inaction, more denial, more refusal since there are contradictory claims that indicate we’ve got “more time” and “more allowable” emissions. How many people need to die before these numbers finally reach parity?
Apparently, all indications are that many, many more people need to die before we can finally accept the fact that there never really was any kind of a budget. This is an irrelevant argument, and has always been a red herring.
So quite frankly, I do not care for the “science” behind these false claims. They are based on an enormous amount of assumptions that totally fail to accept the unfolding reality we already have. It matters not how many of these angels can dance on the head of this pin and never, ever did.
Zeke Hausfather says
As Stefan mentioned, the 2015 model/obs comparisons use the Otto et al 2015 approach of isolating the forced anthropogenic and natural components from the data. You can find their spreadsheet here: https://t.co/15IS9IWdQJ
In general, comparing warming from models and obs since the mid-1800s isn’t ideal, since there is large observational uncertainty prior to 1900. Comparisons like this over particular periods of interest (e.g. the mid-2000s ‘slowdown’) tend to be a lot more informative: https://s26.postimg.org/xi5sivcp5/simple_comp_1970_2020_1970_2000_baseline.png
[Response: … where, of course, there is nothing statistically significant about this “slowdown”, not even close. It is well within the usual short-term variability, see http://iopscience.iop.org/article/10.1088/1748-9326/aa6825?fromSearchPage=true (I teamed up with two professional statisticians there, just to make sure it is absolutely sound). -Stefan]
The following is a good overview of the idea that greenhouse gases will prevent the next ice age. Detailed enough to be useful without being a long research paper, paywalled etc.
It also appears that early human agriculture may have changed the CO2 profile just enough to have already had a significant effect. The main point is that we have already emitted enough greenhouse gases to prevent an ice age, or at least to seriously reduce its impacts.
Al Bundy says
Perhaps liquidity (financial) should be included in the analysis. CO2 is “guaranteed” to be emitted as soon as the vehicle is bought, the furnace is installed, or the power plant built. Using that metric, there’s no way in 7734 this globe is staying under 1.5C. I mean, seriously, landfilling most of our infrastructure is probably less likely than using genocide to solve the issue.
And it’s not just infrastructure. Virtual money is more important than planetary reality. They’re still fighting over who gets to add more drilling rights when ~85%* of what’s already “in the books” can’t be safely extracted.
*That 80% estimate has aged, so I fudged it up a tad.
Thomas: America’s habit of blaming everyone else in the world for their own failures and the eternal internalised rationalising away of the high degree of entrenched cultural paranoia is simply unsustainable these days. In my opinion.
AB: True. And besides, it was an inside job. When the bank employee unlocks the door and opens the vault… And the Russian stuff was minor compared to emails, lies, et al. This reminds me of denialists. (IT’S THE SUN!!!)
It also reminds me of Gandhi’s answer to, “What do you think of Western civilization?” “I think it would be a good idea.”
Al Bundy says
Dan Miller: Imagine if the military fought wars this way!
AB: That would be great!! Imagine if there was a war in, say, Vietnam, Iraq, or Afghanistan and (as the song says) nobody came. The USA’s wars are to enrich, enrage, and elect. Talk about an oxymoron – “Defense spending”.
Barton Paul Levenson says
Th 14: BPL, you will hear me cheering from all the way across the pacific the day you become far less myopic, not so narrowly constrained and basically far less biased and more open-minded about reality.
BPL: Thomas, you will hear me cheering from all the way across the internet the day you admit you’re a Chekist and announce your plans to defect to the west.
Barton Paul Levenson says
K 16: BPL said interfered with. I assume you mean Russians?
Bull, if so.
BPL: And another Chekist heard from. Guys, unless you live in America, like I do, don’t advise me on how American elections work. I have actually worked in American elections. You, on the other hand, apparently get your information from Pravda. In all sincerity, you and Thomas can both bite me.
@31 Jonathan Richards: “Science is irrelevant!”
Right. Got it.
As Roy Spencer has taken it to heart, expect Administrator Pruitt and President Trump to follow suit.
Larry Edwards says
Dan Hill (#9, in response to my #3), I am all too familiar with Fee & Dividend and did listen to your TEDx. F&D may have been worth a try 20 years ago. But at this late date, because F&D is an indirect market-based-mechanism of questionable efficacy, it fails the need for a certain and rapid reduction in CO2 emissions. Well-off people will be able to pay the higher prices, and the 50-70% of people who will gain financially (dividend $ > higher prices) will have more money to spend, likely leading to higher carbon emissions or surely not as much reduction as predicted. I have read all of CCL’s documentation, and these factors have not been taken into account in the rosy predictions. And there are other problems with F&D.
In contrast, carbon rationing is a mechanism that directly limits the sale of fuels, and it can be implemented with little lead time, as was done in the UK and US in WW-II. A firm schedule for rationing can be established at the outset, for an agreed carbon budget. For example, the schedule could be a straight-line decline from present levels to zero at a predetermined date. We know how to ration successfully, from the WW-II experience. Combined with price controls to curb potential inflation, rationing is fair to all, and past experience is that it enjoyed popular support. A reference with ample documentation is Stan Cox (2013), "Any way you slice it: The past, present and future of rationing," very readable.
Eric Swanson says
#30 Mark Goldes – The “AESOP Institute” appears to be a scam site.
#27 Mitch said For David Appell. Humans currently produce about 10 GtC (~37 GtCO2) per year, while ocean and terrestrial processes remove about 5 GtC. If we stabilize our CO2 output now, we would get roughly a 2 ppmV increase in CO2 for each year that we continue emitting. In order for the atmospheric CO2 level to stabilize, we need to cut emissions roughly in half.
[Response: Initially, yes. But then the CO2 sinks start to saturate, they stop taking up so much, and you need to cut the emissions further. -Stefan]
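For what it's worth, the round numbers Mitch cites can be checked in a few lines of Python. This is only back-of-envelope arithmetic, not a carbon-cycle model (the ~2.12 GtC per ppm conversion factor is assumed, and the constant 5 GtC/yr sink is exactly the simplification the inline response cautions against):

```python
GTC_PER_PPM = 2.12  # assumed conversion: ~2.12 GtC raises atmospheric CO2 by 1 ppm

def annual_ppm_rise(emissions_gtc, sinks_gtc):
    """Net atmospheric CO2 growth in ppm per year, given annual fluxes in GtC."""
    return (emissions_gtc - sinks_gtc) / GTC_PER_PPM

# Current situation: ~10 GtC emitted, ~5 GtC absorbed -> roughly 2.4 ppm/yr rise
bau = annual_ppm_rise(10.0, 5.0)

# Halve emissions to match the sinks: net flux is zero, so the concentration
# stabilizes -- but only initially, before the sinks start to saturate.
halved = annual_ppm_rise(5.0, 5.0)

print(round(bau, 2), halved)
```

The point of the sketch is just that "cut emissions in half" follows directly from emissions being roughly double the sinks; it says nothing about how the sink term evolves afterwards.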
Given that we are expecting up to 10 ft of SLR already, and all the other baddies going on, why even talk about these things other than to address the math? The only safe and sane choice is to return to sub-300 ppm. I will say this again, and would like it if a site owner would respond to this point: the poles started melting when we were at roughly 315 ppm, which, if we consider climate lag times, means the melting was a response to roughly 300 ppm.
How can anything else be thought of as safe?
#31 Jonathan Richards said The reality is we do not “have a budget” and never did. To claim that we do is to say that “we can safely kill xx number of humans, and acidify the world’s oceans to xx levels” while also falsely claiming that surface temperatures can be stabilized for civilization because “we will stop emitting more carbon”.
We have not stopped emitting carbon, and probably can’t. Any “budget” is in effect, too much carbon as the evidence actually shows.
As I have said for too many years. Talk of a carbon budget is a risk assessment not worth the paper it is written on. That said, I would never tell scientists to stop trying to understand the system. I do wish, however, they would say when announcing new research, or responding to it, whether an issue is of merely scientific significance or is also significant to policy discussions. The paper in question and all responses fall in the former category.
Someone say so. Unambiguously, publicly. There is no damned carbon budget. I am telling you as a systems designer, it is already nearly impossible to design a homestead for anyone anywhere and be confident it will not need complete revision in as little as ten years.
What you are saying there is quite frankly very misleading and obnoxious towards current scientific observations and predictions. The effects of global warming have been foretold for almost five decades (if memory serves, the first climate models were created back in the late 60's), and those models computed current warming to a very high level of accuracy.
See, what YOU are doing is causing inaction. By twisting the realities and scientific facts which ordinary people and scientists know (and have known, again, for decades), it effectively cuts off potential efforts and research used to help solve this crisis.
Stop spouting your alarmism and please think rationally about current events, without the panicked undertones; it is understandable to be anxious, but not to fearmonger.
Alastair McDonald says
BPL 8: It wasn’t democracy that elected Trump. That election was pretty heavily interfered with.
AM: And the climate debate is being even more heavily interfered with e.g. Climategate.
Kevin McKinney says
Re various comments on electoral interference in the USA:
By far the most significant dimension of this was the activity of the “Kochtopus”–the ecosystem of think-tanks, academic endowed programs, and ‘weaponized philanthropy’ developed over the last 35 years, and specifically aimed at molding the political system of the US into something pleasing to highly conservative elites (and, not coincidentally, their business interests.)
This complex is known to have spent amounts on recent election cycles that are comparable to, and sometimes much greater than, the money spent by the traditional parties. And reportedly their use of voter data is also superior, certainly to that of the GOP, who basically gave up and started using the Koch info. Needless perhaps to say, many of these folks are in the fossil fuel business, or other businesses benefiting from fewer environmental protections.
Short version: Trump isn’t a fluke; he is a product.
All this is documented at book length in Jane Mayer’s “Dark Money” (and in other sources as well.)
Dan H. says
Yes. We do not need to eliminate our emissions entirely. We just need to reduce them to match the terrestrial removal, which increases with increasing atmospheric concentrations. We can stabilize at a higher level.
Donna Albert says
Something I don’t see discussed: model results tend to be expressed in probabilities. If I’m remembering correctly, the modeling results the authors of the study rely on to calculate the CO2 budget give a 67% chance of staying below the given amount of warming. So are we all OK with a 2 out of 3 chance of achieving the goal? What are the consequences at a probability of 1 out of 10? 3 degrees and runaway feedbacks? What consequences at 1 in 100? I would not get on a plane with a 1 in 100 chance of crashing. All of humanity is on this plane, and I suspect our risk is already unacceptable. I am with those who are calling for us to step on the brakes hard. I have stopped following climate studies closely and focused more on communicating the urgency of stopping the burning of fossil fuels as soon as possible.
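The point about probability levels can be made concrete with a toy calculation. This is purely illustrative and not taken from the paper under discussion: it assumes the transient climate response to cumulative emissions (TCRE) is normally distributed with the AR5 "likely" (17-83%) range of 0.8-2.5 °C per 1000 GtC, and assumes 0.9 °C of warming to date. Both numbers are placeholders, chosen only to show how the budget shrinks as the demanded probability of success rises:

```python
from statistics import NormalDist

# Illustrative TCRE distribution: mean 1.65, with the 17-83% range 0.8-2.5
# degrees C per 1000 GtC (assumed normal; all numbers are placeholders).
_z83 = NormalDist().inv_cdf(0.83)
TCRE = NormalDist(mu=1.65, sigma=(2.5 - 0.8) / (2 * _z83))

def budget_gtc(target=1.5, so_far=0.9, p_success=0.67):
    """Emissions (GtC) compatible with probability p_success of staying under target."""
    # Staying under target with probability p requires dividing the remaining
    # warming by the p-quantile of TCRE, not its mean.
    return (target - so_far) / TCRE.inv_cdf(p_success) * 1000

for p in (0.50, 0.67, 0.90):
    print(p, round(budget_gtc(p_success=p)))
```

The output falls monotonically as `p_success` rises, which is exactly the commenter's point: a "67% budget" is substantially larger than a "90% budget" for the same temperature target, so the headline number depends on how much risk we accept.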
Solar Jim says
We have so far emitted around two trillion tons of what historically has been called "carbonic acid gas" and currently emit around forty billion tons annually – not to mention the current buildout of "clean safe natural gas," i.e. fracked fossil methane (a gangplank to the future?).
Agree with Mr. Richards (31) that a "budget" for 1.5 °C has already been exceeded – due e.g. to the "thermal response factor" (Hansen), present aerosol dimming, further emissions during the energy transition, unfolding climate feedbacks, and the planetary response to the actual total carbon dioxide equivalent.
“Budget” angels on a pinhead, but thanks for the post and regards.
The methodology used in Otto et al. 2015, from what I can tell, seems inadequate for removing bias, particularly if your data set ends with a strong El Niño year. The statistical model used (sorry if I'm mistaken about this) doesn't appear to allow for autocorrelation of residuals and simply treats El Niño and all other forms of internal variability as white noise.