
Can planting trees save our climate?

Filed under: — stefan @ 16 July 2019

In recent weeks, a new study by researchers at ETH Zurich has hit the headlines worldwide (Bastin et al. 2019). It is about trees. The researchers asked themselves the question: how much carbon could we store if we planted trees everywhere in the world where the land is not already used for agriculture or cities? Since the leaves of trees extract carbon in the form of carbon dioxide – CO2 – from the air and then release the oxygen – O2 – again, this is a great climate protection measure. The researchers estimated that about 200 billion tons of carbon could be stored in this way – provided we plant over a trillion trees.

The media impact of the new study was mainly based on the statement in the ETH press release that planting trees could offset two thirds of the man-made CO2 increase in the atmosphere to date. To be able to largely compensate for the consequences of more than two centuries of industrial development with such a simple and hardly controversial measure – that sounds like a dream! And it was immediately welcomed by those who still dream of climate mitigation that doesn’t hurt anyone.

Unfortunately, it’s also too good to be true. Because apples are compared to oranges and important feedbacks in the Earth system are forgotten. With a few basic facts about the CO2 increase in our atmosphere this is easy to understand. Mankind is currently blowing 11 billion tonnes of carbon (gigatonnes C, abbreviated GtC) into the air every year in the form of CO2 – and the trend is rising. These 11 GtC correspond to 40 gigatonnes of CO2, because the CO2 molecule is 3.7 times heavier than the C atom alone. Since 1850, the cumulative total has been 640 GtC – of which 31% is from land use (mostly deforestation), 67% from fossil energy and 2% from other sources. All these figures are from the Global Carbon Project, an international research consortium dedicated to the monitoring of greenhouse gases.
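The unit conversion and the emissions breakdown are easy to check with a few lines of arithmetic (a sketch; the numbers are simply the Global Carbon Project figures quoted above):

```python
# Back-of-envelope check of the figures quoted above.
# Molar masses: C = 12 g/mol, CO2 = 44 g/mol (12 + 2*16),
# so a tonne of carbon corresponds to 44/12 ≈ 3.67 tonnes of CO2.
C_TO_CO2 = 44.0 / 12.0

annual_gtc = 11.0  # current emissions, GtC per year
print(f"{annual_gtc * C_TO_CO2:.0f} Gt CO2 per year")  # ≈ 40

# Cumulative emissions since 1850, split by source
total_gtc = 640.0
print(f"land use:     {0.31 * total_gtc:.0f} GtC")
print(f"fossil fuels: {0.67 * total_gtc:.0f} GtC")
print(f"other:        {0.02 * total_gtc:.0f} GtC")
```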

The result is that the amount of CO2 in our air has risen by half and is thus higher than it has been for at least 3 million years (Willeit et al. 2019). This is the main reason for the ongoing global warming. The greenhouse effect of CO2 has been known since the 19th century; it is physically understood and completely undisputed in science.

Room for more trees? Sheep grazing on deforested land in New Zealand. (Photo S.R.)

But: this CO2 increase in the air is equivalent to a total of just under 300 GtC, although we emitted 640 GtC! This means that, fortunately, less than half of our emissions remained in the atmosphere; the rest was absorbed by oceans and forests. Which incidentally proves that the CO2 increase in the atmosphere was caused entirely by humans. The additional CO2 does not come from the ocean or anywhere else in nature. The opposite is true: the natural Earth system absorbs part of our CO2 burden from the atmosphere.

Conversely, this also means that if we extract 200 GtC from the atmosphere, the amount in the atmosphere does not decrease by 200 GtC, but by much less, because oceans and forests buffer this as well. This, too, has already been examined in the scientific literature: Jones et al. 2016 found that the amount of carbon removed from the atmosphere amounts to only 60% or less of the negative emissions, when these are implemented against the background of a mitigation scenario (RCP2.6).
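The buffering works out numerically like this (a sketch using only the figures already given in the text):

```python
# Roughly 300 of the 640 GtC emitted since 1850 stayed in the air.
airborne_fraction = 300.0 / 640.0
print(f"airborne fraction ≈ {airborne_fraction:.2f}")  # just under one half

# Jones et al. (2016): only ~60% (or less) of negative emissions
# show up as an actual reduction of atmospheric CO2.
removed_gtc = 200.0
effective_gtc = 0.60 * removed_gtc
print(f"effective atmospheric reduction ≈ {effective_gtc:.0f} GtC")  # 120
```

So even a full 200 GtC of tree-planting removal would lower the atmospheric load by something like 120 GtC, not 200.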

We can also compare the “negative emissions” from tree planting to our other emissions. The 200 GtC would be less than one third of the 640 GtC total emissions, not two thirds. And the authors of the new study say that it would take fifty to one hundred years for the thousand billion trees to store 200 GtC – an average of 2 to 4 GtC per year, compared to our current emissions of 11 GtC per year. That’s about one-fifth to one-third – and this proportion will decrease if emissions continue to grow. This sounds quite different from the prospect of solving two-thirds of the climate problem with trees. And precisely because reforestation takes a very long time, it should be taboo today to cut down mature, species-rich forests, which are large carbon reservoirs and a valuable treasure trove of biological diversity.
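The rate comparison in the paragraph above can be spelled out in a few lines (a sketch; the 50-100 year storage time is the study authors' own estimate):

```python
# 200 GtC stored over 50-100 years, compared with current emissions.
stored_gtc = 200.0
current_emissions_gtc = 11.0  # GtC per year

for years in (50, 100):
    rate = stored_gtc / years          # average uptake, GtC per year
    share = rate / current_emissions_gtc
    print(f"over {years} yr: {rate:.0f} GtC/yr = {share:.0%} of current emissions")
```

That reproduces the "2 to 4 GtC per year" and "one-fifth to one-third" figures quoted above.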

There is another problem that the authors do not mention: a considerable part of the lands eligible for planting lies in the far north, in Alaska, Canada, Finland and Siberia. Carbon can be stored there with trees, albeit very slowly – but this would be counterproductive for the climate. For in snowy regions, forests are much darker than snow-covered unwooded areas. While the latter reflect a lot of solar radiation back into space, the forests absorb it and thus increase global warming instead of reducing it (Bala et al. 2007, Perugini et al. 2017). And increased regional warming of the Arctic permafrost areas in particular would be a terrible mistake: permafrost contains more carbon than all trees on earth together, around 1,400 GtC. We’d be fools to wake this sleeping giant.

And there are other question marks. Using high-resolution satellite maps and Google Earth, the researchers have analyzed where there is a suitable place for forests where none is currently growing, leaving out farmland and cities. With the help of machine learning technology, natural areas around the world were evaluated to determine the climate and soil conditions under which forests can thrive. The free and suitable land areas found in this way amount to 1.8 billion hectares – as much as the combined area of China and the USA.

But for many of these areas, there are probably good reasons why there is currently no forest. Often they are simply grazing lands – the authors respond that they have only assumed loose tree cover there, which could even be beneficial for grazing animals. The Dutch or Irish pastures would then resemble a savannah. Nevertheless, there are likely to be considerable obstacles of very different kinds on many of these areas, which are not apparent from the bird’s-eye view of the satellites. The authors of the study also write that it is unclear how much of the areas found would actually be available for planting.

Therefore, I’d still consider it optimistic to assume that half of the calculated theoretical planting potential can be realized in practice. Then we’re talking about 1-2 GtC of negative emissions per year. But that is precisely what we will urgently need in the future. Current global CO2 emissions can be reduced by 80-90% through transforming our energy, heating and transport systems – but a residual will remain that is hard to get rid of (e.g. from agriculture, industrial processes and long-haul flights) and that we will have to offset in order to stabilize the global climate.

The study by the ETH researchers has another important result that has hardly been reported. Without effective climate protection, progressive warming will lead to a massive loss of existing forest cover, especially in the tropics. At the same time, the models are not yet able to make reliable statements on how forests can cope with new extremes, fire, thawing permafrost, insects, fungi and diseases in a changing climate.

Global warming threatens massive forest losses (red), especially in the tropics. Fig. 3 from Bastin et al., Science 2019

The massive planting of trees worldwide is therefore a project that we should tackle quickly. We should not do that with monocultures but carefully, close to nature and sustainably, in order to reap various additional benefits of forests on local climate, biodiversity, water cycle and even as a food source. But we must not fall for illusions about how many billions of tons of CO2 this will take out of the atmosphere. And certainly not for the illusion that this will buy us time before abandoning fossil fuel use. On the contrary, we need a rapid end to fossil energy use precisely because we want to preserve the world’s existing forests.


Would a large-scale tree restoration effort stop climate change? Forest expert Marcus Lindner from EFI points to the fires in Russia and the success story in China.

How to erase 100 years of carbon emissions? Plant trees-lots of them. National Geographic shows the importance of indigenous peoples as guardians of the forest.

Restoring forests as a means to many ends The commentary in Science on the Bastin study revolves around the question of how sustainable reforestation can be designed with multiple benefits beyond mere carbon storage.

Tree planting ‘has mind-blowing potential’ to tackle climate crisis Guardian

The International Meeting on Statistical Climatology

Filed under: — rasmus @ 6 July 2019

“The weather forecast looks sunny and particularly hot from Sunday to Friday, with afternoon temperatures above 30°C every day, and likely exceeding 35°C by the middle of the week. One consequence is that the poster sessions (Tuesday and Thursday) have been moved to the morning as they will be held outside under a marquee.”


I have never received a notification like this before a conference. And it was then followed up by a warning from the Guardian: ‘Hell is coming’: week-long heatwave begins across Europe.


The heatwave took place and was an appropriate frame for the International Meeting on Statistical Climatology (IMSC), which took place in Toulouse, France (June 24-28). France set a new record-high temperature of 45.9°C on June 28th, beating the previous record of 44.1°C from 2003 by a wide margin (1.8°C).


One of the topics of this meeting was indeed heatwaves, and one buzzword was “event attribution”. It is still difficult to say whether a single event has become more likely as a result of climate change, because of model inaccuracies when it comes to local and regional details.


Weather and climate events tend to be limited geographically and involve very local processes. Climate models, however, tend to be designed to reproduce more large-scale features, and their output is not exactly the same as the observed quantities. Hence, there is often a need for downscaling global climate model results in order to explain such events.


A popular strategy for studying attribution of events is to run two sets of simulations: ‘factual’ (with greenhouse gas forcing) and ‘counterfactual’ (without greenhouse gas forcings) runs for the past, and then compare the results. Another question is how to “frame” the event, as different definitions of an event can give different indicators.
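One common way to summarize such factual/counterfactual experiments is the probability ratio: how much more likely the event is in the forced world than in the unforced one. A schematic sketch with purely synthetic data (the distributions, threshold and numbers are illustrative stand-ins, not from any particular study):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic ensembles of summer maximum temperature (°C) --
# illustrative stand-ins for model output, not real simulations.
counterfactual = rng.normal(loc=36.0, scale=2.0, size=10_000)  # no GHG forcing
factual = rng.normal(loc=37.5, scale=2.0, size=10_000)         # with GHG forcing

# How the "event" is defined -- this is the framing choice,
# and different definitions can give different indicators.
threshold = 40.0

p0 = np.mean(counterfactual > threshold)  # event probability without forcing
p1 = np.mean(factual > threshold)         # event probability with forcing

print(f"P(event | counterfactual) = {p0:.3f}")
print(f"P(event | factual)        = {p1:.3f}")
print(f"probability ratio         = {p1 / p0:.1f}")
```

In real attribution studies the ensembles come from climate model runs with and without anthropogenic forcings, and the framing (area, season, threshold) strongly affects the resulting ratio.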


Individual heatwaves are still difficult to attribute to global warming, because soil moisture may be affected by irrigation, whereas land surface changes and pollution (aerosols) can also shift the temperature. These factors are tricky to model and thus affect the precision of the analysis.


Nevertheless, there is little doubt that the emerging pattern of more extremes that we see is a result of the ongoing global warming. Indeed, the results presented at the IMSC provide further support for the link between climate change and extremes (see previous post absence of evidence).


I braved the heat inside the marquee to have a look at the IMSC posters. Several of them presented work on seasonal and decadal forecasting, so both seasonal and decadal prediction still seem to be hot topics within the research community.


A major hurdle facing decadal predictions is to design climate models and give them good enough information so that they are able to predict how temperature and circulation evolve (see past post on decadal predictions). It is hard enough to predict the global mean temperature (link), but regional scales are even more challenging. One question addressed by the posters was whether advanced statistical methods improve the skill when applied to model output.


A wide range of topics was discussed during the IMSC. For instance, how the rate of new record-breaking events (link) can reveal trends in extreme statistics. There was one talk about ocean wave heights and how wave heights are likely to increase as sea-ice retreats. I also learned how severe thunderstorms in the US may be affected by ENSO and climate change.
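The record-rate idea rests on a standard result from record statistics: in a stationary series of independent values, the chance that year n sets a new record is 1/n, so records should become rare over time. A minimal sketch of that baseline:

```python
# For a stationary series of independent values, the probability that
# year n sets a new record is 1/n, so the expected number of records
# in N years is the harmonic sum 1 + 1/2 + ... + 1/N.
def expected_records(n_years: int) -> float:
    return sum(1.0 / k for k in range(1, n_years + 1))

print(f"expected in 100 stationary years: {expected_records(100):.1f} records")
# A clear excess of new heat records over this baseline is a
# fingerprint of a warming trend.
```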


Another interesting observation was that so-called “emergent constraints” (and the Cox et al, (2018) paper) are still debated, in addition to methods for separating internal variability from forced climate change. And there is ongoing work on the reconstruction of temperature over the whole globe, making use of all available information and the best statistical methods.


It is probably not so surprising that the data sample from the Argo floats shows an ongoing warming trend. However, by filling in the spaces between the floats with temperature estimates, the picture becomes less noisy. It seems that a better geographical representation removes a bias that otherwise leads to an underestimated warming trend.

While most talks were based on statistics, there was one that was mostly physics-based, on the transition between weather regimes. Other topics included bias-adjustment (multi-variate), studies of compound events (straining the emergency services), the connection between drought and crop yields, how extreme weather affects health, snow avalanches, precipitation from tropical cyclones, uncertainties, downscaling based on texture analysis, and weather generators. To cover all of these would take more space than I think is appropriate for a blog like this.


One important issue, which merits wider attention, was data sharing. The lack of open and free data is still a problem, especially if we want to tackle the World Climate Research Programme’s grand challenges. European and US data are freely available, and the Israeli experience indicates that open access is beneficial.

Unforced variations: July 2019

Filed under: — group @ 2 July 2019

This month’s open thread for climate science discussions.

Absence and Evidence

Guest commentary by Michael Tobis, a retired climate scientist. He is a software developer and science writer living in Ottawa, Ontario.

A recent opinion piece by economist Ross McKitrick in the Financial Post, which attracted considerable attention in Canada, carried the provocative headline “This scientist proved climate change isn’t causing extreme weather – so politicians attacked”.

In fact, the scientist referenced in the headline, Roger Pielke Jr., proved no such thing. He examined some data, but he did not find compelling evidence regarding whether or not human influence is causing or influencing extreme events.

Should such a commonplace failure be broadly promoted as a decisive result that merits public interest?


Koonin’s case for yet another review of climate science

We watch long YouTube videos so you don’t have to.

In the seemingly endless deliberations on whether there should be a ‘red team’ exercise to review various climate science reports, Scott Waldman reported last week that the original architect of the idea, Steve Koonin, had given a talk touching on the topic at Purdue University in Indiana last month. Since the talk is online, I thought it might be worth a viewing.

[Spoiler alert. It wasn’t].


Unforced Variations vs Forced Responses?

Guest commentary by Karsten Haustein, U. Oxford, and Peter Jacobs (George Mason University).

One of the perennial issues in climate research is how big a role internal climate variability plays on decadal to longer timescales. A large role would increase the uncertainty on the attribution of recent trends to human causes, while a small role would tighten that attribution. There have been a number of attempts to quantify this over the years, and we have just published a new study (Haustein et al, 2019) in the Journal of Climate addressing this question.

Using a simplified climate model, we find that we can reproduce temperature observations since 1850 and proxy-data since 1500 with high accuracy. Our results suggest that multidecadal ocean oscillations are only a minor contributing factor in the global mean surface temperature evolution (GMST) over that time. The basic results were covered in excellent articles in CarbonBrief and Science Magazine, but this post will try and go a little deeper into what we found.



  1. K. Haustein, F.E. Otto, V. Venema, P. Jacobs, K. Cowtan, Z. Hausfather, R.G. Way, B. White, A. Subramanian, and A.P. Schurer, "A limited role for unforced internal variability in 20th century warming.", Journal of Climate, 2019.

Unforced Variations: June 2019

Filed under: — group @ 3 June 2019

This month’s open thread for climate science discussions. Remember discussion about climate solutions can be found here.

Forced responses: May 2019

Filed under: — group @ 2 May 2019

A bimonthly open thread on climate solutions and policies. If you want to discuss climate science, please use the Unforced Variations thread instead.

Unforced variations: May 2019

Filed under: — group @ 2 May 2019

This month’s open thread about climate science topics. For discussions about solutions and policy, please use the Forced Responses open thread.

Nenana Ice Classic 2019

Filed under: — gavin @ 14 April 2019


Perhaps unsurprisingly given the exceptional (relative) warmth in Alaska last month and in February, the record for the Nenana Ice Classic was shattered this year.

The previous official record was associated with the exceptional conditions in El Niño-affected winter of 1939-1940, when the ice went out on April 20th 1940. Though since 1940 was a leap year, that was actually a little later (relative to the vernal equinox) than the ice out date in 1998 (which wasn’t a leap year). 

Other records are also tumbling in the region, for instance the ice-out date at Bethel, Alaska:



While the trend at Nenana since 1908 has been towards earlier ice-out dates (by about 7 days per century on average), the interannual variability is high. This is consistent with the winter warming in this region over that period of about 2.5ºC. Recent winters have come close (2012/14/15/16, 3 to 4 days past the record), but this year’s April 14th date is an impressive jump (and with no leap year to help calendrically).

As usual, I plot both the raw date data and the version adjusted relative to the vernal equinox (the official time of breakup was ~12:21am).

  [As usual, I predict that there will be no interest from our favorite contrarians in this]