This month’s open thread. Try to keep it at least vaguely focused on climate science…!
> [Response: I’ve extended it to the last 10 responses… perhaps that will help. – gavin]
Thanks! I too look at the annotated comments as a heuristic for finding the interesting discussions.
New post on permafrost at ClimateSight (that woman is a marvel). I include an extract as it mentions a much debated corner of emissions and consequences:
This paper went in my mental “oh shit” folder, because it made me realize that we are starting to lose control over the climate system. No matter what path we follow – even if we manage slightly negative emissions, i.e. artificially removing CO2 from the atmosphere – this model suggests we’ve got an extra 0.25°C in the pipeline due to permafrost. It doesn’t sound like much, but add that to the 0.8°C we’ve already seen, and take technological inertia into account (it’s simply not feasible to stop all emissions overnight), and we’re coming perilously close to the big nonlinearity (i.e. tipping point) that many argue is between 1.5 and 2°C. Take political inertia into account (most governments are nowhere near even creating a plan to reduce emissions), and we’ve long passed it.
Any comments on this?
Later this week, the Royal Society is hosting a Workshop on Handling Uncertainty in Weather and Climate Prediction, With Application to Health, Agronomy, Hydrology, Energy and Economics.
10.10 Climate models: fit for what purpose?
Professor Judy Curry, Georgia Tech, Atlanta, USA
Climate models are being used to support emissions reduction policies and as the basis for projection of future regional climate variation for use in model-based decision support systems. Largely motivated by these applications, priorities for climate model development are focused on increasing resolution and adding complexity in the context of fully interactive earth system models.
Arguments are presented that there is misguided confidence and “comfort” with the current climate models and projected developments that are not consonant with understanding and best practices from other fields. The fitness for purpose of climate models is examined in the context of different decision making applications for which climate models are being used. The decision analytic framework of reducing scientific uncertainty in support of optimal decision making strategies regarding CO2 mitigation has arguably resulted in unwarranted high confidence in future projections and relative neglect of natural climate variability and the possibility of black swans and dragon kings.
The climate change problem is characterized by high levels of uncertainty, and modeling and subjective judgments substitute extensively for estimates based upon experience with actual events and outcomes. Hence a decision analytic framework of ‘decision making under deep uncertainty’ is a much better match to the climate problem, where understanding uncertainty and areas of ignorance is critical information for the decision making process. Potential applications of climate models under different decision analytic frameworks are described, motivating consideration of alternative paths for climate model development.
[Response: A combination of the obvious combined with unsourced over-generalisations and containing no constructive suggestions. What is she actually proposing that is different from what is already happening? Specifics would be more interesting to assess. (For reference, some of my opinions on what climate modelling should be doing to be more useful are here (reg. reqd) or here). – gavin]
#50–Ah, yes, McI–I remember him. Sounds so plausible, if you read fast and don’t know the context.
Gave him up, as I never found that he actually advanced my understanding–of anything.
Judy Curry (shorter): Uncertainty. Ooga-booga. Ooga-booga.
I always marvel that those counseling complacency somehow think uncertainty favors their position. Uncertainty cuts both ways, and the blade on the worst-case side is a whole lot sharper than on the best-case side.
Uncertainty is nobody’s friend.
“What is she actually proposing that is different from what is already happening?”
Well, she says that climate science suffers from “…relative neglect of natural climate variability…”
So obviously you climate modeling people need to quit neglecting natural climate variability because obviously, if you were to do so, you’d find that climate sensitivity to a doubling of CO2 is only about 1C. Then we could comfortably decide to kick the can down the road for at least another couple of generations.
I think her point is bleedin’ obvious. :)
And, remember … uncertainty only cuts one way! Anything that’s uncertain drops the CO2 sensitivity range further below the current model estimates! Always lower, never higher …
Hopefully there will be some credible scientists at the workshop. What was the organizer thinking?
Yesterday’s sea ice extent was 4166563 sq km http://www.ijis.iarc.uaf.edu/en/home/seaice_extent_prev.htm
It has been recovering from its mid-September minimum. Back then I estimated what kind of recovery rate would be needed if the minimum were broad. http://www.realclimate.org/index.php/archives/2012/08/an-update-on-the-arctic-sea-ice/comment-page-7/#comment-250169
Things are a little different than that, so here is an updated calculation: If the recovery were to immediately steepen to arrive at the average extent for the previous 10 years on Oct 28, when the tracks converge, it would be at a rate 1.17 times slower than the fastest 7-day recovery rate in the previous ten years. If the recovery that happens next week is the same as in the last seven days, followed by 19 days of rapid recovery to converge, the steep recovery rate would have to be 1.09 times faster than the fastest prior 7-day rate. So we are either flirting with a need for a record-fast recovery rate, or convergence with prior tracks may be delayed into November. We might expect the first case, since there is more open ocean in which recovery can occur, in which case the energy release rate from the latent heat is interesting. The second case is also interesting, since it might herald a bunch of Edmund Fitzgerald-type November wrecks as people try to take advantage of a pattern of increasingly delayed ice recovery for shipping. Surely, the skies of November will remain gloomy on that lake….
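The back-of-envelope comparison above is easy to reproduce. Here is a minimal sketch; every number in it is a placeholder I made up for demonstration, not the actual IJIS extent data or the real 10-year climatology, so only the arithmetic is meaningful:

```python
# Sketch of the recovery-rate comparison: how fast must extent grow to
# reach the prior-decade average by a given date, and how does that
# compare to the fastest prior 7-day recovery rate?
# All values below are illustrative assumptions, not real data.

def required_rate(current_extent, target_extent, days):
    """Mean daily gain (sq km/day) needed to reach target_extent in `days` days."""
    return (target_extent - current_extent) / days

current = 4.17e6          # assumed extent now, sq km
avg_oct28 = 8.0e6         # assumed 10-year average extent on Oct 28, sq km
days_to_converge = 26     # assumed days remaining until convergence

needed = required_rate(current, avg_oct28, days_to_converge)
fastest_prior = 1.3e5     # assumed fastest prior 7-day mean daily gain, sq km/day

print(f"needed: {needed:,.0f} sq km/day, ratio: {needed / fastest_prior:.2f}x")
```

With real extent numbers plugged in, the printed ratio is exactly the "1.17 times slower / 1.09 times faster" kind of figure quoted above.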
Susan @ 52 and others who might be interested…SkS has a discussion on permafrost carbon feedback as well: http://www.skepticalscience.com/Macdougall.html
Re #43 Chris,
I think we will see storms driven by the temperature differential between the ice and the open water that will cause substantial ice loss. In addition, we are seeing latent heat imported from the south that adds to ice loss. Bottom line, I expect the Arctic to be substantially ice free in 2013.
Joining in the thanks for the responses from Gavin et al., and the expanded version.
The description of Curry’s talk “Climate Models: Fit For What Purpose?” pasted @53 is as clear as mud. So is a note-like account that Curry blogged yesterday here.
To get a better understanding, a Curry post from March about her DOE BERAC talk appears to cover the ground & includes such juicy quotes as “It is not at all clear that GCMs will be able to generate counterintuitive, unexpected surprises. The current GCM’s have become ‘too stiff’,” this in the context of dragon-king-ridden black swans potentially fluttering themselves overhead.
[Response: This is unfalsifiable. Whatever the models produce, one can always claim that they should be capable of producing something more. When it comes to actual tests of the models against the real world, the only example I can think of is that the rapidity of the re-starting of NADW after a collapse has yet to be reproduced in a standard coupled GCM. This isn’t however a statement about all kinds of GCM behaviour. – gavin]
Thanks Gavin. Yeah, I didn’t see a way to hack a URL together in the code, but expanding the list helps and seems to have made a few of us happy at least. The extra posts in the sidebar definitely help, appreciate it.
Susan Anderson #52,
” It doesn’t sound like much, but add that to the 0.8°C we’ve already seen, and take technological inertia into account (it’s simply not feasible to stop all emissions overnight), and we’re coming perilously close to the big nonlinearity (i.e. tipping point) that many argue is between 1.5 and 2°C”
I have argued on other threads […. edit – yes. no need to repeat yourself in every thread]
Were I a cynic, I’d say that Professor Curry treats every real world problem she sees as similar to an academic departmental fracas, i.e. apply a thick coating of portentous words and delay, delay, delay.
On “Uncertainty is not your friend” I’ve always liked this one:
“Uncertainty means things could be worse than anticipated”
Our discussion on culpability is core to the response to climate change. “We need to reduce GHGs” inherently results in a discussion of who reduces GHGs and why. Your initial stance is that current emissions are a hard-line base-line. If you own the largest stretch limo, then you are a saint if you buy a slightly smaller stretch limo or one with expensive tech that reduces its emissions a tad, while if you are a guy on a bicycle, buying a scooter makes you satanic. I found that stance immoral and illogical.
And yes, China is a nation of folks migrating towards mere poverty. Yep, there’s some Rich Dudes with 100,000 serfs supplying them the funds to own similar stretch limos as are owned by the owners of the west, but the average Chinese dude either lives on a postage-stamp farm and rides a bicycle, or sleeps in a windowless(?) dorm while recuperating from the effort of supplying us with iphones et al. China’s carbon emissions mostly belong to us. China’s industries emit CO2, but the beneficiaries of those industries – well, are they you and I who watch the big TV, or the exhausted serf who survives on rice in a dorm?
The issue is HUGE. We have the TVs, yet the average dude of any country by definition (if you believe in the words of the Founding Fathers) is just the average dude, and so can claim no more than the average dude of another country.
I appreciate your efforts to quantify, but I think the target is more nebulous and fleeting than that which deserves your attention. MPGe is opinion, not fact. We’ll spew funds towards biofuels and electrons, but nobody knows which will be the better investment.
Apropos China, [edit – how is this even vaguely related to climate science? This is not a general purpose forum for discussing anything you want.]
With all this subtle dissing of bicycles, I wonder even more how we can turn bike riding into a normal, much less high-status, activity. (I’m a bicycle mechanic and that is the norm in my circle of friends.)
“China’s carbon emissions mostly belong to us.” If that is your position, it would seem carbon tariffs are the only approach we have to cutting our emissions, given that we have the entire Pacific Northwest for emissions-free manufacturing, whence our supply of manufactured goods might shift. We don’t have sovereignty over China, so I make no claim to their emissions myself anymore. I used to think as you do, but then realized each country must implement its own emissions policy.
I support carbon tariffs to pay for adaptation. I think we need cooperation for mitigation, not unilateral reprisals. Your position, claiming responsibility for the actions of others, means you must dictate mitigation or forgo it, I think.
My edited post regarded Peter Calthorpe’s recent “Foreign Policy” article on the astonishingly rapid past and projected growth of automobile infrastructure in China, and the article’s bizarre omission of any climate change context, despite Calthorpe being the author of “Urbanism in the Age of Climate Change.” I would like to see some discussion of the projections referenced in this article, because they seem to preclude all but the most drastic climate change scenarios. The post is relevant to a number of ongoing topics on this thread, including 1) passenger vehicles, 2) psychological resistance to paradigm shift, and 3) gross underestimation of the rate and impacts of climate change.
CL at #45 wrote: “As ice volume decreases, the fraction of volume which is new ice increases, and hence the year to year variability in new ice becomes a larger fraction of the total ice volume variability, so I don’t think the smoothed downward slope will stay as smooth, i.e. you should expect bigger surprises to the upside on a given winter if it is cold and has heavy snow fall.”
Well put. But of course this is only one element of a complex dynamic. I certainly think there will be some kind of a ‘tail’–if nothing else, continuing (and probably accelerating) calving from GIS will continue to dump large chunks of ice into the Arctic, though these may get transported rather quickly through the Fram Strait.
But wider expanses of open waters in the summer mean vast areas where winds can whip up large waves and (as we have seen this year) where huge cyclonic storms can be generated. These will likely act to mix the previously strongly stratified waters of the Arctic, so that the warmer saltier layer below will get stirred into the fresher surface layer, the salt and warmth hindering ice formation. And of course there are other dynamics to account for–too many to go into here.
It is devilishly difficult to figure out how exactly these dynamics will develop and which will predominate and which will feed off which. That is, on the one hand, why people try to develop models, and on the other hand, why models often fail (and why I can’t cast much blame on those who have failed to adequately model such a complex system).
Just one more point here–you rightly mention the dominance of new ice as a portion of the total ice. But that new, thin, salty ice will also melt all that much more easily, so I would think that, when it comes to measuring minimums, we could continue to see precipitous drops for the next few years, even as extent and area continue to recover fairly well for the maximums for a while. (And I still can’t see how a newly open and increasingly warm summer Arctic Ocean won’t produce more water vapor, vapor whose GHG properties will further accelerate Arctic warming–or is that completely offset by increased cloud formation??)
Anyway, thanks for your response to what seems to be the pressing issue to be discussed here.
“ It is not at all clear that GCMs will be able to generate counterintuitive, unexpected surprises. The current GCM’s have become ‘too stiff’,”
That’s just bizarre. I suppose we shouldn’t trust the complex models built by, say, Boeing to guide 787 development because they’re incapable of generating counterintuitive, unexpected surprises like, oh, cutting back thrust suddenly causing the aircraft to launch itself into near earth orbit. Or whatever.
Personally, I think it’s great that physical models are bounded by physics …
Judith makes this statement:
The current focus on the precautionary principle and optimal decision making is driving climate model development & applications in directions for which they are not fit.
Gavin, you can certainly answer true or false as to the accuracy of this claim regarding GISS Model E, at least.
What role does the focus on the precautionary principle have in the driving of GISS Model E development? And “optimal decision making”?
[Response: That’s a softball question if there ever was one. ;-) Easy, I can safely say that the precautionary principle has not played any role in GISS ModelE development. None. Nada. Zilch. However, I don’t know what is wrong with making decisions optimally. The work we’ve done on that – for instance as seen in the Shindell et al, 2012 paper – has used developments in the model that were done for pure science interest. – gavin]
What up with raw data?
There is an interesting letter (in oz) going around regarding open data and data access etc.
Here it is:
I’m just wondering where you draw the line? I think for most science experiments the requirement to supply ‘all the raw data’ is, quite frankly, ridiculous. The raw data collection itself should be done again, as it is not immune to experimental error. All that should be required is the methodology, so it can be reproduced.
However, I’m kinda torn here, because in many cases (especially with climate science) the ability to ‘reproduce’ something is constrained not by the basic scientific skill required to repeat the method, but by the finance or resources required (launching a satellite?) to get the raw record of data in the first place.
I can understand some may be a bit precious about their hard work and data collection, especially since ‘raw data’ is hard to define, but I think if it is taxpayer dollars at work, and the cost to collect the raw data is over a million dollars, then it should be made available, especially in the specific case of ‘data sets’ (such as proxies) where the primary focus of the methodology is ‘data collection and processing’.
What is RC position (if you have one)?
[Response: We have discussed what we look for in replicability previously. With respect to this letter, I’m not really sure what specifically they are asking for – sounds nice though. – gavin]
That’s a softball question if there ever was one. ;-)
I know, asking it was just a tactic to draw attention to the … oddness … of her claim. :)
Someone should keep a database of Classic Curryisms …
Re carbon tariffs #70 on Chinese goods: I don’t think this is too harsh on a developing country. A carbon tariff shares the pain, because the customer in the West pays more and China sells less. When China adopts a comparable CO2 penalty scheme, the tariff can be dropped. Their protests over the EU airline tax suggest they aren’t yet serious about CO2 cuts. We’re talking about a third of world emissions, much of it on behalf of the West.
Ms Curry is a master at puffy statements that lack specificity, and she almost never talks science. What is the direction and the applications that don’t fit? Fit What?
Re: model uncertainty
Is it still the case that the current crop of models have difficulty heating the poles in paleo hindcasts as much as paleo temperature proxies suggest actually happened in previous thermal excursions ?
Ms Curry is a master at puffy statements that lack specificity, and she almost never talks science.
She’s also a master at making suggestions as to what other people should do (the focus of her talk appears to be recommendations for future development of GCMs to make them “more useful”), even though she quite clearly doesn’t understand what those other people are actually doing today …
Methane Emissions Can Be Traced Back to Roman Times
which I found rather amazing.
Dr Curry says nothing and gets all the attention. If she had brought up an ice model animation to study, she would have been awesome. Some models of global temperature have been right since the ’80s, so they are clever, brilliant and needed. Too bad I can’t say the same about sea ice models, simply because I can’t watch them animate. We do have some graphs; they fail the ultimate test of the future. It’s not fun to contemplate failure every time these graphs are brought up. It hurts the reputation of all computer models when one flunks so badly. Instead of saying “oh, Judith lectured this again”, what about displaying a sea ice model animation? Lots of guys would make insightful comments, and it would place the focus on repairing this sight for sore eyes. By the way, fake skeptics have used this failure very usefully for their purposes.
I don’t want to undermine what President Obama has achieved in getting China to agree to cuts. For the present, carbon tariffs to cover adaptation costs such as higher crop insurance premiums would be pretty mild since the big adaptation costs are not upon us yet (and attribution is really only solid for summer heatwaves so far). But, it would be good to get the pump primed now since, absent mitigation, those costs will mount.
Johnno @ 77,
What are China’s PER CAPITA CO2 emissions vis-a-vis various Western countries? I suspect a lot less, even given their rapid (and dirty) industrial development over the last few decades.
There is an article in the L. A. Times http://www.latimes.com/news/science/la-sci-humans-climate-change-20121004,0,2962982.story about a paper in October 4, 2012 Nature by Sapart et al, “Natural and anthropogenic variations in methane sources during the past two millennia” doi:10.1038/nature11461 (sorry, have not read the paper). Denier commenters are all over this article.
Some commentary would be appreciated.
1. Starting with basic combustion chemistry. Am I correct in thinking that the “pyrogenic” methane is caused by incomplete combustion?
[Response: yes. Methane is emitted by biomass burning – though it’s conceivable that increased irrigation also played a role. – gavin]
2. How significant is this paper?
[Response: not hugely. It might improve the earlier forcings a little, but it’s really a small impact on climate – probably not detectable, and certainly will not lead to any significant change in climate modelling. – gavin]
In the example you used: “Both of these papers were based on analyses of publicly available data”.
I think the point of the “letter” is that there is a proportion of raw data which has been collected by scientists who are funded by the public… and this data is not publicly available (i.e. they are keeping it to themselves, it’s a competitive world out there!).
Shouldn’t all publicly funded data eventually (2 years after the results are published?) be made public?
After all it is your conclusion:
“..the vast majority of papers that turn out to be wrong, or non-robust are because of incorrect basic assumptions, overestimates of the power of a test, some wishful thinking, or a failure to take account of other important processes..”.
Wouldn’t that mean that the publication of raw data is actually more important than the papers (and derived data sets) based on it?
You could still have a two year embargo on the raw data, with the option of time extension, because you never know, the authors could make an embarrassing mistake. Then after some time, everyone gets it.
What are your thoughts on this? I think there needs to be a monetary line, say one million dollars: any raw data collection that has chewed up over a million dollars must be made public. I don’t think cost should be an issue (although it could make it more expensive).
[Response: The problem is that the letter doesn’t discuss anything specific where the different issues could be examined. I work for NASA and all our satellite data is public domain and there is no problem either with the people who produced it publishing papers or other people using it anyway they want. This is similar for climate model output for IPCC etc. But I have no idea whether this is the kind of thing the letter writers want from ARC, or whether they want something much further along the lines of open notebooks as the science is being done, or just more complete SI or something else. All of the devils are in the details – and it varies enormously among fields. – gavin]
Re: dhogaza @ 76
See curryquotes.wordpress.com. Someone started it, but looks like they gave up (one can only guess why… :)
Apologies if these have already been discussed and I missed it:
“In Wake of Sea Ice Loss, Focus on New Models, Melt Ponds”
Could an infrared laser be used to actually observe IR photon behaviour in different GHGs and atmospheric mixes, using femto-photography? Not that I’m sure it would be of any use. Awesome TED Talk, though.
Ramesh Raskar: Imaging at a trillion frames per second
Concerning Curry & GCMs.
Intrigued by the thought of the paradox of predictable black swans and the emergence of previously invisible dragon kings, and how these may come to garnish a Royal Society presentation, I thought to look a little further at the work of Professor Curry.
Curry begins her 2010 thesis What can we learn from climate models? by saying “Short answer: I’m not sure.” This is perhaps a strange answer given she explains that her concern for the subject stretches back decades and she styles herself “a serious (climate model) monster detective.”
Curry’s 2010 thesis begins with a sweeping account of difficulties GCMs have to overcome regarding the ‘pandemonium’ that is global weather, but then conflates this serious issue with the purpose of a GCM, with ‘inadequacy’, and with model structure.
In conclusion, the thesis advocates that GCMs be used and developed uncompromisingly for “Hypothesis testing, numerical experiments, to understand how the climate system works, including its sensitivity to altered forcing,” such a policy to continue until climate model building becomes better understood. And a final poke at GCMs suggests that, philosophically, science is kidding itself about the usefulness of GCMs.
So Curry’s 2010 thesis is strong stuff but, lacking substance, is little more than a rant. The discussion of “truthiness” in her 2012 DOE BERAC talk flows from this stance, having been apparently & bizarrely coupled with rising blogosphere discomforts (at ClimateEtc. & ClimateAudit). Her shift from the 2010 ‘GCM work should be focused’ to the 2012 ‘IPCC GCM work dominates climatology. This (& some other juicy stuff) is bad.’ is now a lot more understandable.
What is absent from this writing is evidence of that “serious (climate model) monster detective” who could possibly be still embroiled in the death of the skydragon (hat tip – joe@86).
To ozajh @84, according to this, China’s per capita CO2 emissions have reached EU levels:
So they have caught up with some ‘Western’ levels, though they still aren’t up to US per capita levels.
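The per-capita comparison is just total national emissions divided by population. A minimal sketch, using round, 2011-era illustrative figures that are my own rough assumptions for demonstration rather than citable data:

```python
# Per-capita CO2 emissions = total emissions / population.
# All figures below are rough, illustrative assumptions (approx. 2011 era),
# not authoritative inventory data.

emissions_mt = {"China": 9700, "USA": 5400, "EU": 3700}   # Mt CO2/yr (assumed)
population_m = {"China": 1350, "USA": 312, "EU": 503}     # millions (assumed)

# Mt per million people conveniently cancels to tonnes per person.
per_capita = {k: emissions_mt[k] / population_m[k] for k in emissions_mt}

for country, t in sorted(per_capita.items(), key=lambda kv: -kv[1]):
    print(f"{country}: {t:.1f} t CO2/person/yr")
```

Even with crude inputs like these, the shape of the answer matches the point above: Chinese per-capita emissions land near EU levels while remaining well below US levels.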
For those of a data availability persuasion we have just released a beta of a new global land surface databank consisting of over 39,000 station records at monthly resolution. More details are available at http://surfacetemperatures.blogspot.com/2012/10/beta-release-of-first-version-of-global.html and at http://ncdc.noaa.gov/news/beta-release-global-land-surface-databank and links therefrom. Constructive criticism and suggestions are welcome via the Initiative blog.
hat tip to Metafilter for this:
… This year’s topic was “our global oceans,” and featured David Gallo (Director of special projects, Woods Hole Oceanographic Institution), Christopher Sabine (Nobel peace prize in 2007, shared with other members of IPCC), Barbara Block, William Fitzgerald, Ove Hoegh-Guldberg, Kathleen Dean Moore, Carl Safina, and Maya Tolstoy. The 48th Nobel Conference talks are all on youtube.
On Metafilter the header is: Science-for-the-masses
I wish that I were struggling to stay focused on climate science today rather than commenting on the vigorous discussion in last night’s US presidential debate about the ongoing, huge and escalating economic impacts of climate change …
Of course, there’s no need to struggle, since there was not one word on the subject spoken by Lehrer, Obama or Romney.
Not. One. Single. Word.
re: 81 85 anthropogenic methane
1) See Special issue of The Holocene on Early Anthropogenic Hypothesis. Read abstracts of Fuller, et al, Kaplan, et al, Nevle, et al, and Ruddiman, et al.
2) See also Multidecadal variability of atmospheric methane, 1000–1800 C.E. by Mitchell, et al
‘We present a new high-precision, high-resolution record of atmospheric methane from the West Antarctic Ice Sheet (WAIS) Divide ice core covering 1000–1800 C.E., a time period known as the late preindustrial Holocene (LPIH). The results are consistent with previous measurements from the Law Dome ice core, the only other high-resolution record of methane for this time period, and confirm most of the observed variability. Multidecadal variability in methane concentrations throughout the LPIH is weakly correlated or uncorrelated with reconstructions of temperature and precipitation from a variety of geographic regions. Correlations with temperature are dominated by changes in Northern Hemisphere high latitude temperatures between 1400 and 1600 C.E. during the onset of the Little Ice Age. Times of war and plague when large population losses could have reduced anthropogenic emissions are coincident with short periods of decreasing global methane concentrations.’
3) Put the new Sapart et al paper in Nature together with the above.
4) I’d claim this combination increasingly supports Bill Ruddiman’s (related) hypotheses:
a) Early anthropogenic effects from land-use and animal husbandry, starting thousands of years ago: long-term, relatively smooth additions of anthropogenic CO2/CH4. Land-use assumptions matter, and the evidence seems to be accumulating that earlier agriculture used more land/capita; if so, the numbers work OK.
b) CO2(CH4) sharp jiggles in last millennium especially, caused by plagues (and maybe wars with CH4), after population was large enough for such to have effects.
The sharp drop of CH4 shown in Mitchell et al and Sapart et al (fig. 3) into 1600 AD (before the Maunder Minimum), fairly well aligned with the CO2 drop, comprises the most obvious example. I think Nevle et al make a good case that this was mostly an effect of a 50M-person die-off in the Americas and consequent reforestation.
5) I think all this matters because it helps calibrate natural variability versus anthropogenic, i.e., it bears on the general attribution problem. The 1600AD event seems fast enough to induce short-term feedbacks without much influence from longer ones. As a flow rather than a stock, CH4 offers some higher-frequency info.
All this has a heavily multidisciplinary aspect, with studies of different kinds of charcoal, rice-paddy archaeology, climate reconstructions, etc. Relevant references are scattered all over the place.
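The flow-versus-stock point in 5) can be illustrated with a one-box model: because CH4 has an atmospheric lifetime of roughly a decade, its concentration relaxes toward equilibrium with emissions within a few decades, so an event like the ~1600 AD die-off shows up quickly. This sketch uses a made-up emissions scenario and round numbers purely for illustration:

```python
# One-box illustration of why CH4 tracks emission changes quickly:
#   dC/dt = E - C/tau,  with lifetime tau ~ 9 years (approximate).
# Units and the emissions scenario are illustrative, not a reconstruction.

def simulate(emissions, tau=9.0, dt=1.0):
    """Euler-step the one-box model, starting in equilibrium with emissions[0].
    Returns the concentration series (same length as `emissions`)."""
    c = emissions[0] * tau  # equilibrium: C = E * tau
    out = []
    for e in emissions:
        c += dt * (e - c / tau)
        out.append(c)
    return out

# A 20% emissions drop (e.g. a die-off event) lasting 30 years, then recovery:
E = [100.0] * 20 + [80.0] * 30 + [100.0] * 50
C = simulate(E)
# Concentration falls most of the way to the new equilibrium (80 * 9 = 720)
# within a couple of decades, then recovers just as fast: high-frequency info.
```

A CO2-like gas (effective adjustment times of centuries) run through the same model would barely respond over 30 years, which is why the CH4 record carries the sharper, higher-frequency attribution signal.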
I’d appreciate the RC resident sea level rise consultant(s)’ thoughts on ‘Millennial total sea-level commitments projected with the Earth system model of intermediate complexity’.
The coverage I’ve seen to date seems to miss that the long term projections are based on ‘stabilising’ the atmosphere post-2100, and as I read it, stabilisation means more-or-less ceasing emissions by then.
Caerbannog, Eli will be in touch about exporting this software package to Byte columnist and ur-blogger Jerry Pournelle of Chaos Manor, in hope of enlarging some actual skepticism about Watts’ claims among his readers.
Re: 70 & 77 The last thing I heard was that China had proposed carbon tariff trials commencing in 2014. Albeit I’ve heard nothing since Australia’s tariff debate went quiet (due to the lack of falling skies when it was introduced?).
John, this is fascinating, seeing the very widely scattered work make sense seen together. Do you know if there are any other ice caps potentially usable for long-term cores? Are e.g. Lonnie Thompson’s saved-up cores capturing the methane signal? (I know the core samples were hauled out of remote sites in various ways, from taking out separate vials of meltwater on donkeys, I think, to flying out permanently frozen cylinders in portable freezers.)
Wondering if there are potential targets that need to be cored with newer techniques to capture any info before it goes away.
(Reminds me of what they say about N. California — that the ecologists and taxonomists mostly have worked just ahead of the bulldozers for decades now, so they did the coastline, then the Sierra, but skipped the middle of California where there’s now urgency about seeing what’s there before it’s gone)
> > [Response: I’ve extended it to the last 10 responses…
“Please sir, may I have a little [More]?”