This month’s open thread for climate science topics. Note that discussions about mitigation and/or adaptation should be on the Forced Responses thread.

Let’s try and avoid a Groundhog Day scenario in the comments!

A new handbook on science communication came out from the IPCC this week. Nominally it's for climate-science-related communications, but it has wider application as well. It arose mainly out of an "Expert Meeting on Communication" that the IPCC held in 2016.

6 principles to help IPCC scientists better communicate their work - YouTube

There was a Guardian article on it as well.

The six principles are pretty straightforward:

  1. Be a confident communicator
  2. Talk about the real world, not abstract ideas
  3. Connect with what matters to your audience
  4. Tell a human story
  5. Lead with what you know
  6. Use the most effective visual communication

Each is supported with references to the relevant literature and with climate-related (“real world”) examples that are themselves confidently communicated with effective visuals.

But what do people think? Is this a useful addition to the literature on communication? Is there anything you think doesn't work, or that perhaps surprises you?

PS. I’m perhaps a little biased because they use a Peter Essick photo for their cover art that was also in my book.

The basic facts about the global increase of CO2 in our atmosphere are clear and established beyond reasonable doubt. Nevertheless, I’ve recently seen some of the old myths peddled by “climate skeptics” pop up again. Are the forests responsible for the CO2 increase? Or volcanoes? Or perhaps the oceans?

Let’s start with a brief overview of the most important data and facts about the increase in the carbon dioxide concentration in the atmosphere:

  1. Since the beginning of industrialization, the CO2 concentration has risen from 280 ppm (the value of the previous millennia of the Holocene) to now 405 ppm.
  2. This increase by 45 percent (or 125 ppm) is completely caused by humans.
  3. The CO2 concentration is thus now already higher than it has been for several million years.
  4. The additional 125 ppm of CO2 have a heating effect of 2 watts per square meter of Earth's surface, due to the well-known greenhouse effect – enough to raise the global temperature by around 1 °C to date (see the quick check after this list).
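For readers who want to check points 2 and 4 themselves, here is a minimal sketch in Python. It uses the widely cited simplified forcing expression ΔF = 5.35·ln(C/C0) W/m², which is a standard approximation from the literature rather than something given in this post:

    import math

    c0 = 280.0   # pre-industrial CO2 concentration (ppm)
    c  = 405.0   # present-day CO2 concentration (ppm)

    increase_ppm = c - c0                      # 125 ppm
    increase_pct = 100.0 * increase_ppm / c0   # ~45 percent

    # Widely used simplified expression for CO2 radiative forcing: dF = 5.35 * ln(C/C0) W/m^2
    forcing = 5.35 * math.log(c / c0)          # ~2 W per square meter

    print(f"Increase: {increase_ppm:.0f} ppm ({increase_pct:.0f}%)")
    print(f"Radiative forcing: {forcing:.1f} W/m^2")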

Fig. 1 Perhaps the most important scientific measurement series of the 20th century: the CO2 concentration of the atmosphere, measured at Mauna Loa in Hawaii. Other stations of the global CO2 measurement network show almost exactly the same values; the most important regional difference is the greatly subdued seasonal cycle at stations in the southern hemisphere. This seasonal variation is mainly due to the "inhaling and exhaling" of the forests on the land masses of the northern hemisphere over the course of the year. Source (updated daily): Scripps Institution of Oceanography.

Fig. 2 The CO2 concentration of the atmosphere during the Holocene, measured in the ice cores from Antarctica until 1958, afterwards Mauna Loa. Source: Scripps Institution of Oceanography.

These facts are well known and easy to understand. Nevertheless, I am frequently confronted with attempts to play down the dangerous CO2-increase, e.g. recently in the right-leaning German newspaper Die Welt.

Are the forests to blame?

Die Welt presented a common number-trick by climate deniers (readers can probably point to some English-language examples):

In fact, carbon dioxide, which is blamed for climate warming, has only a volume share of 0.04 percent in the atmosphere. And of these 0.04 percent CO2, 95 percent come from natural sources, such as volcanoes or decomposition processes in nature. The human CO2 content in the air is thus only 0.0016 percent.

The claim “95 percent from natural sources” and the “0.0016 percent” are simply wrong (neither does the arithmetic add up – how would 5% of 0.04 be 0.0016?). These (and similar – sometimes you read 97% from natural sources) numbers have been making the rounds in climate denier circles for many years (and have repeatedly been rebutted by scientists). They present a simple mix-up of turnover and profit, in economic terms. The land ecosystems have, of course, a high turnover of carbon, but (unlike humans) do not add any net CO2 to the atmosphere. Any biomass which decomposes must first have grown – the CO2 released during rotting was first taken from the atmosphere by photosynthesis. This is a cycle. Hey, perhaps that’s why it’s called the carbon cycle!

That is why one way to reduce emissions is the use of bioenergy, such as heating with wood (at least when it’s done in a sustainable manner – many mistakes can be made with bioenergy). Forests only increase the amount of CO2 in the air when they are felled, burnt or die. This is immediately understood by looking at a schematic of the carbon cycle, Fig. 3.

Fig. 3 Scheme of the global carbon cycle. Values for the carbon stocks are given in Gt C (i.e., billions of tonnes of carbon) (bold numbers). Values for average carbon fluxes are given in Gt C per year (normal numbers). Source: WBGU 2006. (A similar graph can also be found at Wikipedia.) Since this graph was prepared, anthropogenic emissions and the atmospheric CO2 content have increased further, see Figs 4 and 5, but I like the simplicity of this graph.

If one takes as the total emissions a "natural" part (60 GtC from soils + 60 GtC from land plants) plus the 7 GtC of fossil emissions as the anthropogenic part, the anthropogenic portion is about 5% (7 of 127 billion tons of carbon), as cited in the Welt article. This percentage is highly misleading, however, since it ignores the fact that the land biosphere not only releases 120 GtC but also absorbs 122 GtC by photosynthesis, which means that a net 2 GtC is removed from the atmosphere. Likewise, the ocean removes around 2 GtC. To make any sense, the net emissions by humans have to be compared with the net uptake by oceans, forests and atmosphere; comparing them with the turnover rate of a cycle is an irrelevant comparison. And not just irrelevant – it becomes plain wrong when that 5% number is then misunderstood as the human contribution to the atmospheric CO2 concentration.

The natural Earth system is thus by no means a source of CO2 for the atmosphere – it is a sink! Of the 7 GtC that we blow into the atmosphere every year, only 3 remain there; 2 are absorbed by the ocean and 2 by the forests. This means that the amount of stored carbon is increasing in the atmosphere, in the land biosphere and in the ocean. And the source of all this additional carbon is the fact that we extract loads of fossil carbon from the Earth's crust and add it to the system. That is already clear from the fact that we add twice as much to the atmosphere as is needed to explain the full increase there – which makes it obvious that the natural Earth system cannot possibly be adding more CO2, but rather is continually removing about half of our CO2 emissions from the atmosphere.
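A minimal back-of-the-envelope sketch of this budget may help. The land fluxes are the rounded numbers from Fig. 3, while the gross ocean-exchange values are merely illustrative round numbers chosen so that the net ocean sink matches the ~2 GtC per year mentioned above:

    fossil_emissions = 7.0     # GtC/yr from fossil carbon (Fig. 3)
    land_release     = 120.0   # gross release by soils and plants (Fig. 3)
    land_uptake      = 122.0   # gross uptake by photosynthesis (Fig. 3)
    ocean_release    = 90.0    # gross outgassing by the ocean (illustrative round number)
    ocean_uptake     = 92.0    # gross uptake by the ocean (illustrative round number)

    net_land  = land_release - land_uptake     # -2 GtC/yr: the land biosphere is a sink
    net_ocean = ocean_release - ocean_uptake   # -2 GtC/yr: the ocean is a sink

    atmospheric_growth = fossil_emissions + net_land + net_ocean
    print(f"Net land flux:  {net_land:+.0f} GtC/yr")
    print(f"Net ocean flux: {net_ocean:+.0f} GtC/yr")
    print(f"CO2 remaining in the atmosphere: {atmospheric_growth:.0f} GtC/yr")
    # Only the net fluxes matter: about 3 of the 7 GtC stay in the air; the gross 120 GtC
    # turnover of the land biosphere says nothing about who is adding CO2 to the system.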

The system was almost exactly in equilibrium before humans intervened. That is why the CO2 concentration in the air was almost constant for several thousand years (Figure 2). This means that the land ecosystems took up 120 GtC and returned 120 GtC (the exact numbers don't matter here, what matters is that they are the same). The increased uptake of CO2 by forests and oceans of about 2 GtC per year each is already a result of the human emissions, which have added enormous amounts of CO2 to the system. The ocean has started to take up net CO2 from the atmosphere through gas exchange at the sea surface: because the CO2 concentration in the atmosphere is now higher than in the surface ocean, there is a net flux of CO2 into the sea. And because trees take up CO2 by photosynthesis and can do this more easily if you offer them more CO2 in the air, they have started to photosynthesize more and thus take up a bit more CO2 than is released by decomposing old biomass. (To what extent and for how long the land biosphere will remain a carbon sink is open to debate, however: this will depend on the extent to which the global ecosystems come under stress from global warming, e.g. through increasing drought and wildfires.)

The next diagram shows (with more up-to-date and accurate numbers) the net fluxes of CO2 (this time in CO2 units, not carbon units!).

Fig. 4 CO2 budget for 2007-2016, showing the various net sources and sinks. The figures here are expressed in gigatons of CO2 and not in gigatons of carbon as in Fig. 3. The conversion factor is 44/12 (molecular weight of CO2 divided by atomic weight of carbon). Source: Global Carbon Project.
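The conversion between carbon and CO2 units mentioned in the caption is a one-liner; as a small sketch:

    def gtc_to_gtco2(gtc):
        """Convert gigatonnes of carbon to gigatonnes of CO2 (factor 44/12)."""
        return gtc * 44.0 / 12.0

    print(f"{gtc_to_gtco2(7.0):.0f}")    # the ~7 GtC/yr of Fig. 3 is roughly 26 Gt CO2/yr
    print(f"{gtc_to_gtco2(12.0):.0f}")   # the ~12 GtC emitted in 2016 is roughly 44 Gt CO2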

Fig. 5 shows where the CO2 comes from (in the upper half you see the sources – fossil carbon and deforestation) and where it ends up (in the lower half you see the sinks) over the course of time. It ends up in roughly comparable parts in the air, the oceans and the forests. The share absorbed by the land ecosystems varies greatly from year to year, depending, for example, on whether there were widespread droughts or whether it was a good growth year for the forests. That is why the annual CO2 increase in the atmosphere also varies greatly from year to year, and this short-term variation is not mainly caused by variations in our emissions (so a record CO2 increase in the atmosphere in an El Niño year does not mean that human emissions have surged in that year).

Fig. 5 Annual emissions of carbon from fossil sources and deforestation, and annual emissions from the biosphere, atmosphere and ocean (the latter are negative, meaning net uptake). This is again in carbon (not CO2) units; the 12 gigatons of carbon emitted in 2016 are a lot more than the 7 gigatons in the older Fig. 3. Source: Global Carbon Project.

Blaming the forests for most of the increase in atmospheric CO2 – because of decaying foliage and deadwood, as the "climate skeptics" do – is not merely wrong, it is pretty bonkers. Have leaves only started to decompose since industrialization? Media with a minimum aspiration to credibility should clearly reject such nonsense instead of spreading it further. In the case of Die Welt, one of my PIK colleagues had, in response to a query from the author, explicitly pointed out to him that the 5% human share of CO2 is misleading and that humans have caused a 45% increase. That the complete CO2 increase is anthropogenic has been known for decades. The first IPCC report, published in 1990, put it thus:

Since the industrial revolution the combustion of fossil fuels and deforestation have led to an increase of 26% in carbon dioxide concentration in the atmosphere.

In the 27 years since then, the CO2 increase caused by our emissions has gone up from 26% to 45%.

How Exxon knowingly misled the public

One fascinating question is where this false idea – that humans contribute just a tiny bit to the relentless rise in atmospheric CO2 – has come from. Have a look at this advertorial (a paid-for editorial) by ExxonMobil in the New York Times from 1997:

Fig. 6 Excerpt from the New York Times of 6 November 1997

The text to go with it read:

While most of the CO2 emitted by far is the result of natural phenomena – namely respiration and decomposition, most attention has centered on the three to four percent related to human activities – burning of fossil fuels, deforestation.

That is pretty clever and could hardly be an accident. The impression is given that human emissions are not a big deal and are only responsible for a small percentage of the CO2 increase in the atmosphere – but without explicitly saying that. In my view the authors of this piece knew that this idea is plain wrong, so they did not say it but preferred to insinuate it. A recent publication by Geoffrey Supran and Naomi Oreskes in Environmental Research Letters has systematically assessed ExxonMobil's climate change communications during 1977–2014 and found:

We conclude that ExxonMobil contributed to advancing climate science—by way of its scientists’ academic publications—but promoted doubt about it in advertorials. Given this discrepancy, we conclude that ExxonMobil misled the public.

They explain their main findings in this short video clip.

Does the CO2 come from volcanoes?

Another age-old climate skeptic myth is that the CO2 is coming from volcanoes – the first time I had to rebut this was as a young postdoc in the 1990s. The total volcanic emissions are between 0.04 and 0.07 gigatonnes of CO2 per year, compared to the anthropogenic emissions of 12 gigatonnes in 2016. Anthropogenic emissions are now well over a hundred times greater than volcanic ones. Volcanic emissions are important for the long-term CO2 changes over millions of years, but not over a few centuries.

Does the CO2 come from the ocean?

As already mentioned and shown in Figs. 4 and 5, the oceans are a net absorber of CO2; they do not release any net CO2. The resulting increase of CO2 in the upper ocean is documented and mapped in detail by countless ship surveys and is known to within a residual uncertainty of ±20%. This is, in itself, a very serious problem, because it leads to the acidification of the oceans, since CO2 forms carbonic acid in water. The observed CO2 increase in the world ocean disproves another popular #fakenews piece of the "climate skeptics": namely that the CO2 increase in the atmosphere might have been caused by the outgassing of CO2 from the ocean as a result of the warming. No serious scientist believes this.

Remember also from Figs. 4 and 5 that we emit about twice as much CO2 as is needed to explain the complete rise in the atmosphere. In case you have not connected the dots: the denier myth of the oceans as cause of the atmospheric CO2 rise most often comes in the form of “the CO2 rise lagged behind temperature rise in glacial cycles”. It is true that during ice ages the oceans took up more CO2 and that is why there was less in the atmosphere, and during the warming at the end of glacial cycles that CO2 came back out of the ocean, and this was an important amplifying feedback. But it is a fallacy to conclude that the same natural phenomenon is happening again now. As I explained above: measurements clearly prove that the modern CO2 rise has a different cause, namely our fossil fuel use. What is the same now and over past glacial cycles is not the CO2 source, but the greenhouse effect of the atmospheric CO2 changes:  without that we could not understand (or correctly simulate in our climate models) the full extent of glacial cycles.

The cyanide cocktail

A man offers you a cocktail with a little bit of cyanide at a party. You reject it indignantly, but the man assures you it is completely safe: after all, the amount of cyanide in your body after this drink would be only 0.001 percent! That could hardly be harmful! Those scientists who claim that 3 mg of cyanide per kg of body weight (i.e. 0.0003 percent) is fatal are obviously not to be trusted. Are you falling for that argument?

We hope not, and we hope you will likewise not fall for the claim that 0.0125 percent of CO2 (that's the 125 ppm increase caused by humans) can't be bad just because that number is small. Of course, the amount of CO2 in the air could also be expressed in kilograms: it is 3,200 billion tons, or 3,200,000,000,000,000 kilograms. Of this, humans are responsible for almost 1,000 billion tons. (Does that sound more harmful than 0.0125 percent?) Since the year 1870, we have in fact emitted a total of about 2,000 billion tons. As already explained, forests and oceans have removed about half of that from the atmosphere.
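If you want to reproduce these tonnages yourself, here is a minimal sketch; the total atmospheric mass of about 5.15×10^18 kg is a standard textbook value and is assumed here, not taken from the post:

    atm_mass_kg = 5.15e18   # total mass of the atmosphere (standard textbook value, assumed here)
    mol_air     = 28.97     # mean molar mass of dry air (g/mol)
    mol_co2     = 44.01     # molar mass of CO2 (g/mol)

    def co2_mass_gt(ppm):
        """Mass of CO2 (in Gt) corresponding to a volume mixing ratio given in ppm."""
        return atm_mass_kg * ppm * 1e-6 * (mol_co2 / mol_air) / 1e12

    print(f"Total CO2 in the air:      {co2_mass_gt(405.0):.0f} Gt")   # ~3200 Gt
    print(f"Human-caused CO2 increase: {co2_mass_gt(125.0):.0f} Gt")   # ~1000 Gt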

Scientists specify the concentration of individual gases in the atmosphere as volume fractions (rather than, e.g., grams per cubic meter of air) because then the numbers do not depend on temperature and pressure, which vary greatly within the atmosphere. As far as the climatic impact is concerned, however, the fraction of the total mass of the atmosphere is irrelevant, since the atmosphere consists of 99.9% nitrogen, oxygen and argon, i.e. gases which cannot absorb infrared radiation. Only molecules made of at least three atoms absorb heat radiation, so only such trace gases produce the greenhouse effect – and among these, CO2 is the second most important after water vapor. All this has been known since John Tyndall's measurements of the greenhouse effect of various gases in 1859. Tyndall wrote back then:

[T]he atmosphere admits of the entrance of the solar heat, but checks its exit; and the result is a tendency to accumulate heat at the surface of the planet.

That is still a great concise description of the greenhouse effect! Without CO2 in the air our planet would be completely frozen, no life would be possible. With CO2, we are turning one of the major control knobs of global climate.

The climate effect

So let's finally come to the climatic effect of the CO2 increase. As with cyanide, it is the effect that counts – not whether the fraction, compared to some large mass, is 10 percent or 0.01 percent. The dose effect of toxins on humans can be determined from experience with victims. The climatic impact of greenhouse gases can either be calculated on the basis of an understanding of the physical processes, or it can be determined from the experience of climate history (see my previous post). Both approaches come to the same conclusion. The climate sensitivity (global warming in equilibrium after CO2 doubling) is around 3 °C, and the expected warming to date, due to the CO2 increase so far, is around 1 °C. This corresponds quite closely to the observed global warming (Fig. 7) – for which, by the way, there is no natural explanation, and the best estimate for the anthropogenic share of global warming since 1950 is 110 percent – more on this in my previous post.
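As a rough consistency check on the ~1 °C figure, one can scale the response logarithmically with the CO2 concentration. The transient climate response of about 1.8 °C per doubling assumed in this sketch is a typical mid-range value, not a number given in the text:

    import math

    tcr = 1.8   # assumed transient climate response in deg C per CO2 doubling (typical mid-range value)
    c0, c = 280.0, 405.0

    warming_so_far = tcr * math.log(c / c0) / math.log(2.0)
    print(f"Expected transient warming from the CO2 rise so far: ~{warming_so_far:.1f} deg C")
    # roughly 1 deg C, consistent with the observed warming discussed above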

Fig. 7 Time evolution of global temperature, CO2 concentration and solar activity. Temperature and CO2 are scaled relative to each other as the physically expected CO2 effect on the climate predicts (i.e. the best estimate of the climate sensitivity). The amplitude of the solar curve is scaled as derived from the observed correlation of solar and temperature data. (Details are explained here.) This graph can be created here, and you can copy code that can be used as a widget on any website (as on my home page), where it is automatically updated every year with the latest data. Thanks to Bernd Herd, who programmed this.

Finally, here is a slick new video clip illustrating the history of CO2 emissions on the map:

A Brief History of CO2 Emissions - YouTube

Links

Physics Today: The carbon cycle in a changing climate

A recent story in the Guardian claims that new calculations reduce the uncertainty associated with global warming:

A revised calculation of how greenhouse gases drive up the planet’s temperature reduces the range of possible end-of-century outcomes by more than half, …

It was based on a study recently published in Nature (Cox et al., 2018); however, I think its conclusions are premature.

The calculations in question involved both an over-simplification and a set of assumptions which limit their precision, if applied to Earth’s real climate system.

They provide a nice idealised and theoretical description, but they should not be interpreted as an accurate reflection of the real world.

There are nevertheless some interesting concepts presented in the analysis, such as the connection between climate sensitivity and the magnitude of natural variations.

Both are related to feedback mechanisms which can amplify or dampen initial changes, such as the connection between temperature and the albedo associated with sea-ice and snow. Temperature changes are also expected to affect atmospheric water vapour concentrations, which in turn affect the temperature through an increased greenhouse effect.

However, the magnitude of natural variations is usually associated with the transient climate sensitivity, and it is not entirely clear from the calculations presented in Cox et al. (2018) how the natural variability can provide a good estimate of the equilibrium climate sensitivity, other than using the “Hasselmann model” as a framework:

(1)   \( C\,\frac{dT}{dt} = Q - \lambda T \)

where \(T\) is the global mean temperature anomaly, \(Q\) the radiative forcing, \(C\) the effective heat capacity of the climate system, and \(\lambda\) the feedback parameter.

Cox et al. assumed that the same feedback mechanisms are involved in both natural variations and a climate change due to increased CO2. This means that we should expect a high climate sensitivity if there are pronounced natural variations.

But it is not that simple, as different feedback mechanisms are associated with different time scales. Some are expected to react rapidly, but others associated with the oceans and the carbon cycle may be more sluggish. There could also be tipping points, which would imply a high climate sensitivity.

The Hasselmann model is of course a gross simplification of the real climate system, and such a crude analytical framework implies low precision when the results are transferred to the real world.

To demonstrate this lack of precision, we can make a "quick and dirty" evaluation of how well the Hasselmann model fits real data, using forcing estimates from e.g. Crowley (2000) and an ordinary linear regression model.

The regression model can be rewritten as \( Q = c_1 x_1 + c_2 x_2 + \epsilon \), where \( x_1 = dT/dt \) and \( x_2 = T \). In addition, \( c_1 = C \) and \( c_2 = \lambda \) are the regression coefficients to be estimated, and \( \epsilon \) is a constant noise term (more details in the R-script used to do this demonstration).
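The R-script mentioned above is not reproduced here, but the idea of the demonstration can be sketched along the following lines (with synthetic placeholder data standing in for the Crowley forcing and the observed temperatures):

    import numpy as np

    # Placeholder series standing in for the annual forcing Q (W/m^2) and temperature anomaly T (K);
    # in the actual demonstration Q is the Crowley (2000) forcing and T the observed global mean temperature
    rng = np.random.default_rng(0)
    n = 1000
    T = np.cumsum(rng.normal(0.0, 0.05, n))                       # toy temperature series
    Q = 1.0 * np.gradient(T) + 1.2 * T + rng.normal(0.0, 0.1, n)  # toy forcing consistent with eq. (1)

    # Rewrite the Hasselmann model C*dT/dt = Q - lambda*T as a regression: Q = C*(dT/dt) + lambda*T + noise
    x1 = np.gradient(T)    # dT/dt estimated by finite differences
    x2 = T
    X = np.column_stack([x1, x2])

    coeffs, *_ = np.linalg.lstsq(X, Q, rcond=None)
    C_hat, lam_hat = coeffs
    Q_fit = X @ coeffs     # best-fit forcing, cf. the coloured curves in Figure 1
    print(f"Estimated heat capacity C: {C_hat:.2f}, estimated feedback lambda: {lam_hat:.2f}")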

Figure 1. Test of the Hasselmann model through a regression analysis, where the coloured curves are the best-fit modelled values for Q based on the Hasselmann model and global mean temperatures (PDF).

It is clear that the model fails for the dips in the forcing connected with volcanic eruptions (Figure 1). We also see a substantial scatter in both \(C\) (some values are even negative and hence unphysical) and \(\lambda\) (Figure 2).

Figure 2. The regression coefficients. Negative values for C are unphysical and suggest that the Hasselmann model is far from perfect. The estimated error margins for C are substantial, however, and also include positive values. Blue point shows the estimates for NCEP/NCAR reanalysis. The shaded areas cover the best estimates plus/minus two standard errors (PDF).

The climate sensitivity is most closely associated with \(\lambda\), for which the mean estimate was 1.11, with a 5-95-percentile interval of 0.74-1.62.

We can use these estimates in a naive attempt to calculate the temperature response for a stable climate, with \( dT/dt = 0 \), and a doubled forcing associated with increased CO2.

It's plain mathematics. I took a doubling of the 1998 CO2-forcing of 2.43 from Crowley (2000), and used the non-zero terms in the Hasselmann model, \( T = Q/\lambda \).

The mean temperature response to a doubled CO2-forcing for the GCMs was 2.36 °C, with a 90% confidence interval of 1.5 – 3.3 °C. The estimate from the NCEP/NCAR reanalysis was 1.71 °C.

The true equilibrium climate sensitivity for the climate models used in this demonstration is in the range 2.1 – 4.4 °C, and the transient climate sensitivity is 1.2 – 2.6 °C (IPCC AR5, Table 8.2).

This demonstration suggests that the Hasselmann model underestimates the climate sensitivity, and that the over-simplified framework on which it is based precludes high precision.

Another assumption made in the calculations was that the climate forcing Q looks like white noise after the removal of the long-term trends.

This too is questionable, as there are reasons to think the ocean uptake of heat varies at different time scales and may be influenced by ENSO, the Pacific Decadal Oscillation (PDO), and the Atlantic Multi-decadal Oscillation (AMO). The solar irradiance also has an 11-year cycle component and volcanic eruptions introduce spikes in the forcing (see Figure 1).
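One simple way to probe this assumption is to detrend the forcing series and inspect its lag-1 autocorrelation; here is a sketch with a synthetic placeholder series (substitute the actual Crowley (2000) forcing to run the real check):

    import numpy as np

    def lag1_autocorr(x):
        """Lag-1 autocorrelation of a 1-D series."""
        x = np.asarray(x, dtype=float) - np.mean(x)
        return np.corrcoef(x[:-1], x[1:])[0, 1]

    # Placeholder forcing series; substitute the Crowley (2000) forcing here
    rng = np.random.default_rng(1)
    t = np.arange(500)
    Q = 0.002 * t + rng.normal(0.0, 0.2, t.size)

    # Remove the long-term (linear) trend and check whether the residual looks like white noise
    residual = Q - np.polyval(np.polyfit(t, Q, 1), t)
    print(f"Lag-1 autocorrelation of the detrended forcing: {lag1_autocorr(residual):.2f}")
    # White noise would give a value near zero; the 11-year solar cycle, volcanic spikes and
    # slow ocean modes (ENSO, PDO, AMO) would instead show up as non-zero autocorrelation.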

Cox et al.’s calculations were also based on another assumption somewhat related to different time scales for different feedback mechanisms: a constant “heat capacity” represented by C in the equation above.

The real-world “heat capacity” is probably not constant, but I would expect it to change with temperature.

Since it reflects the capacity of the climate system to absorb heat, it may be influenced by the planetary albedo (sea-ice and snow) and ice-caps, which respond to temperature changes.

It’s more likely that C is a non-linear function of temperature, and in this case, the equation describing the Hasselmann model would look like:

(2)   \( C(T)\,\frac{dT}{dt} = Q - \lambda T \)

Cox et al.'s calculations of the equilibrium climate sensitivity used a key metric which was derived from the Hasselmann model and assumed a constant C: \( \Psi = \sigma_T/\sqrt{-\ln(\alpha_{1T})} \), where \( \sigma_T \) is the standard deviation and \( \alpha_{1T} \) the one-year autocorrelation of the detrended global mean temperature. This key metric would be different if the heat capacity varied with temperature, which subsequently would affect the end-results.
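For reference, such a metric can be computed from a temperature series in a few lines. The sketch below is my reading of it (standard deviation over the square root of minus the log of the one-year autocorrelation), using simple linear detrending for illustration, so treat it as an approximation rather than the exact procedure of Cox et al.:

    import numpy as np

    def psi_metric(T):
        """Sketch of a Cox et al.-style metric: sigma_T / sqrt(-ln(alpha_1)) of a detrended series."""
        T = np.asarray(T, dtype=float)
        t = np.arange(T.size)
        T = T - np.polyval(np.polyfit(t, T, 1), t)   # simple linear detrending for illustration
        sigma = T.std()
        alpha1 = np.corrcoef(T[:-1], T[1:])[0, 1]    # one-year autocorrelation
        return sigma / np.sqrt(-np.log(alpha1))

    # Example with a toy annual-mean temperature series
    rng = np.random.default_rng(2)
    toy_T = np.cumsum(rng.normal(0.0, 0.1, 150))
    print(f"Psi = {psi_metric(toy_T):.2f}")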

I also have an issue with the confidence interval presented for the calculations, which was based on one standard deviation (±1σ). A ±1σ interval represents roughly a 66% probability, which can be illustrated with three numbers, two of them "correct" and one "wrong": there is a 1/3 chance that I pick the "wrong" number if I were to randomly pick one of the three.

To be fair, the study also stated the 90% confidence interval, but it was emphasised neither in the abstract nor in the press coverage.

One thing that was not clear was whether the analysis, which involved both observed temperatures from the HadCRUT4 dataset and global climate models, took into account the fact that the observations do not cover 100% of Earth's surface (see RC post 'Mind the Gap!').

A spatial mask would be appropriate to ensure that the climate model simulations provide data for only those regions where observations exist. Moreover, it would have to change over time, because the thermometer observations have covered a larger fraction of Earth's area with time (see Figure 3).

An increase in data coverage will affect the estimated variance and one-year autocorrelation associated with the global mean temperature, which should also influence the metric \( \Psi \).
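In practice such a mask is straightforward to apply once model output and observations are on the same grid; here is a hedged sketch in which the array names, shapes and missing-data pattern are made up purely for illustration:

    import numpy as np

    # Assumed array shapes (time, lat, lon) for model output and HadCRUT4-like observations;
    # the names and grid are invented for this illustration
    model = np.random.rand(12, 36, 72)
    obs = np.random.rand(12, 36, 72)
    obs[:, :6, :] = np.nan            # pretend some high-latitude cells are never observed
    obs[:6, 6:10, :] = np.nan         # ...and some cells only gain coverage later in time

    # Mask the model wherever (and whenever) the observations have no data,
    # so both fields sample the same time-varying coverage
    masked_model = np.where(np.isnan(obs), np.nan, model)

    coverage = 100.0 * np.mean(~np.isnan(obs), axis=(1, 2))
    print("Observed fraction of grid cells per time step (%):", np.round(coverage, 1))
    masked_global_mean = np.nanmean(masked_model, axis=(1, 2))   # (area weighting omitted for brevity)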

Figure 3. The area of Earth’s surface with valid temperature data (PDF).

My last issue with the calculations is that the traditional definition of climate sensitivity only takes into account changes in the temperature. However, there is also a possibility that a climate change involves a change in the hydrological cycle. I have explained this possibility in a review of the greenhouse effect (Benestad, 2017), and it would add another term to the equation describing the Hasselmann model.

I nevertheless think the study is interesting and it is impressive that the results are so similar to previously published results. However, I do not think the results are associated with the stated precision because of the assumptions and the simplifications involved. Hence, I disagree with the following statement presented in the Guardian:

These scientists have produced a more accurate estimate of how the planet will respond to increasing CO2 levels

References
  1. P.M. Cox, C. Huntingford, and M.S. Williamson, "Emergent constraint on equilibrium climate sensitivity from global temperature variability", Nature, vol. 553, pp. 319-322, 2018. http://dx.doi.org/10.1038/nature25450
  2. T.J. Crowley, "Causes of Climate Change Over the Past 1000 Years", Science, vol. 289, pp. 270-277, 2000. http://dx.doi.org/10.1126/science.289.5477.270
  3. R.E. Benestad, "A mental picture of the greenhouse effect", Theoretical and Applied Climatology, vol. 128, pp. 679-688, 2016. http://dx.doi.org/10.1007/s00704-016-1732-y

This is a thread to discuss the surface temperature records that were all released yesterday (Jan 18). There is far too much data visualization on this to link to, but feel free to do so in the comments. Bottom line? It's still getting warmer.

This is a new class of open thread for discussions of climate solutions, mitigation and adaptation. As always, please be respectful of other commentators and try to avoid using repetition to make your points. Discussions related to the physical Earth System should be on the Unforced Variations threads.

Happy new year, and a happy new open thread.

In response to some of the comments we've been getting about previous open threads, we are going to try separating out OT comments on mitigation/saving the planet/theories of political action from ones related to the physical climate system. This thread remains a place for climate science issues, questions, & news, but we have started a new Forced Responses thread where people can more clearly discuss mitigation issues. We realise that sometimes it can be hard to cleanly separate these conversations, but hopefully folk can try that out as a new year's resolution!

Note we will be updating the Model/Data comparisons over the next few weeks as the various observational data sets get updated for calendar year 2017. The main surface temperature datasets will be released around Jan 18.

If you think you know why NASA did not report the discovery of the Antarctic polar ozone hole in 1984 before the publication of Farman et al in May 1985, you might well be wrong.

One of the most fun things in research is what happens when you try and find a reference to a commonly-known fact and slowly discover that your “fact” is not actually that factual, and that the real story is more interesting than you imagined…

Here is the standard story (one I’ve told repeatedly myself): The publication in 1985 by scientists from the British Antarctic Survey working at Halley Station (right) of observations of extremely low ozone values in Oct 1983 (SH springtime) came as a huge shock to the scientific community. Given that NASA had been monitoring ozone by satellite using the NIMBUS instruments since the late 1970s, people were surprised that this had not been reported already. NASA scientists went back to the satellite data and found that anomalously low values had been rejected as bad data and were not included in the analyses. After reprocessing the data with this flag removed, the trends became clear and the confirmation of ground-based data was reported in the NY Times in Nov 1985 and published formally the next year (Stolarski et al., 1986).

This is mostly true, but not quite…

It is true that the Quality Control (QC) flag on the retrieval was set whenever the inferred ozone level dropped below 180 Dobson Units [1 DU is equivalent to a 0.01 mm thick pure ozone layer at standard temperature and pressure]. Prior to 1983, there had never been an observation below 200 DU, and so values lower than 180 DU were outside the calibration range of the sensor. These calibrations rely on the actual atmospheric profiles being relatively close to pre-defined standards so that deviations in the observed radiances are small, and you can assume quasi-linearity. Deviations that are too large can come from multiple causes and thus are more uncertain to interpret. The absence of a very low ozone profile in the calibration was thus inherently limiting.
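The consequence of such a quality-control threshold is easy to illustrate: a genuinely low retrieval is indistinguishable from a bad one and simply gets flagged. In this toy sketch only the 180 DU cutoff comes from the text; the retrieval values are invented for illustration:

    QC_THRESHOLD_DU = 180.0   # retrievals below this value were flagged as out of calibration range

    # Toy October retrievals in Dobson Units: normal values plus a genuinely low "ozone hole" patch
    retrievals = [310.0, 295.0, 288.0, 175.0, 160.0, 150.0, 305.0]

    flagged  = [r for r in retrievals if r < QC_THRESHOLD_DU]
    retained = [r for r in retrievals if r >= QC_THRESHOLD_DU]
    print("Flagged (excluded from the analysis):", flagged)
    print("Retained for trend analysis:         ", retained)
    # A fixed threshold cannot tell a real low-ozone value apart from a sensor error,
    # so the genuinely low springtime values simply disappear from the processed record.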

However, it wasn’t true that no-one at NASA had noticed.

Satellite anomalies

The processing of the Oct 1983 data was started in August 1984 by the Ozone Processing Team at Goddard Space Flight Center, led by A. Fleig and including Donald Heath and P.K. Bhartia. From statements by Bhartia and Richard McPeters, it seems clear that the large increase in flagged data (points that were nominally below the QC level) in October 1983 was noticed and investigated. The first explanation for a large increase in anomalous data is almost always that there is something wrong with the sensor, and the first check on that is comparing the retrievals with whatever ground truth is available. The only publicly available real-time Antarctic ozone data at the time was from South Pole (the BAS data was not publicly accessible), and that showed polar ozone values of ~300 DU in Oct 1983, casting doubt on the anomalously low satellite retrievals. Nonetheless, retrieved values from the rest of the world outside of the polar vortex were normal, so the puzzle remained. Additional data had been available from the Japanese Syowa station for 1982 which would have been helpful, but its publication (in December 1984) was not widely appreciated at the time.

By December 1984, the OPT team was confident enough that the data was real that they submitted an abstract for a conference to be held in Prague in August 1985. The title was “Observation of anomalously small ozone densities in south polar stratosphere in October 1983 and pre-1984” (P.K. Bhartia et al) and the data that they’d generated (in 1984) included the following map:

However, before this had been publicly presented, the Farman et al paper was published in Nature in May 1985.

Halley Station reports



The clear decrease in October ozone values seen at Halley Station (in 1983, falling below 200 DU for the first time), and the correlation with the increasing CFC concentrations (plotted inversely in the figure above), were undoubtedly dramatic.

It wasn't until Nov 1985, however, after a workshop, that the first media report (in the NY Times) showed the NASA results (publishing another Oct 1983 map for a slightly different day). That article was notably the first public use of the term "hole" to describe the feature (though this term had been coined by Sherwood Rowland some months earlier).

The article is also notable for reflecting the uncertainty that existed at the time about the cause of the anomaly. Farman et al had suggested strongly that increased chlorine loads in the stratosphere were causing the depletion, but two alternate theories were still credible – a dynamical theory based on anomalous upwelling of (relatively depleted) tropospheric air and a solar-cycle related cause.

It wasn't until August 1986 that the first 'proper' NASA publication on the NIMBUS trends appeared (Stolarski et al., 1986). These data had been reprocessed with additional low-ozone profiles included in the calibration, and clearly showed the long-term downward trend across the polar vortex:

and specifically corroborated the low (sub-200 DU) Oct 1983 values seen at Halley Station (dots):

Also in 1986, the South Pole group reported that the preliminary observations from Oct-Dec 1983 had been invalid – the wrong channels had been read on the instrument – a problem that certainly slowed the NASA reporting.

Further field work by NASA during the 1987 Airborne Antarctic Ozone Experiment (AAOE) ended up providing definitive evidence in favor of the chlorine hypothesis, with details of the heterogeneous chemistry on polar stratospheric clouds as hypothesized by Susan Solomon and colleagues in 1986.

Summary

It seems to me that the extra details provided by McPeters and Bhartia (which have been available for many years, though perhaps have not been widely read), and the figure from 1984, change the nature of this story. It isn’t a simple tale of over-confidence in algorithms in the face of black swan events, but rather a tale of poor communications and siloed researchers that slowed down the ability of the wider community to see and interpret what was going on. In some sense this doesn’t matter – enough people were looking at ozone in the polar vortex that if Farman hadn’t reported this, NASA, the South Pole group or the Japanese would have seen it soon enough with similar impacts on the Montreal Protocol negotiations. But the barriers to rapid communication certainly slowed down the community response. For instance, it has been reported that the Farman group had attempted to contact people at NASA prior to publication, but their letters were not sent to the right folk and were never received.

What if this were to happen today?

Today, satellite-retrieved ozone data (like much NASA remote sensing) is available in real time. Dramatic increases in ‘bad’ data would be obvious to many right away. With email and (perhaps) a more open scientific culture, people looking at instruments on the ground would have been able to telegraph concerns to the right people working on satellites and other ground stations to confirm their observations much earlier. It seems likely that the initial publications would have been joint efforts (or at least coordinated), even if attribution of the change would still have been contentious.

None of the above should be taken as trying to diminish the work of Farman and colleagues whose ‘old school’ brand of observational science certainly paid off, but a result is far more powerful when seen in multiple independent records.

Further reading/viewing

First-hand descriptions of the NASA effort are available as “MANIAC” (auto-biographical) talks from Paul Newman, P.K. Bhartia and Richard Stolarski at GSFC:

Paul Newman Maniac Lecture, February 25, 2015 - YouTube

Pawan K. Bhartia Maniac Lecture, August 27, 2014 - YouTube

Richard Stolarski Maniac Lecture, April 22, 2015 - YouTube

And some further reading here on the role of imagery and metaphor in the ozone “hole” discussion.

References
  1. J.C. Farman, B.G. Gardiner, and J.D. Shanklin, "Large losses of total ozone in Antarctica reveal seasonal ClOx/NOx interaction", Nature, vol. 315, pp. 207-210, 1985. http://dx.doi.org/10.1038/315207a0
  2. R.S. Stolarski, A.J. Krueger, M.R. Schoeberl, R.D. McPeters, P.A. Newman, and J.C. Alpert, "Nimbus 7 satellite measurements of the springtime Antarctic ozone decrease", Nature, vol. 322, pp. 808-811, 1986. http://dx.doi.org/10.1038/322808a0
  3. P.K. Bhartia, "Role of Satellite Measurements in the Discovery of Stratospheric Ozone Depletion", Twenty Years of Ozone Decline, pp. 183-189, 2009. http://dx.doi.org/10.1007/978-90-481-2469-5_13
  4. W.D. Komhyr, R.D. Grass, and R.K. Leonard, "Total ozone decrease at South Pole, Antarctica, 1964-1985", Geophysical Research Letters, vol. 13, pp. 1248-1251, 1986. http://dx.doi.org/10.1029/GL013i012p01248
  5. S. Solomon, R.R. Garcia, F.S. Rowland, and D.J. Wuebbles, "On the depletion of Antarctic ozone", Nature, vol. 321, pp. 755-758, 1986. http://dx.doi.org/10.1038/321755a0

It's that time of year again. #AGU17 is from Dec 11 to Dec 15 in New Orleans (the traditional venue in San Francisco is undergoing renovations).

As in previous years, there will be extensive live streams from “AGU On Demand” (free, but an online registration is required) of interesting sessions and the keynote lectures from prize-winners and awardees.

Some potential highlights will be Dan Rather, Baba Brinkman, and Joanna Morgan. The E-lightning sessions are already filled with posters covering many aspects of AGU science. Clara Deser, Bjorn Stevens, David Neelin, Linda Mearns and Thomas Stocker are giving some of the key climate-related named lectures. The Tyndall Lecture by Jim Fleming might also be of interest.

As usual there are plenty of sessions devoted to public affairs and science communication, including one focused on the use of humour in #scicomm (on Friday at 4pm to encourage people to stay to the end I imagine), and a workshop on Tuesday (joint with the ACLU and CSLDF) on legal issues for scientist activists and advocates.

AGU is also a great place to apply for jobs, get free legal advice, mingle, and network.

A couple of us will be there – and we might find time to post on anything interesting we see. If any readers spot us, say hi!

Last open-thread of the year. Tips for new books for people to read over the holidays? Highlights of Fall AGU (Dec 11-15, New Orleans)? Requests for what should be in the end of year updates? Try to be nice.
