Scientists Find Observed Satellite Temperature Data Sets from Three Independent Research Groups Have Less Than a One-in-3.5 Million Chance of Occurring in the Absence of Human-Induced Global Warming

Figure 1 from Santer et al. (2019). Signal-to-noise ratios used for identifying a model-predicted anthropogenic fingerprint in 40 years of satellite measurements of annual-mean tropospheric temperature. The blue, red, and green lines are signal-to-noise ratios of tropospheric temperature data derived from microwave sounding units on National Oceanic and Atmospheric Administration (NOAA) polar-orbiting satellites since 1979 by three different research groups: Remote Sensing Systems (RSS), the Center for Satellite Applications and Research (STAR), and the University of Alabama in Huntsville (UAH). The 3σ line represents a temperature 3 standard deviations above the average temperature of early measurements; from random variation alone, a measurement more than 3σ from the average would occur only about 3 times in every 1,000 measurements. The 5σ line represents a temperature 5 standard deviations above the average of early measurements; from random variation alone, a measurement more than 5σ above the average would occur no more than once in 3.5 million measurements. Two of the data sets had surpassed the 5σ threshold by 2005, and all three had done so by 2016.
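If you want to check the arithmetic behind those thresholds yourself, a few lines of Python reproduce them from the standard normal distribution (my own sketch, not part of the paper):

```python
from math import erfc, sqrt

def upper_tail(z):
    """One-sided probability that a standard normal variable exceeds z."""
    return 0.5 * erfc(z / sqrt(2))

p3, p5 = upper_tail(3), upper_tail(5)
print(f"P(Z > 3) = {p3:.2e}  (about {2 * p3 * 1000:.1f} in 1000, two-sided)")
print(f"P(Z > 5) = {p5:.2e}  (about 1 in {1 / p5:,.0f})")
```

The 5σ tail comes out to about 2.9 × 10⁻⁷, or roughly one in 3.5 million.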

Are you a gambler? What if I said we could go ahead and continue with “business as usual” — drill, baby, drill; beautiful, clean coal — without any of the feared consequences of rising sea levels, persistent drought, intense weather events, lost biodiversity, human refugee crises, and possible future societal collapse, with a one-in-3.5-million probability of success?

Sound good?

Well, the chances are just a little bit less than that.

On Monday, a group of 11 scientists from the United States, Canada, and Scotland published a paper in the competitive, peer-reviewed journal Nature Climate Change, reviewing key accomplishments in climate science over the 40 years since 1979.

The first was the 1979 publication of a 22-page report by the National Research Council, known informally as the “Charney Report” after the meteorologist Jule Charney, who chaired the “ad hoc study group on carbon dioxide and climate” that produced it. The report attempted to synthesize all climate research available at the time, identify gaps in understanding for further study, and make preliminary predictions about the extent and effects of expected global warming. Though many gaps were identified, the group’s essential prediction has aged quite well: it is consistent both with earlier 1956 calculations by the physicist Gilbert Plass (made using some of the world’s first computers) and with current predictions (see figure below).

Basic scientific prediction of global warming over 6 decades:
• “The most recent calculations of the infra-red flux in the region of the 15 micron CO2 band show that the average surface temperature of the earth increases 3.6°C if the CO2 concentration in the atmosphere is doubled…” (G. Plass, 1956)
• “We estimate the most probable global warming for a doubling of CO2 to be near 3°C with a probable error of ± 1.5°C.” (National Research Council, 1979)
• “The equilibrium climate sensitivity [global average surface warming with doubling of CO2 concentration relative to pre-industrial] is likely in the range of 1.5°C to 4.5°C, extremely unlikely less than 1°C, and very unlikely greater than 6°C.” (IPCC, 2014)
The claim often made or implied in popular discourse that scientists have “changed their story” on climate change (e.g., “which is it, climate change or global warming?”…) is belied by the decades-long consistency of the above basic prediction, initially made simply from a physical understanding of the absorption of infrared radiation by CO2 gas.
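To see how “about 3°C per doubling” can fall out of basic physics, here is a back-of-envelope sketch using the widely cited simplified CO2 forcing formula of Myhre et al. (1998). This is not the authors’ own calculation, and the sensitivity parameter below is an assumed, representative value:

```python
from math import log

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing (Myhre et al., 1998): 5.35 * ln(C/C0), in W/m^2."""
    return 5.35 * log(c_ppm / c0_ppm)

LAMBDA = 0.8  # assumed climate sensitivity parameter, K per (W/m^2)

forcing = co2_forcing(560.0)  # doubling pre-industrial CO2 (280 -> 560 ppm)
warming = LAMBDA * forcing    # equilibrium warming estimate

print(f"Forcing for doubled CO2: {forcing:.1f} W/m^2")  # ~3.7 W/m^2
print(f"Estimated warming:       {warming:.1f} degC")   # ~3.0 degC
```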

The second accomplishment reviewed in the article was the 1979 publication of a landmark paper by Klaus Hasselmann entitled “On the signal-to-noise problem in atmospheric response studies.” The ideas in that paper gave rise to an entire discipline of climate science: “fingerprint” studies that test hypotheses about external causes of observed climate change against the random noise inherent in the climate system.
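A toy illustration of the signal-to-noise idea, using made-up numbers rather than the paper’s actual fingerprint method: compare the observed warming trend with the distribution of trends that internal noise alone can produce.

```python
import numpy as np

rng = np.random.default_rng(0)
N_YEARS, NOISE_SD, TREND = 40, 0.15, 0.02  # hypothetical values

def slope(y):
    """Least-squares trend (degrees per year) of an annual series."""
    return np.polyfit(np.arange(len(y)), y, 1)[0]

# Hypothetical "observed" series: steady warming signal plus internal noise
observed = TREND * np.arange(N_YEARS) + rng.normal(0, NOISE_SD, N_YEARS)

# Distribution of 40-year trends that noise alone can generate
noise_trends = [slope(rng.normal(0, NOISE_SD, N_YEARS)) for _ in range(10_000)]

snr = slope(observed) / np.std(noise_trends)
print(f"Signal-to-noise ratio: {snr:.1f} sigma")
```

With these assumed numbers the trend lands comfortably above the 5σ threshold; the real analysis works the same way, but first projects the observations onto a model-predicted spatial fingerprint pattern.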

The third accomplishment reviewed was the deployment, since 1979, of microwave sounding units on NOAA polar-orbiting satellites. These instruments measure microwave emissions from oxygen molecules in Earth’s atmosphere, the intensity of which is proportional to temperature. The results of decades of analysis of these data by three independent research groups are shown at the top of this post.
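Conceptually, each satellite channel reports a brightness temperature that is a weighted average of temperatures over a deep atmospheric layer. The numbers below are invented purely to show the idea:

```python
import numpy as np

# Invented temperature profile (K), surface to upper troposphere
layer_temps = np.array([288.0, 275.0, 260.0, 240.0, 220.0])

# Hypothetical weighting function for a mid-tropospheric channel (sums to 1)
weights = np.array([0.10, 0.25, 0.35, 0.20, 0.10])

brightness_temp = weights @ layer_temps  # channel-weighted average temperature
print(f"Simulated brightness temperature: {brightness_temp:.1f} K")
```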

In the new paper, the authors note the confluence of these 3 events 40 years ago. The Charney Report of 1979 analyzed the best available scientific data and made a bold prediction about future global warming on a “business as usual” trajectory, while admitting significant uncertainties. The Hasselmann paper of 1979 proposed a framework for comparing the temperature changes expected under various hypotheses against those due to random noise in the climate system. And the activation in 1979 of satellite systems capable of measuring atmospheric temperature provided that framework with data.

As of Monday, this work has come to fruition in a compelling way. Using methods derived from Hasselmann’s, the authors analyzed the satellite temperature data and showed that the warming signal present in all 3 independent data sets is one that random climate noise alone would produce at most once in every 3.5 million trials. This is the level of certainty known in science as the 5σ threshold. Unless you’re a big-time gambler, this is a pretty sure thing.

For example, the 2012 observation of data surpassing the 5σ threshold at the Large Hadron Collider, a 17-mile-long particle-smashing tunnel surrounded by some 9,000 superconducting magnets, built at an expense of over $10 billion and operated by thousands of scientists from dozens of nations, convinced the world’s physicists of the existence of the mathematically predicted Higgs boson, a rarely produced but real particle that explains how things have mass. Once that threshold was crossed, virtually all physicists accepted the Higgs boson and its attendant theory of mass; there were no deniers. The data was just too compelling.

And, as I’ve posted about before, explanations other than anthropogenic global warming fit the climate data poorly. The only explanation that does fit the data, a fit we now know would arise from random variation less than once in 3.5 million times, is this: we burn fossil fuels, which returns ancient carbon dioxide to the atmosphere, which increases the atmosphere’s retention of the infrared radiation emitted by Earth’s sun-warmed surface.
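To make that mechanism concrete, here is a textbook zero-dimensional energy-balance sketch (an idealization of my own, not the paper’s model). Without an infrared-absorbing atmosphere, Earth’s surface would settle near 255 K; with a fully infrared-opaque one-layer atmosphere it would reach about 303 K; the observed ~288 K lies in between because the real atmosphere is only partly opaque, and adding CO2 nudges it toward the opaque limit.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # solar constant, W m^-2
ALBEDO = 0.30     # Earth's planetary albedo

absorbed = S0 * (1 - ALBEDO) / 4              # average absorbed sunlight, ~238 W/m^2
t_no_greenhouse = (absorbed / SIGMA) ** 0.25  # ~255 K: no infrared absorption
t_opaque_layer = 2 ** 0.25 * t_no_greenhouse  # ~303 K: one fully opaque layer

print(f"No greenhouse:         {t_no_greenhouse:.0f} K")
print(f"Opaque one-layer:      {t_opaque_layer:.0f} K")
print("Observed surface mean:  ~288 K (partly opaque atmosphere)")
```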

The prediction has remained consistent for over 6 decades, even as the supporting evidence has piled up to the point of statistical certitude. The early prediction, based on nothing but math, a knowledge of carbon dioxide’s infrared absorption, and the world’s first computers, succeeded because carbon dioxide, practically alone, drives the climate change we have experienced since the Industrial Revolution.

“We know, beyond a shadow of a doubt, that human activities have changed the composition of Earth’s atmosphere. And we know that these human‐caused changes in the levels of greenhouse gases make it easier for the atmosphere to trap heat. This is not rocket science. It is simple, basic physics.”

Dr. Benjamin D. Santer, atmospheric scientist, Lawrence Livermore National Laboratory, in testimony to the U.S. Congress, May 20, 2010

“An anthropogenic fingerprint of tropospheric warming is identifiable with high statistical confidence in all currently available satellite datasets … In two out of three datasets, fingerprint detection at a 5σ threshold — the gold standard for discoveries in particle physics — occurs no later than 2005, only 27 years after the 1979 start of the satellite measurements. Humanity cannot afford to ignore such clear signals.”

Benjamin D. Santer, Céline J. W. Bonfils, Qiang Fu, John C. Fyfe, Gabriele C. Hegerl, Carl Mears, Jeffrey F. Painter, Stephen Po-Chedley, Frank J. Wentz, Mark D. Zelinka & Cheng-Zhi Zou, authors of the paper

#rescuethatfrog
