Scientific evidence of global climate change: A brief history

This page takes us back to the beginnings of climate science, to look at the early evidence for Global Climate Change and how that evidence has developed into our current state of knowledge. In the tradition of this fact-based website, there are links to the original research.

This is not a comprehensive examination; other valuable histories of climate science are available (here, here, and here, for example). My website being a fact-rich zone, the “twist” I have chosen is to focus this history on the practical measurements. Before the history reported here, beginning in 1824, Joseph Fourier (French mathematician), Claude Pouillet (French physicist), John Tyndall (Irish physicist), Svante Arrhenius (Swedish physical chemist), and Thomas Chamberlin (American geologist) had proposed and developed a hypothesis that increases in atmospheric carbon dioxide, caused by the burning of fossil fuels, could cause the surface temperature of the Earth to rise due to what we now know as the Greenhouse Effect. You can read about that elsewhere. Their assertions were disputed by other scientists at the time on a number of grounds: that water vapor was a much stronger absorber of radiant heat than carbon dioxide; that any excess carbon dioxide from fossil fuels would be rapidly absorbed by the vast oceans; and so on. The speculation was all well-informed (based on the data available at the time) but at an impasse.

And here we begin.

Quick links to page contents
Episode 1. 1900, Royal Botanical Gardens
Episode 2. First measurement of anthropogenic global warming
Episode 3. Our “large scale geophysical experiment” (1940-1960)
Episode 4. Dave Keeling persists in a great idea
Episode 5. Icy time capsules
Episode 6. The “geologic eons of time”
Episode 7. Our global thermometer since 1850

Episode 1. 1900, Royal Botanical Gardens. Two British scientists’ adventures with leaves and CO2 measurements.

In the years between 1898 and 1901, Dr. Horace Brown, a British chemist, and Mr. Fergusson Escombe, a British botanist, were at the Royal Botanical Gardens in Kew, England studying the influence of light and carbon dioxide levels on the rate of the photosynthesis reaction in leaves (Brown & Escombe, 1905). They constructed a rather ingenious apparatus:

Figure 1 of H. T. Brown & F. Escombe, On the physiological processes of green leaves; Proceedings of the Royal Society B 76 (1905), 29-111.

A leaf was housed inside a sealed box with a window. Air with a known carbon dioxide concentration could be pulled through the box while light of a measured intensity was made to shine on the window. The air from the leaf box was then pulled into the chemical apparatus on the right, within which the amount of carbon dioxide remaining in the air was measured by its reaction with sodium hydroxide to form sodium carbonate. This was, at the time, a new method of measuring the concentration of carbon dioxide in air, and Brown & Escombe were pleased to find it accurate enough to discern the small quantities of carbon dioxide consumed by a single leaf as it carried out photosynthesis. Naturally, Brown & Escombe had occasion in the course of this work to make a multitude of measurements of the carbon dioxide concentration in the ambient air at the Royal Botanical Gardens. It averaged about 290 ppm (parts per million).
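To make the chemistry concrete, here is a minimal sketch of the gravimetric arithmetic behind such a sodium hydroxide method. The specific masses and volumes are illustrative assumptions, not Brown & Escombe's actual laboratory values; only the ~290 ppm result comes from the text.

```python
# Illustrative arithmetic for a gravimetric CO2 measurement.
# CO2 is absorbed by sodium hydroxide: CO2 + 2 NaOH -> Na2CO3 + H2O,
# so each mole of sodium carbonate formed accounts for one mole of CO2.
# The input numbers below are hypothetical, chosen to land near the
# ~290 ppm level Brown & Escombe measured at Kew around 1900.

M_NA2CO3 = 105.99   # g/mol, molar mass of sodium carbonate
V_MOLAR = 24.0      # L/mol, approximate molar volume of a gas near 20 C

def co2_ppm(mass_na2co3_g: float, air_volume_l: float) -> float:
    """CO2 concentration (ppm by volume) from the mass of carbonate formed."""
    moles_co2 = mass_na2co3_g / M_NA2CO3    # 1:1 with CO2 absorbed
    co2_volume_l = moles_co2 * V_MOLAR      # volume the CO2 occupied as gas
    return co2_volume_l / air_volume_l * 1e6

# 0.128 g of Na2CO3 recovered from 100 L of sampled air works out to ~290 ppm.
print(round(co2_ppm(0.128, 100.0)))  # -> 290
```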

So it was around 1900, and the atmospheric carbon dioxide concentration at the Royal Botanical Gardens was 290 ppm.

Back to page contents

Episode 2. First measurement of anthropogenic global warming

Guy Callendar, a British steam engineer and inventor, referenced Brown & Escombe’s atmospheric carbon dioxide measurements four decades later in his paper (Callendar, 1938), famous in climate science, which opened with the following sensational claim:

“Few of those familiar with the natural heat exchanges of the atmosphere, which go into the making of our climates and weather, would be prepared to admit that the activities of man could have any influence upon phenomena of so vast a scale. In the following paper I hope to show that such influence is not only possible, but is actually occurring at the present time.”

(A note of foreshadowing: As we continue in our pursuit of knowledge about climate science, it may become astounding to realize the quote above, from the year 1938, quite resembles the state of very recent “debate” that occurred on the floor of our 2015 U.S. Senate. Article about Senate absurdity. Video of Senate absurdity.)

In his peer-reviewed 1938 paper, Callendar made use of a number of other scientific studies that had taken place since around the turn of the 20th century, which he believed for the first time enabled a reasonable calculation of the effect on Earth’s temperature of CO2 increases from the burning of fossil fuels:

  • More accurate measurements of infrared absorption by CO2 (Rubens & Aschkinass, 1898);
  • The temperature-pressure-alkalinity-CO2 relation for seawater (C. J. Fox, 1909);
  • Measurements of atmospheric radiation of heat (A. Angstrom, 1918; W. H. Dines, 1927; G. C. Simpson, 1928; D. Brunt, 1932);
  • Infrared absorption measurements of water vapor (F. E. Fowle, 1918).

Callendar had the benefit of more atmospheric CO2 measurements that had been taken in the eastern U.S. between 1930 and 1936. These averaged 310 ppm, about 6% higher than the earlier measurements at the Royal Botanical Gardens around 1900. Taking into consideration better estimates of the expected absorption of CO2 by the oceans, Callendar calculated that a 6% increase was about consistent with the estimated addition of CO2 to the atmosphere by the combustion of fossil fuels (about 4,500 million tons per year at the time). Most of the added CO2 seemed to be staying airborne.

Taking account of infrared absorption by both CO2 and water vapor, downward radiation of absorbed heat from the sky, and the effect of this on surface temperature, Callendar calculated that Earth’s temperature at the surface should be increasing at the rate of about 0.003 degrees Celsius per year.

Callendar then undertook a staggering project of collecting, sorting, analyzing, and averaging measured temperatures from hundreds of global weather stations that had been collected since about 1880 (earlier standardized records did not exist). It’s frankly hard for me to imagine doing this overwhelming project, as he did, without even a calculator. He summarized his findings in this graph:

Figure 4 from G. S. Callendar, The artificial production of carbon dioxide and its influence on temperature; Quarterly Journal of the Royal Meteorological Society 64 (1938), 223-240.

In all 3 major climate zones of the Earth in which temperature records existed, Callendar found the temperature variation, with respect to the 1901-1930 mean temperature, to be remarkably consistent. Everywhere on the Earth, the temperature had increased, over approximately the previous half-century, at an average rate of 0.005 degrees Celsius per year, a somewhat greater increase than he had calculated based on the CO2 increases. But he admitted the temperature record was rather short in duration, and further observation was warranted.

Interestingly, Callendar remarked at the end of his paper that he thought global warming resulting from the combustion of fossil fuels would be beneficial by preventing “the return of the deadly glaciers” (referring, it would seem, to the ice ages). Writing as he was in 1938, and only having observed the first glimmer of Global Climate Change, he can be forgiven for underestimating the future enthusiasm with which we would burn fossil fuels. By the end of it, we may find ourselves nostalgic for the glaciers we have now.

Back to page contents

Episode 3. Our “large scale geophysical experiment” (1940-1960)

Climate enthusiast Guy Callendar continued to find time, around his day job as a steam engineer, to conduct and publish multiple research studies between 1940 and 1955, proposing increasing evidence of a linkage between fossil fuel use, rising atmospheric CO2 concentration, and warming global surface temperature (G. Callendar, 1940, 1941, 1942, 1944, 1948, 1949, 1952, 1955). In these, Callendar continued to refine estimates of infrared absorption by CO2, catalog CO2 and temperature measurements in various regions during the period since 1850, and refine and update his calculations of the total amount of CO2 that had been produced globally by fossil fuel use. His analyses continued to suggest that most of the CO2 produced by fossil fuel combustion had directly increased the CO2 concentration of the atmosphere.

During this period, Callendar’s influential 1938 paper also served to renew the interest of other scientists in the possibility of anthropogenic global warming. Roger Revelle and Hans Suess, at the Scripps Institution of Oceanography (UC San Diego), summed up the growing interest in the subject particularly well (Revelle & Suess, 1957):

“. . . human beings are now carrying out a large scale geophysical experiment of a kind that could not have happened in the past nor be reproduced in the future. Within a few centuries we are returning to the atmosphere and oceans the concentrated organic carbon stored in sedimentary rocks over hundreds of millions of years. This experiment, if adequately documented, may yield a far-reaching insight into the processes determining weather and climate.”

Gilbert Plass, a Canadian-born physicist working in the U.S., published a series of papers in 1956 (G. N. Plass, 1956a, 1956b, 1956c, 1956d) in which he brought increased rigor to the calculation of infrared absorption by carbon dioxide in the atmosphere, aided by the new availability of high speed computers to perform complex calculations. These calculations proved wrong a widely held belief at the time, that water vapor absorbed infrared radiation from the Earth’s surface more strongly than carbon dioxide and thus controlled the “greenhouse effect.” With improved calculations, Plass showed that water vapor and carbon dioxide absorbed radiation mainly in different parts of the infrared spectrum. Also, water vapor was present primarily in the region of the atmosphere right next to the Earth’s surface, whereas carbon dioxide was present uniformly at all heights. The new calculations added physical rigor to the theory that the atmospheric carbon dioxide level strongly influences the Earth’s surface temperature. Plass calculated that a doubling of the atmospheric carbon dioxide level would lead to a temperature increase of 3.6 degrees Celsius, and that continued use of fossil fuels would cause about a 1 degree Celsius temperature increase by the year 2000, at which time we would experience easily-observed effects of climate change. As we will see, these 1956 predictions have proven remarkably accurate.

But it was not all agreement during this period. In the tradition of the scientific method, other scientists were questioning the above conclusions. Giles Slocum, a scientist at the U.S. Weather Bureau, pointed out that Callendar’s claim of increasing atmospheric CO2 relied heavily on his selection of particular historical measurements he deemed more accurate than others (G. Slocum, 1955). Slocum’s criticism was illustrated quite well by Stig Fonselius and his coworkers, operators of a network of Scandinavian CO2 measurement sites that had been set up in 1954. Fonselius, et al. (1956) cataloged a large number of CO2 measurements that had been made since the early 1800’s and prepared this graph:

Figure 1 from Fonselius, et al., 1956. The circled values are those selected by Callendar as well as those recorded in 1955 by the Scandinavian network.

As you can easily see, anyone taking the totality of the data as the CO2 record would be hard pressed to argue there had been an obvious increase over time. Callendar had argued in his papers that many of the measurements, particularly the early ones, had been conducted with poor equipment and/or at locations, like the middle of large cities, likely to display elevated CO2 levels due to local sources of CO2 pollution (factories, etc.). While nobody disputed that many CO2 measurements had probably been inaccurate, Slocum argued the totality of data was not yet sufficient to prove atmospheric CO2 had been rising and that a more standardized data set was needed.

Around the same time, oceanographer Roger Revelle and physical chemist Hans Suess were starting to bring nuclear physics to bear on the question (Revelle & Suess, 1957). Their work involved carbon-14, an isotope of carbon present in atmospheric carbon dioxide but not present in fossil fuels (if you’re interested, see my primer on carbon-14). Revelle and Suess and other scientists reasoned that, if atmospheric CO2 levels were increasing due mainly to the burning of fossil fuels, the proportion of atmospheric CO2 containing carbon-14 should be decreasing. In fact, Suess did find that tree rings from recent years were depleted in carbon-14 compared with old tree rings:

Table 5 from Revelle & Suess, 1957. The values in the right column are the percentage reductions in carbon-14 found in tree rings during the indicated years, relative to old tree rings.

But the reductions appeared smaller than would be expected based on Callendar’s estimate that the atmospheric CO2 level had increased by some 6% or more. Further, using data on the carbon-14 contents of the atmosphere and of carbonaceous materials extracted from the ocean surface (namely, seashells, fish flesh, and seaweed), Revelle & Suess calculated that a molecule of CO2 in the atmosphere would be absorbed into the ocean surface within an average of about 10 years, and that the overall ocean was mixed within several hundred years. Given the vastness of the oceans, Revelle & Suess concluded that Callendar’s claims seemed improbable. Moreover, assuming fossil fuels continued to be used at about the rate they were being used in the mid-1950’s, they calculated that the ocean would prevent anything but a modest increase in atmospheric CO2 well into the future.

Guy Callendar’s “last word” during this period was in a 1958 paper applying an additional 20 years of measurements and analysis to his 1938 catalog of atmospheric CO2 measurements, as shown in this graph:

Figure 1 from G. S. Callendar, 1958. Numbered points are measurements of the concentration of CO2 in the free air, North Atlantic region, 1870-1956. Black line is the calculated CO2 from combustion of fossil fuels.

Dr. Brown & Mr. Escombe’s year 1900 measurements of about 290 ppm CO2 are the point labelled “d” in the plot. The atmospheric CO2 concentration in the North Atlantic region appeared to have increased to around 320 ppm by the year 1956. At the same time, Callendar (1961) and Landsberg & Mitchell, Jr. (1961) independently continued to document that the Earth, at all latitudes, had been warming over the same period:

Figure 3 from Callendar, 1961. Temperature fluctuations for the zones of the Earth, 5-year annual departures from the mean 1901-1930.

Callendar acknowledged the contradiction between his analyses and the carbon-14 measurements, but was unapologetic:

“. . . the observations show a rising trend which is similar in amount to the addition from fuel combustion. This result is not in accordance with recent radio carbon data, but the reasons for the discrepancy are obscure, and it is concluded that much further observational data is required to clarify this problem.”

On the need for further measurements, Callendar, Revelle, Suess, and other scientists agreed. If you read the linked papers on this page, you’ll find many mentions of the upcoming International Geophysical Year (1957-1958), a period of international governmental funding of Earth sciences interestingly intertwined with a Cold War competition for scientific prestige, the launching of the first satellites by the Soviet Union and the United States, and the beginning of the Space Race. As you will see in the next episode of this series, new measurements were coming largely as a result of this funding.

This period is a confusing chapter of climate science, but it presents a terrific example of the self-correcting nature of the scientific method. Pioneering scientists like Callendar test obscure hypotheses, often relying on scant initial data. Their conclusions, if compelling, inspire other scientists both to make more measurements and to check their work. “Watchdog” scientists (like Slocum) point out deficiencies in their analyses. Scientists from other disciplines (Plass, Revelle & Suess) apply alternative techniques to see whether the results are consistent. Predictive scientists (Plass) extend the conclusions of early work to formulate predictions that can be tested. If a hypothesis is correct – if it’s the truth – then any accurate measurement will confirm it. Any prediction based on it will come true. Where there is an apparent contradiction, or where a prediction fails to come true, more measurements are needed to resolve the contradiction.

Keep this in mind as we go forward. We will, of course, be applying these principles to the findings supporting the hypothesis of anthropogenic global warming. But also bear in mind that any alternative hypothesis must stand up to the same tests. It’s not enough to say, as my own Senator Ron Johnson (R-WI) did,

“It’s far more likely that it’s sunspot activity or just something in the geologic eons of time.” [Journal Sentinel 8/16/2010]

Well, okay, if it’s sunspots or something (what thing?), let’s see the data. Do measurements of sunspot activity correlate with our observations of Earth’s climate? Scientists have been thinking about and studying this since the 1800’s and making concerted measurements since the early 1900’s. We should be in a position, after all that work, to support our claims with evidence.

Read on, as we get into the data that resulted from the calls for study by Callendar, Revelle, and Suess. As for the “geologic eons of time,” we will actually take a look at that, too. Has the scientific controversy evident in this episode persisted? Or, are people who claim it’s controversial stuck in the ’50’s? Read on to find out!

Back to page contents

Episode 4. Dave Keeling persists in a great idea

In 1953, Charles David (“Dave”) Keeling, a just-graduated Ph.D. chemist with an interest in geology, was looking for a job. He got one as a postdoctoral researcher at Caltech, where a professor employed him to experimentally confirm a rather esoteric hypothesis about the balance between carbon stored in limestone rocks, carbonate in surface water, and atmospheric carbon dioxide. To do this, Dave realized he would first need to have a very accurate estimate of the CO2 content of the air. In investigating the available data on that subject, he found what we encountered at the end of Episode 3 – a great deal of variability in the reported measurements. In fact, it had become widely believed that the CO2 concentration in air might vary significantly from place to place and from time to time, depending on the movements of various air masses and local effects due to the respiration of plants, etc. Dave decided he would need his own way of very accurately measuring the CO2 concentration in air.

Dave developed a new method of measuring the CO2 content of air by collecting air samples in specialized 5-liter flasks, condensing the CO2 out of the air using liquid nitrogen (which had just recently become commercially available), separating the CO2 from water vapor by distillation, and measuring the condensed CO2 volume using a specialized manometer he developed by modifying a design published in 1914. Dave’s new method was accurate to within 1.0 ppm of CO2 concentration. If you’re interested, you can read more about it in his 1958 paper, “The concentration and isotopic abundances of atmospheric carbon dioxide in rural areas,” in which he reported the results of repeated atmospheric CO2 measurements he made at 11 remote stations, including Big Sur State Park, Yosemite National Park, and Olympic National Park, at different elevations and at all times of the day and night.

In his autobiographical account, Keeling admitted he took many more air samples than probably required for this work largely because he was having fun camping in beautiful state and national parks. The great number of samples paid off, though, as they enabled him to make some important observations about daily fluctuations in the atmospheric CO2 level. He found that, in forested locations, maximum CO2 concentrations occurred in the late evening or early morning hours and minimum CO2 concentrations occurred in the afternoon. In non-forested locations, the CO2 concentrations were very similar to the minimum (afternoon) levels measured in forested locations, as well as earlier published levels in maritime polar air collected north of Iceland. In all these locations, the minimum measured CO2 concentrations were pretty consistent, in the range of 307-317 ppm. By isotopic analysis of the carbon-13/carbon-12 ratio of CO2 collected in the forested areas, Keeling determined that the elevated CO2 levels measured at non-afternoon hours in forested areas were due to respiration of plant roots and decay of vegetative material in the soil. He posited that afternoon meteorological conditions resulted in mixing of the near-surface air layer influenced by vegetative processes with higher air that was constant in CO2 concentration.

Basically, the results of Dave’s camping adventures with 5-liter vacuum flasks suggested three important conclusions: (1) care should be taken to sample air using specific methods and under conditions not influenced by industrial pollution or vegetative processes (sample at rural locations in the afternoon); (2) if such care was taken, maybe the CO2 concentration in the atmosphere was virtually the same everywhere, from the old-growth forests of Big Sur to the pristine sea air north of Iceland; and (3) if that was the case, the global atmospheric CO2 concentration in 1956 was about 310 ppm.

Federal agencies, including the US Weather Bureau, were working to identify scientific studies to undertake using the substantial government geophysical research funding anticipated during the International Geophysical Year. Dave reported to a US Weather Bureau researcher his new CO2 measurement method and his results pointing to a potential constancy of global CO2 levels. This resulted in Dave’s installation at the Scripps Institution of Oceanography, directed by Roger Revelle and his associate, Hans Suess. You may remember Revelle and Suess from Episode 3. They were in the midst of publishing a paper concluding that much of the excess CO2 from fossil fuel combustion should be rapidly conveyed into the deep oceans. However, they remained intrigued by Callendar’s analyses, apparently to the contrary, and thought it worthwhile to undertake a dedicated program of atmospheric CO2 measurements at multiple locations.

With funding from Scripps and the US Weather Bureau, Keeling was to make continuous CO2 measurements with a newly developed infrared instrument at remote locations on the 13,000-foot volcano Mauna Loa, Hawaii, and at Little America, Antarctica. The infrared instruments were to be calibrated by the gas sampling technique Dave had developed at Caltech, and 5-liter flasks were to be collected from other strategic places on the Earth, including on airplane flights and trans-ocean ships. The measurements commenced at Mauna Loa, Hawaii in 1958, and the first measured CO2 concentration was 313 ppm.

Continuous weekly CO2 measurements have been conducted at Mauna Loa ever since. The results are freely available to the public here. You can download the data yourself (as can, presumably, House Representatives, Senators, and the President). I did, and I plotted the weekly measurements as this blue curve which has become known as the “Keeling Curve:”

Blue line is the “Keeling Curve,” a plot of weekly atmospheric CO2 measurements made by the Scripps Institution of Oceanography at Mauna Loa, Hawaii from 1958 to present. The blue curve was plotted by me using Scripps weekly data from the Mauna Loa observatory, downloaded here. For fun and context, I added some significant human events to the Earth’s recent CO2 timeline.

Keeling’s very first observation was a seasonal cycle in atmospheric CO2 concentration. The atmospheric CO2 concentration reached a maximum in May, just before the local plants put on new leaves. It then declined, as the plants withdrew CO2 from the atmosphere through photosynthesis, until October, when the plants dropped their leaves. This was, incredibly and quite literally, the breathing of the Earth, which you can clearly see in Keeling’s first measurements (1960, 1963, 1965).

Figures 9a and 9b from Pales & Keeling, 1965. Atmospheric CO2 measurements made at the Mauna Loa Observatory in 1958 and 1959.

The first few years of measurements also confirmed remarkable agreement between measurements taken at Mauna Loa, in Antarctica, on trans-Pacific air flights, and at other locations:

Figure 1 from C. D. Keeling, 1960.

By 1960, the Scripps workers had concluded that the average atmospheric CO2 concentration was rising year-on-year. As you can see by the blue curve above, both the seasonal “breathing” of the Earth’s plants and increasing average CO2 concentration, measured at Mauna Loa, have continued every single year, without interruption, since Keeling’s first measurement in 1958.
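The two features just described, the seasonal "breathing" and the year-on-year rise, can be separated from one another by simple least-squares fitting. The sketch below uses synthetic data chosen only to resemble the real record (roughly 315 ppm in 1958, a rising trend, and a seasonal swing of a few ppm); it is an illustration of the technique, not an analysis of the actual Scripps data.

```python
# Separating a Keeling-style record into trend + seasonal cycle by
# least squares. The data are synthetic stand-ins for the real record.
import numpy as np

t = np.arange(0, 20, 1 / 12)                       # 20 years of monthly samples
co2 = 315 + 0.9 * t + 3.0 * np.sin(2 * np.pi * t)  # rising trend + annual cycle
co2 += np.random.default_rng(0).normal(0, 0.3, t.size)  # measurement noise

# Design matrix: constant, linear trend, and an annual sine/cosine pair.
X = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
coef, *_ = np.linalg.lstsq(X, co2, rcond=None)

baseline, trend, s, c = coef
amplitude = np.hypot(s, c)   # size of the seasonal swing, in ppm
print(f"trend ~ {trend:.2f} ppm/yr, seasonal amplitude ~ {amplitude:.1f} ppm")
```

The fit recovers the trend and the seasonal amplitude that were built into the synthetic data, which is exactly the kind of decomposition that lets one say the average concentration is rising underneath the annual cycle.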

No informed person disputes the correctness of the blue curve above. The Mauna Loa CO2 record makes the most compelling graph because it is our only uninterrupted CO2 record. But it has been corroborated for decades by many other scientists who have made measurements all over the world. The Scripps Institution of Oceanography has made measurements at 12 sampling stations from the Arctic to the South Pole, and spread across the latitudes in between. You can get daily updates of the Mauna Loa CO2 concentration here. The National Oceanic and Atmospheric Administration also operates a globally distributed system of air sampling sites, based on which it calculates a global average atmospheric CO2 concentration that is periodically updated here.

In fact, we now know 57% of the CO2 produced by the burning of fossil fuels has stayed in the atmosphere, according to the Mauna Loa CO2 record (see here for more information). So, what about the analysis of Roger Revelle and Hans Suess (1957) from Episode 3, which suggested the CO2-absorbing power of Earth’s deep oceans would save us the hassle of worrying about our CO2 emissions? The early 1957 conclusions were based on measurement of the steady-state rate of exchange of CO2 between air and seawater: the average time a CO2 molecule floats around in the atmosphere before it is “traded” for one dissolved in the ocean surface, independent of any net change in the CO2 concentration of either the air or the seawater. Revelle and Suess estimated that steady-state exchange rate at around 10 years, and reasoned this meant that, if new CO2 were introduced into the atmosphere, a matching increase in the CO2 surface concentration of the seawater would occur within about 10 years.

Around the same time Dave Keeling was beginning his CO2 measurements at Mauna Loa, Roger Revelle and other scientists were learning the above assumption ignored an important buffering effect of the dissolved salts in seawater, which cause seawater to “resist” increases in its CO2 concentration (see more in this 1959 paper). Thus, when the concentration of CO2 in the atmosphere increases, the net concentration of CO2 in the ocean surface increases by less than one-tenth as much. After decades of further study, this buffering effect is well understood and is routinely measured in the oceans as a quantity known as the Revelle Factor. It explains why Callendar was right about increasing atmospheric CO2, and why we can’t count on the deep oceans to help with our CO2 problem on any but geological time scales of several thousands of years (for more details see this paper).
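The buffering effect amounts to a one-line relation. The Revelle factor R compares the fractional change in atmospheric CO2 to the fractional change in dissolved inorganic carbon (DIC) at the ocean surface; a typical surface-ocean value is around 10. A quick back-of-envelope sketch:

```python
# Back-of-envelope illustration of seawater CO2 buffering. The Revelle
# factor R relates fractional changes at equilibrium:
#     (d pCO2 / pCO2) = R * (d DIC / DIC)
# A typical surface-ocean value is R ~ 10.

def dic_fractional_change(atm_fractional_change: float,
                          revelle_factor: float = 10.0) -> float:
    """Fractional change in ocean-surface carbon for a given atmospheric change."""
    return atm_fractional_change / revelle_factor

# A 10% rise in atmospheric CO2 is matched, at equilibrium, by only
# about a 1% rise in surface-ocean carbon content.
print(round(dic_fractional_change(0.10), 4))  # -> 0.01
```

This is why the ocean surface takes up far less of our excess CO2 than the raw 10-year exchange time suggested.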

So, at least, we can say with certainty we’ve settled the question of whether combustion of fossil fuels has increased atmospheric CO2. A multitude of independent measurements tell us it has. When we started this story in Episode 1 around the year 1900, the atmospheric CO2 concentration at the Royal Botanical Gardens was 290 ppm. Dave Keeling’s first measurement at Mauna Loa in 1958 was 8% higher. When I first watched Star Wars at the drive-in in 1977, the CO2 concentration in the air around me was 16% higher. By the time Barack Obama was elected President in 2008, it was 32% higher. The March 18, 2017 Mauna Loa reading was 406.92 ppm, 40% higher than the CO2 concentration in the year 1900. As you can see by the upward bend of the blue curve above, the atmospheric CO2 concentration is increasing at an accelerating rate.
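The percentage comparisons above are simple arithmetic against the ~290 ppm baseline of 1900. The 313 ppm (1958) and 406.92 ppm (2017) values come from the text; the 1977 and 2008 concentrations are back-computed here from the stated percentages, so treat them as approximate.

```python
# Percent increase in atmospheric CO2 relative to the ~290 ppm level
# measured at Kew around 1900.

BASELINE_1900 = 290.0  # ppm, Brown & Escombe at the Royal Botanical Gardens

def percent_above_baseline(ppm: float) -> float:
    return (ppm / BASELINE_1900 - 1.0) * 100.0

print(round(percent_above_baseline(313.0)))   # 1958, first Mauna Loa value -> 8
print(round(percent_above_baseline(406.92)))  # March 18, 2017 reading -> 40

# Working backward, "16% higher" (1977) and "32% higher" (2008) imply
# roughly these concentrations (approximate, inferred from the text):
print(round(BASELINE_1900 * 1.16))  # -> 336
print(round(BASELINE_1900 * 1.32))  # -> 383
```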

So, how big is that change in the context of Earth’s history? To find out, it would seem we would have to go backward in time. As it turns out, we can! (Sort of.) Stay tuned for more!

Back to page contents

Episode 5. Icy time capsules

In Episode 4, we saw Dave Keeling and coworkers discover the atmospheric CO2 concentration has been on a marked upward sweep, from about 290 ppm in 1900 to over 400 ppm now, and accelerating. Well, is that unusual? Is that a big swing? Or, does the CO2 concentration vary a lot due to natural causes?

Since Dave Keeling only began our continuous, high-accuracy CO2 measurements in 1958, it would seem we would need a time machine to figure that out. In some of the loneliest places on Earth, it turns out, nature has been quietly making time capsules for us.

In parts of Greenland and Antarctica, the snow never melts. In between the snowflakes, tiny volumes of air are trapped. As the years go by, each layer of snow is compacted under new layers. The snow is eventually compacted into ice, and the air is entrapped in minute, isolated bubbles. Geologists in heavy coats prospect for those historical bubbles, little bits of past atmospheres. Good spots to prospect are where it snows very often, such that the snow and ice are deep and the annual layers thick. One such place is Law Dome, a coastal location of Antarctica where the snowfall is as much as 225 lbs of snow per square foot per year.

(A) Field tents at Law Dome, Antarctica (Australian Antarctic Division). Ice core drilling was conducted in the tent in the foreground. (B) Slice from an ice core showing entrapped, ancient air bubbles (Norwegian Polar Institute). (C) Section of an ice core showing visible seasonal layers (Wikimedia Commons). (D) A researcher selects ice cores for greenhouse gas analysis at an Australian ice core storage facility (Australian Antarctic Division).

Ice cores are drilled out using cylindrical drills. Layers in the ice are dated, sometimes visually (see image C above), most times using more sophisticated methods. For example, a rare, heavy isotope of oxygen, O-18, is present in the frozen H2O of Antarctic precipitation at a higher concentration in summer than in winter. Thus, the years in an ice core can be counted as summer stripes and winter stripes, through isotopic analysis of the oxygen in ice layers using a mass spectrometer.

Scientists in the 1980’s expended considerable effort developing accurate methods of harvesting and measuring the composition of the old atmospheric air trapped in ice core bubbles. Since CO2 is water soluble, it’s important not to allow any of the ice to melt while you’re getting the air out. The figure below, from a 1988 paper, shows a schematic diagram of an apparatus used to measure the CO2 concentrations in gas samples retrieved from Law Dome ice cores. This has become known as the “cheese grater” technique, and is still used for CO2 analysis of ice cores.

Figure 1 of Etheridge, Pearman & de Silva, 1988. Schematic diagram of “cheese grater” and associated gas condensing equipment for harvesting ice core air samples for analysis.

In a cold room (to prevent any melting), an ice core section is inserted into a cylinder with raised cutting blades on the inside, like an inside-out cheese grater. This is put inside a vacuum flask and shaken on a machine, crushing the ice inside. The released gases are drawn by a vacuum pump first over a water vapor trap, cooled to -100 degrees Celsius, which condenses and removes the water vapor. The dry sample is then made to flow over a “cold finger,” cooled by liquid helium to a frigid -269 degrees Celsius, cold enough to condense all the gases in the air sample to liquid. Once all the gas has been extracted from the sample, the cold finger is isolated and warmed, and the accumulated gas sample is drawn into a gas chromatograph, a standard piece of analytical equipment for separating the gas constituents from one another and measuring their concentrations.

Between 1987 and 1993, Australian and French scientists working at Law Dome drilled 3 separate ice cores to depths of as much as three quarters of a mile. Samples of these ice cores have been analyzed by various groups. Below, in green, is a plot of data from a 2006 study of CO2 concentration from these ice cores going back over 2000 years.

Publicly available Scripps ice core-merged data, downloaded and plotted by me. Green: Ice core data from Law Dome, 0 C.E. to 1957 (see references here and here). Blue circles: Average yearly data from atmospheric sampling at Mauna Loa and South Pole, 1958-2016. Blue square: Mauna Loa measurement made on March 30, 2017. Human experience milestones added by me.

The data is publicly available; anyone can download it here. While this is a single data set, it is in agreement with data sets obtained from multiple ice cores, stored in multiple locations, by multiple scientific groups using a variety of methods (for a discussion of agreement between the various data sets, see here). The ice core data overlaps, with a high degree of agreement, with the Keeling Curve of direct atmospheric CO2 measurements made since 1958, shown in blue in the plot above. (Note that the blue data in the above plot are yearly averages, so the seasonal variations we saw in Episode 4 have been “smoothed out.”) No reasonable, well-informed person disputes this data, which has now been replicated by a multitude of independently sponsored research groups and reviewed extensively for years.

The historical CO2 data tells a story of remarkable stability for 90% of human experience since Biblical times. In fact, until around 1850, the atmospheric CO2 concentration averaged 279 ppm and never strayed outside a narrow range between 272 ppm and 284 ppm (see black lines on the plot below):

Plot of Scripps ice core-merged data showing the pre-industrial average (black dashed line) and range (black solid lines) of CO2 concentrations going back to 0 AD.

Around the time of the First and Second Industrial Revolutions (attended by the advent of coal-fired steam engines and the petroleum industry, respectively), atmospheric CO2 began its relentless upward sweep that continues today. By the time Dr. Brown and Mr. Escombe were doing CO2 measurements at the Royal Botanical Gardens around the year 1900, and certainly by the time Guy Callendar and Dave Keeling were publishing their CO2 measurements and analyses starting in the late 1930’s, the atmospheric CO2 concentration had already departed significantly from the pre-industrial range. The March 30, 2017 direct measurement at Mauna Loa was 47% higher than the average CO2 concentration that had persisted, until very recently, since classical antiquity.

The rate of increase of the atmospheric CO2 level is also strongly accelerating. The graph below shows the rate of change of CO2 concentration over the past two millennia. (If you remember your calculus, I obtained the graph below by taking the derivative of the graph above.)
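Numerically, “taking the derivative” of a sampled record means finite differences. A sketch, using a handful of illustrative (approximate, unofficial) data points:

```python
import numpy as np

# Illustrative stand-ins for the full CO2 record (year, ppm); the real
# derivative plot uses the complete Scripps merged data set.
years = np.array([1850.0, 1900.0, 1950.0, 2000.0, 2016.0])
ppm   = np.array([285.0, 296.0, 311.0, 369.0, 404.0])

# np.gradient handles unevenly spaced samples when given the coordinate array
rate = np.gradient(ppm, years)   # ppm per year
print(rate[-1])   # 2.1875 ppm/year over the most recent interval
```

Even this crude five-point version shows the modern rate (about 2.2 ppm/year) towering over the pre-industrial rate (about 0.2 ppm/year here, and essentially zero in the real record).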

Rate of change of atmospheric CO2 concentration in parts per million per year (ppm/year).

Prior to the Industrial Revolutions, the atmospheric CO2 concentration changed very little from year to year, and the rate of change hovered around zero. Following the Industrial Revolutions, the rate of change was positive much more often than it was negative; the CO2 concentration was increasing. Immediately following World War II, an unprecedented period of positive and increasing rate of change of the CO2 concentration commenced. Some climatologists have labelled the time period between the end of World War II and today as the “Great Acceleration.” During this period, the global population doubled in just 50 years, while the size of the global economy grew by a factor of 15 (Steffen, Crutzen & McNeill, 2007). At the same time, the global CO2 concentration has not only increased to levels unprecedented in previous human experience, but the rate of that increase has sped up from year to year. In 2016 (the hottest global year on record), the rate of increase reached 2.24 ppm/year.

The question for us is, how high do we wish to allow the atmospheric CO2 concentration to go? For me, I have to say the data shown above is alarming. The fact that, in spite of the data above, we are still having discussions about “putting coal miners back to work” is terrifying.

“It will bring back manufacturing jobs across the country, coal jobs across the country. Across the energy sector, we have so much opportunity, George. And the last administration had an idea of keeping it in the ground. We need to be more independent, less reliant upon foreign energy sources. And this is an opportunity.” (EPA Head, Scott Pruitt, explaining to ABC News Anchor, George Stephanopoulos, the merits of President Trump’s executive order of March 28, 2017, seeking to redefine the government’s role in protecting the environment)

In a future episode in this series, we will get into the details of how historical temperature records have been created and linked to the CO2 concentrations above. But there is already enough information on this website to show that our prodigious CO2 production, if unabated, will lead to prodigious warming. The physics of the greenhouse effect are well understood and have been refined by scientists since the effect was first proposed in 1824. It is a mathematical certainty that more CO2 in the atmosphere will cause warming. As we saw in Episode 3, physicist Gilbert Plass used this known math and some of the first computers to predict in 1956 that the combustion of fossil fuels would lead to a warming of about 1 degree Celsius by around the year 2000, and that has come to pass.

In a 2013 paper, respected climatologist, James Hansen, and co-workers calculated that the Earth’s fossil fuel reserves are sufficient to raise the average land surface temperature by 20 degrees Celsius (36 degrees Fahrenheit). Try adding that to the summer temperature where you live. Since humans require a wet bulb temperature less than 35 degrees Celsius (95 degrees Fahrenheit) to maintain body temperature, this temperature change would literally make most of the Earth uninhabitable for humans in the summer. As an engineer, it’s impossible for me to imagine a workable adaptation for this problem that could be accomplished on the short time scale over which this change is presently on track to occur. In fact, given the comfortable stability in CO2 concentration humans have “grown up” with, there is nothing to suggest our social systems are prepared to deal with many of the consequences of the rapid climate changes we would experience on the current trajectory. Our farm land will be moving toward the poles. (Will we then clear more carbon-absorbing forests as it moves?) Our most valuable coastal real estate will be submerged.
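The 35 degree Celsius wet-bulb limit can be illustrated with a published empirical approximation (Stull, 2011) that estimates wet-bulb temperature from air temperature and relative humidity. This is a rough sketch, not a substitute for a full psychrometric calculation:

```python
import math

def wet_bulb_stull(t_c, rh_pct):
    """Stull (2011) empirical wet-bulb approximation.
    t_c: air temperature in deg C; rh_pct: relative humidity in percent.
    Valid roughly for RH between 5% and 99% at sea-level pressure."""
    return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

# Example: a 40 deg C (104 F) day at 75% relative humidity is already at
# the ~35 deg C wet-bulb survivability limit.
print(round(wet_bulb_stull(40.0, 75.0), 1))   # ≈ 35.8
```

In other words, conditions that are merely "very hot" today would, with large added warming, routinely cross the threshold beyond which the human body can no longer cool itself.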

As for the consideration of jobs, I suspect it will always be plausible to make the argument that jobs in fossil fuel reliant sectors of our economy will be eliminated by shifting to more sustainable sources of energy. It seems to me that new jobs will be created making solar panels, solar concentrators, and wind turbines. With respect to energy independence, I would argue that the sun shines and wind blows in all regions of the Earth. In any case, given the conclusions of the last paragraph, it would seem the only reasonable conclusion is, yes, as much as it may pain us, we will need to leave much of our remaining fossil fuels in the ground.

Back to page contents

Episode 6. The “geologic eons of time”

“I absolutely do not believe in the science of man-caused climate change. It’s not proven by any stretch of the imagination. It’s far more likely that it’s sunspot activity or just something in the geologic eons of time.”
-My own U.S. Senator, Ron Johnson, R-WI (Journal Sentinel, August 16, 2010)

“It’s a very complex subject. I’m not sure anybody is ever going to really know.”
-Donald Trump (New York Times interview, November 22, 2016)

“I think that measuring with precision human activity on the climate is something very challenging to do…”
-Scott Pruitt, EPA Administrator (CNBC Interview, March 9, 2017)

Mr. Pruitt is right, of course. Measuring with precision [the influence of] human activity on the climate is indeed challenging. Just like landing folks on the moon and returning them safely home. Or sending automobile-sized robots to drive themselves around on Mars taking photographs and analyzing soil samples and sending the results back to us on Earth. Or making giant aluminum tubes with wings that can carry hundreds of people by air to destinations anywhere on the globe in 24 hours or less with a safety record better than that of horse-drawn carriages. Or eradicating smallpox. Or making it possible for most of us to communicate with one another using our voices, text, images or videos, globally, in real time and at a moment’s notice, with little wireless devices we carry around in our pockets.

Once you recall we have accomplished all those rather challenging things, you may not be shocked to learn we have, indeed, also measured with precision the influence of human activity on the climate. Not only that, as we have seen in previous episodes and will continue to see, scientists have made these high-precision measurements publicly available. Anyone with web access can download and review much of the data. The detailed methods with which the precision measurements were conducted, and the resulting data analyzed, are also publicly available in scientific publications, the quality of which has been verified through peer review (many of these are accessible as links on this website). Presumably, as Head of the United States Environmental Protection Agency, Mr. Pruitt has ready access to means for reviewing the precision measurements at his convenience.

And, as it turns out, we don’t have to speculate, as Senator Johnson evidently does, about mysterious somethings (sunspots maybe?) in the “geologic eons of time.” That’s because, as we saw in Episode 5 of this series, evidence of events during those “geologic eons” is available for study.

In Episode 5, we saw how tiny bubbles of old atmospheres, trapped and preserved in ice as deep as three quarters of a mile below ground at Law Dome, Antarctica, and extracted from ice cores, have enabled us to construct a measured record of atmospheric CO2 concentration over the past 2000 years. Thanks to the exceptionally high rate of snowfall at Law Dome, this 2000-year record has a very high resolution. But ice cores have been extracted at other locations in Antarctica, too, and some of those locations feature deeper ice.

Image credit: U.S. Department of Energy, Carbon Dioxide Information Analysis Center. Map of Antarctica showing locations of ice core drilling operations.

The deepest ice cores have been extracted by the European Project for Ice Coring in Antarctica (EPICA) at Dome C. EPICA has extracted ice cores over two miles deep at Dome C, and those ice cores contain air bubbles trapped up to 800,000 years ago. Additionally, a collaborative project between Russia, the U.S., and France extracted ice cores as deep as 2.25 miles below the surface at Vostok Station, yielding atmospheric samples from up to 420,000 years ago. Combining CO2 measurements from the Dome C ice cores, Vostok ice cores, Law Dome ice cores, and direct atmospheric measurements at Mauna Loa and the South Pole gives us this continuous plot of atmospheric CO2 concentrations going back a whopping 800,000 years:

Publicly available 800 KYr ice core data and Scripps ice core-merged data, downloaded and plotted by me. Original data sources: (A) Dome C (Luthi et al. 2008) measured at University of Bern; (B) Dome C (Siegenthaler et al. 2005) measured at University of Bern; (C) Dome C (Siegenthaler et al. 2005) measured at LGGE Grenoble; (D) Vostok (Petit et al. 1999, Pepin et al. 2001) measured at LGGE Grenoble; (E) Dome C (Monnin et al. 2001) measured at University of Bern; (F) Law Dome (Keeling et al. 2005, Meure et al. 2006); (G) Average yearly data from atmospheric sampling at Mauna Loa and South Pole (“Keeling Curve”); (H) Mauna Loa measurement made on April 29, 2017 (409.76 ppm). Human and other hominid experience milestones added by me with reference to Wikipedia.

More details about the measurement methods and access to the data sets are available at this website and by clicking links to the original scientific publications in the caption above.

The green and blue colored data in the graph above are the 2000-year Law Dome measurements and direct atmospheric CO2 measurements since 1958, respectively, that we plotted in Episode 5. They are shoved way over to the right now, dwarfed in time by the massive amount of historical data collected from the deeper ice cores at Vostok and Dome C.

I’m not sure what Senator Johnson meant by “the geologic eons of time.” But, insofar as we are interested in how CO2 has changed over a time period of interest to the success and survival of humans on Earth, I’d say 800,000 years fits the bill. To put that in context, anatomically modern humans appeared on the planet only 200,000 years ago. So, the CO2 record above goes back 4 times as long as the entirety of human experience. In fact, it goes back 200,000 years longer than fossil evidence of Homo heidelbergensis, the hominid thought likely to be the common evolutionary ancestor of Neanderthals and humans. (I included these and some other human and hominid milestones on the graph above. I find this useful for the purpose of putting geological and human events in perspective.)

In Episode 5, we saw that, over the past 2000 years, humans experienced atmospheric CO2 concentrations between 272 and 284 ppm prior to the Industrial Revolutions when we started to burn gobs of fossil fuels. The data in this episode extends that range somewhat, to a human experience of 184-287 ppm. The maximum pre-industrial concentration in human experience occurred 126,000 years ago, and it was roughly matched at the time of the Second Industrial Revolution, when we started to burn oil at an industrial scale. Since then, it has been up and up, such that our CO2 level as of April 29, 2017 is 43% higher than the maximum CO2 level over the entire pre-industrial experience of humans spanning 200,000 years.

Same plot of atmospheric CO2 concentrations over the past 800,000 years, showing the average pre-industrial CO2 concentration during that period (dashed line), the minimum and maximum pre-industrial concentrations during that period, and the minimum and maximum concentrations during all of pre-industrial human experience (that is, between about 200,000 years ago and the Industrial Revolutions).

And, in the context of the “geologic eons of time,” this is happening quickly! As we did for the shorter data set in Episode 5, we can take the derivative of the graph above to see the rate of change of the atmospheric CO2 concentration in parts per million per year:

Rate of change in atmospheric CO2 concentration in parts per million per year (ppm/year).

The answer is the same as we saw in Episode 5, but it’s all the more striking in the context of an 800,000 year record. Not only are we far above any “natural” CO2 level in the past 800,000 years, since the Industrial Revolutions we have been increasing that CO2 level at a rate much faster than Earth has experienced over at least that time period. And the rate of increase continues to accelerate.

When you hear about “controversy” in climate science, uncertainties about the Earth’s response to this super-fast rate of change are what it’s about. It’s not about whether CO2 from our burning of fossil fuels is causing global climate change. (It is.) The uncertainty (which the popular media may refer to as “controversy”) is about how severely and how quickly Earth’s climate will respond to the rapid change in atmospheric CO2. Questions like: How quickly will the land-based ice sheets in Antarctica and Greenland melt, contributing to sea level rise? How much and how quickly will the reduced reflectivity of the Earth, as a result of the melting of reflective snow and ice, contribute to additional warming?

To a scientist like me, experienced with rate-of-change graphs, the plot above is terrifying. It’s what we refer to as “going vertical”: departing from the normal process at an accelerating rate. I am a product developer experienced with defining and controlling the conditions required to manufacture new products. People like me want to keep a graph of a critical process parameter (in this case, CO2 concentration) within narrow limits. From this point of view, the Earth has “manufactured” humans, and has done so, until very recently, within narrow limits of atmospheric CO2 concentration. We are now departing rapidly from those limits. As an engineer, I would say we need to get that critical process parameter back in control, as soon as possible. Otherwise, we risk a failure of our manufacturing process. Since the manufactured product, in this case, is us, we have a strong interest in getting the process under control.

In the next episode, we link the historical CO2 record directly to the global temperature record.

Back to page contents

Episode 7. Our global thermometer since 1850

In the last 3 episodes of our history of global climate change evidence, we’ve focused on measurement of Earth’s atmospheric CO2 record, finding in the last episode that it’s now over 40% higher than the entire pre-industrial experience of the human species spanning over 200,000 years. But we have not checked in on global temperature measurements since Episode 3, where the intrepid steam engineer Guy Callendar (1961) and Landsberg & Mitchell, Jr. (1961) had independently measured what appeared to be a slight but discernible warming between 1880 and the late 1950’s. You may also recall from Episode 3 that the physicist Gilbert Plass had used some of the first computers to refine calculations of infrared absorption by CO2, predicting we would observe about a 1 degree temperature increase between the years 1900 and 2000, whereupon we would also begin to observe obvious effects of climate change.

Well, the year 2000 has come and gone and we have thermometers all over the world. Let’s grade Dr. Plass’ work, shall we?

In the 1960’s and 1970’s, others continued to document surface temperature records from collections of meteorological stations, but the data were gathered primarily from stations in the Northern Hemisphere and there wasn’t a standardized method of obtaining a truly global temperature average. During that time, James Hansen, a physicist and astronomer at the NASA Goddard Institute for Space Studies, was studying the planet Venus. Specifically, he was calculating the influence of Venus’ thick atmosphere on its extremely hot surface temperature. (Fun fact: scientists believe Venus’ atmosphere several billion years ago was similar to Earth’s and it had liquid water on its surface, but Venus now has a thick atmosphere and a scorching surface temperature of 864 degrees Fahrenheit due to the occurrence of a runaway greenhouse effect.)

In the late 1970’s, Hansen turned his attention to similar calculations of the effects of Earth’s atmosphere on its surface temperature. As part of this work, he tackled the problem of creating a standardized method for calculating global average temperature trends. The method begins with the recognition that, while absolute temperatures are widely variable from place to place on the Earth, even for locations relatively close to one another, temperature changes of nearby locations tend to be very similar. For example, while the absolute temperatures in New York and Pittsburgh might be quite different on a particular day, if one is having a hotter than average month, the other is likely having a month hotter than average by around the same amount. Thus, global temperature trends are plotted, not as absolute temperatures, but as temperature differences, called “temperature anomalies,” relative to some reference temperature.
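The anomaly calculation itself is simple: subtract a fixed reference-period average from each year's value. A sketch with invented numbers (the real records use monthly station data and carefully chosen baselines):

```python
import numpy as np

# Made-up annual temperatures for one location, in deg C
years = np.array([1951, 1952, 1953, 1954, 1980, 2016])
temps = np.array([14.0, 14.1, 13.9, 14.0, 14.3, 15.0])

# Choose a reference period and express everything relative to its mean
baseline = temps[(years >= 1951) & (years <= 1954)].mean()
anomaly = temps - baseline
print(anomaly)   # the 2016 entry is +1.0 deg C relative to the 1951-1954 mean
```

Because anomalies track *changes* rather than absolute values, records from New York and Pittsburgh — or from any two nearby stations with different absolute climates — can be meaningfully combined.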

The second key element of the method is that the Earth’s surface is divided into a grid formed by squares of equally spaced latitude and longitude lines, such that each square contains a sufficient number of weather stations to obtain an accurate record of historical temperature data. At any given time in history, then, the temperatures of the squares are averaged to get an estimate of the global average temperature. Various statistical methods are used to correct for errors, such as the known artificial urban warming around weather stations in or near cities. The gathering of sufficiently widespread temperature data to apply this method began in the late 1800’s. Hansen’s method was initially published in the peer-reviewed scientific journal, Science, in 1981, and has since been updated as the techniques have continued to improve (1987, 2010).
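A minimal sketch of the grid-averaging step (the groups' actual methods are far more elaborate): equal-angle latitude/longitude cells shrink in area toward the poles, so a global mean must weight each cell by the cosine of its latitude. The grid values below are random stand-ins for per-cell anomalies:

```python
import numpy as np

# Random stand-in anomalies on a 5-degree lat x lon grid, centered near +0.5 C
rng = np.random.default_rng(0)
lats = np.arange(-87.5, 90, 5.0)                    # 36 latitude-band centers
anomalies = rng.normal(0.5, 0.2, (lats.size, 72))   # 36 x 72 grid cells

# Area weights: cell area scales with cos(latitude)
weights = np.cos(np.radians(lats))[:, None] * np.ones((1, 72))
global_mean = np.average(anomalies, weights=weights)
print(round(global_mean, 2))   # close to 0.5, the mean of the stand-in data
```

Without the cosine weighting, the tiny cells near the poles would count as heavily as the huge cells near the equator, biasing the "global" average toward polar conditions.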

Similar methods have now been applied independently by four major research groups. They make their data publicly available for download (see links in the caption below). Here are the four readings of the “global thermometer” (orange, pink, red, and purple lines) plotted on top of the global CO2 record (green and blue circles) we saw in Episode 5:

All data publicly available, downloaded and plotted by me. Green and blue circles: atmospheric CO2 concentration from Law Dome ice cores (green) and direct atmospheric sampling (blue) from Scripps (see figure captions in Episode 5 for detailed references). Orange line: Temperature anomaly, 1880-2016, according to U.S. NASA Goddard Institute for Space Studies (public data, reference). Pink line: Temperature anomaly, 1880-2016, according to U.S. NOAA National Climatic Data Center (public data, reference). Red line: Temperature anomaly, 1850-2017, according to U.K. Hadley Centre/Climate Research Unit (public data, reference). Purple line: Temperature anomaly, 1891-2016, according to Japan Meteorological Agency (public data, reference). All temperature anomalies re-scaled by me to be relative to a common reference baseline of the 1891-2010 average temperature.

Due to differences between the chosen data sources, gridding methods, and error correction methods used by the four independent groups (for details, see references in the caption above), the four temperature records are not identical. They show remarkable agreement, however. They generally have peaks and valleys in the same places, and their basic conclusions are all the same – the world is about 1.1 degrees Celsius warmer now than it was in pre-industrial times. Check out the video below, where the NASA and NOAA gridded data have been used to show how different parts of the globe have changed in temperature.

Video credit: NASA Goddard Space Flight Center (link to web page). Video using a color coding of NASA and NOAA gridded global temperature anomaly data to show how the Earth’s temperature has changed since 1880.

There is no obvious evidence of a “Chinese hoax” here. Instead, these appear to be the serious, well-considered and extensively peer-reviewed conclusions of four independently funded and well-respected scientific groups (a British group, a Japanese group, and 2 U.S. groups – one of which, NASA, has brought us other generally well-regarded scientific achievements such as the moon landings).

In their 1981 paper, James Hansen and his coworkers calculated the temperature increase, relative to the global temperature around 1975, at which we would have a greater than 98% statistical confidence that global warming is “real” (not just a result of random temperature variations). That would be when the temperature rose above the light grey range in this graph, about 0.2 degrees C higher than the 1975 temperature, which the NASA scientists predicted would occur in the 1990’s.

Figure 7 from Hansen, et al. (1981). Calculation of the temperature change, relative to the temperature in the late 1970’s, at which our statistical confidence that global warming had exceeded previous natural variation would reach >85% confidence (represented by the dark grey range) and >98% (light grey range).

A look at the temperature data above shows that this had indeed occurred by the 1990’s. Now, we are a full 0.8-1.0 degrees C above the 1975 temperature, and there can really be no doubt.

Strikingly, the temperature graphs above have almost exactly the same shape as the CO2 graph! But, if we’ve been paying attention to our history of evidence, this should not be a surprise. Rather, it should be a confirmation of our expectations. Sure, the Earth’s climate is a highly complex system, and there have been real questions about things like the role of the deep oceans, as we saw in Episode 3. But those questions were settled by around 1960, by which time Dave Keeling had also begun direct measurements of the atmospheric CO2 concentration. Once we see CO2 going up, we expect warming with mathematical certainty. Based on physics known since the early 1800’s, CO2 absorbs infrared radiation emitted by the Earth’s surface, generating heat. It’s as simple as that. At the end of the day, the basic physics driving global warming are far simpler than those at work every moment inside your smart phone.

Anyone denying the reality of global warming would have to not only explain why at least four formidable groups of well-respected scientists, not evidently influenced by Chinese hoaxers, don’t know how to process data from thermometers. They would also need to explain how the undeniable increase in atmospheric CO2 through the combustion of fossil fuels has somehow not resulted in warming, when anyone with a basic laboratory infrared instrument can verify the infrared absorption of CO2. In fact, did you notice in Episode 4 how the weekly atmospheric CO2 concentration at Mauna Loa is measured? By the infrared absorption of collected air samples! So, every time we measure the CO2 concentration of the atmosphere, by the method precise enough to reveal the seasonal respiration of plants, we verify the very physical phenomenon that drives global warming!

OK, so it’s time to grade Dr. Gilbert Plass’ 1956 prediction of around 1 degree Celsius of warming between 1900 and 2000, and readily observed effects of global climate change, due to infrared absorption by increased atmospheric CO2. The verdict?

  • Actual warming between 1900 and 2000? Around 0.8 degrees C. Not bad. Maybe an A-. But pretty impressive given that Dr. Plass was using the world’s very first computers and considering only the effects of infrared absorption by CO2.
  • Readily observable effects of global climate change? Absolutely.

Episode 8 in preparation. To be continued…

Back to page contents