See? 2016 is off the charts hot. (Chart: The Guardian)
The escalating urgency of global warming is undeniable. Global temperatures are smashing, shattering, surpassing, and scorching records, and we can already see the unprecedented effects of climate change: rising seas, more extreme droughts and floods, coral bleaching, melting ice sheets, and shrinking sea ice.
If that weren’t bad enough, the temperatures reported in this chart and most news articles underrepresent the extent of the warming that has taken place since humans started pumping carbon into the atmosphere.
Scientists look at temperature anomalies, or deviations from normal, to study patterns in global climate and temperatures. When calculating a temperature anomaly, scientists compare their latest measurements to a representative benchmark period within their data set. Because NASA’s data set is most robust for the 20th century, NASA benchmarks its temperature anomaly against the average temperature of the period from 1951 to 1980; these three decades end up being fairly representative of the 20th century average. NOAA, in a similar move, uses the full 20th century average, a slightly different benchmark. This inconsistency across agencies means you have to read the fine print before interpreting the meaning of any given temperature anomaly.
But while these reported anomalies differ in their benchmarks (20th century average vs. the 1951-1980 average for the major United States agencies), they share a common trait: the logic of the benchmark chosen is internal to the agency’s own data set, rather than tied to the beginning of the anthropogenic warming period.
Thus these numbers chronically underrepresent global warming.
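The point can be sketched with a toy calculation. The temperatures below are invented purely for illustration; the takeaway is that the same present-day reading yields different anomaly values depending on which benchmark period is chosen.

```python
# Toy illustration with invented temperatures: the same 2015 reading
# produces different anomaly values under different benchmark periods.

record = {1900: 13.7, 1940: 13.8, 1960: 14.0, 1970: 14.1, 2015: 14.9}

# Benchmark A: a mid-century window (in the spirit of NASA's 1951-1980)
mid_century_baseline = (record[1960] + record[1970]) / 2

# Benchmark B: the average of all 20th-century values (NOAA's approach)
century_baseline = sum(v for y, v in record.items() if y < 2000) / 4

print(round(record[2015] - mid_century_baseline, 2))  # prints 0.85
print(round(record[2015] - century_baseline, 2))      # prints 1.0
```

Same planet, same thermometers, two different "anomalies" — which is why the benchmark always has to travel with the number.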
The Political Benchmark:
We began pumping carbon into the atmosphere at the start of the industrial revolution, nearly two centuries before the most commonly used scientific benchmark periods. Thus nearly 200 years of anthropogenic warming are not included in the anomalies released by NASA and NOAA.
Baselines and reference points matter. We have to square the numbers scientists present each month against our policy goals.
That’s where 2-degrees Celsius comes in.
In December 2015, nearly 200 nations came together at COP21 in Paris and agreed to keep the global average temperature “well below 2-degrees Celsius above pre-industrial levels.” There is a whole history of how 2-degrees was negotiated and adopted as the international standard, but in short, the global community set for itself a temperature ceiling that scientific models deem likely to avoid the worst effects of climate change.
With the latest temperature reports, where do we stand against this objective?
Reports from 2015 sounded alarm bells: for the first time, the yearly global average temperature was 1-degree Celsius warmer than the late-19th century average.
NASA reports equally dire increases for January, February, and March (1.13-degrees, 1.34-degrees, and 1.28-degrees, respectively), each benchmarked against NASA’s usual 1951-1980 average temperature for that month. A strong El Niño cycle, which brings hotter temperatures across the globe, contributed to this warming (the Met Office in the UK estimates that El Niño accounts for approximately 0.2-degrees of the 2015/2016 anomaly), but El Niño is not sufficient to explain the whole increase. The rest is on us.
So it is bad. The earth is heating faster than expected. But to gain a full picture of the urgency of these numbers, we have to ask ourselves: what benchmark are these temperature anomalies referencing?
It’s time to return to Paris and the 2-degree baseline. The temperature anomalies above are calculated against a mid-20th century average, but Paris calls for a 2-degree limit against pre-industrial temperatures. This means that, depending on when you start the clock on the industrial revolution, there are at least 100 years, if not 200 years, of warming not accounted for in anomalies calculated from a mid-20th century average. To reach values that can be measured against the Paris 2-degree goal, we have to correct for those lost years.
Climate Central has written an excellent blog post adjusting temperature anomalies to a late-19th century baseline, benchmarking its figures against observed temperatures from that period.
But they stop short of accounting for pre-industrial conditions.
To adjust the reported figures to a pre-industrial baseline, NASA estimates that 0.4-degrees Celsius of warming occurred between 1850 and 1950. This is a rigorous estimate, but it relies on modeling rather than direct observation, because we have no directly observed temperatures before the late-19th century. Still, the mid-19th century is a moderate definition of the beginning of the industrial revolution (the first central coal-fired power plant in the U.S. would not open until 1882, and the internal combustion engine was still nascent), so 1850 is a reasonable date from which to estimate pre-industrial warming.
These are scary numbers. We are closer to 2-degrees than we thought. With this adjustment, each anomaly passes a significant threshold: the first three months of 2016 were each 1.5-degrees or more above the 1850 baseline, and 2015’s global average raced past the 1-degree mark widely hailed as a climate milestone, to 1.3-degrees above the pre-industrial base.
But industrial manufacturing and accelerated deforestation produced significant greenhouse gas emissions prior to 1850. Many historians count the 1760s, and the advent of the steam engine, as the start of industrialization, and it was during this period that anthropogenic emissions began to tick up. Michael Mann, a climatologist at Penn State University, has made a preliminary estimate of an additional adjustment factor to take this prior century’s warming into account: 0.2-degrees Celsius for the Northern Hemisphere, or approximately 0.15-degrees for the global average.
When this factor is taken into account, the earth can be seen careening toward the 2-degree limit. 2015 saw a yearly average of 1.5-degrees Celsius above the pre-industrial benchmark. And while it takes several years above a threshold to know we have passed that threshold permanently, each of the first three months of 2016 has hovered 1.7-degrees or more above the pre-industrial baseline (assuming that the annual-average adjustment is approximately correct for monthly anomalies as well). February, the hottest month relative to its norm, toys with the 2-degree threshold at 1.89-degrees.
In the final hours of the UNFCCC negotiations, the “High Ambition Coalition,” composed of the U.S., European Union states, island states, and other nations, rallied support for even greater ambition. 2-degrees, they said, was too high. To approach it would be to flirt with ecological and humanitarian disaster. They succeeded in including language in the agreement calling for efforts to limit warming to 1.5-degrees Celsius.
The calculations above show that we have already crossed that line. Let that sink in.
A few months represent just that: a few months. We will not know until after the fact whether we have crossed the 1.5-degree threshold temporarily or for good. To know how this trend plays out, we need time. But that is a luxury we can no longer afford.
But the parade of “hottest” headlines has been insistent. These latest observations suggest that climate change is happening faster and hitting harder than hoped, understood, or expected.
As we take stock of these real-time changes in the earth system, it is crucial that we put observed temperatures in the context of the larger conversation around climate, and that, when presented with numbers, we know where they came from and what they mean.