September 26, 2017

Can We Hold Global Warming to “Well Below” 2 Degrees?

NextGen Policy Center

by Dan Lashof

The Paris Climate Agreement (which is not going to be renegotiated, regardless of what President Trump wants) calls for holding global warming to “well below 2°C above preindustrial levels” and trying to limit warming to 1.5°C. Two important recent papers examine what would be required if the world takes those aspirations seriously.

The short answer is: a lot more than governments are currently committed to. Both papers agree, in broad terms, that we need to rapidly reduce emissions of HFCs, methane, black carbon, and ozone precursors (so-called short-lived climate pollutants, or SLCPs) and to zero out carbon dioxide emissions in the second half of this century. This is physically possible, but extremely challenging to say the least. On the other hand, it is increasingly clear that the risks posed by climate change are already severe, as this past summer of intense hurricanes, floods, and wildfires showed. Blowing past the 1.5°C guardrail risks global catastrophe.

Underneath these topline takeaways, there are significant differences between the approaches and conclusions of the two papers, and even larger ones in how they have been covered in the media.

First out was “Well below 2 °C: Mitigation strategies for avoiding dangerous to catastrophic climate changes,” published as a Perspective in the Proceedings of the National Academy of Sciences by Yangyang Xu and Veerabhadran Ramanathan on September 15th, 2017. Xu and Ramanathan define “dangerous” warming as greater than 1.5°C and “catastrophic” warming as greater than 3°C. They treat risk and uncertainty explicitly and set a goal of having at least even odds that warming will be held to less than 1.5°C and at least a 95 percent chance of avoiding catastrophic warming. They derive a probability distribution for how much warming will occur under different global warming pollution scenarios as shown here:

The “Baseline-fast” and “Baseline-default” curves represent emissions scenarios in which the world fails to address climate change and continues to get its energy primarily from fossil fuels, but with two different rates of improvement in energy efficiency. Under these baseline scenarios it is clear that there is virtually no chance that warming will be less than 2°C, let alone 1.5°C. More alarming still, both scenarios include a substantial risk that global warming would exceed catastrophic levels and enter a regime that the authors call “Unknown,” which represents an existential threat to human civilization.

The “Target-WB2C” scenario is designed to produce even odds that global warming won’t reach “dangerous” levels (1.5°C), and a 95 percent probability that warming will stay below “catastrophic” levels (3°C).
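To make this probabilistic framing concrete, here is a minimal sketch in Python of how such threshold odds can be computed. The lognormal shape and its parameters are illustrative assumptions for this post, not the distributions the paper derives:

```python
# Minimal sketch: odds of staying below warming thresholds, given an
# ASSUMED lognormal distribution of peak warming. The actual
# distributions in Xu and Ramanathan (2017) differ.
from scipy.stats import lognorm

median_warming = 1.5  # degrees C; matches the Target-WB2C design goal
sigma = 0.25          # log-space spread (illustrative assumption)
dist = lognorm(s=sigma, scale=median_warming)

for threshold, label in [(1.5, "dangerous"), (3.0, "catastrophic")]:
    p_below = dist.cdf(threshold)
    print(f"P(warming < {threshold}°C, avoiding '{label}'): {p_below:.1%}")
```

With these assumed parameters the scenario meets both design goals: even odds of staying below 1.5°C and well above a 95 percent chance of staying below 3°C.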

This target seems like a pretty good goal to shoot for, but is it feasible? Physically speaking, yes. The Target-WB2C scenario assumes strong action to reduce HFCs, methane, black carbon and ozone precursors starting in 2020. For carbon dioxide it assumes that countries follow their Paris commitments to 2030 and then begin more aggressive emission reductions, reaching zero net emissions between 2060 and 2070. These two assumptions are reasonably plausible. The Kigali Amendment to the Montreal Protocol promises a phasedown of HFCs, and countries have strong independent economic, health, and safety reasons to reduce emissions of methane, black carbon and ozone precursors. As for carbon dioxide, making the transition to a clean energy economy in about 50 years seems possible given that the direct costs of wind and solar have dropped below those of new fossil fuel power plants in most parts of the world already, and the costs of battery storage and electric vehicles are now following a similar trajectory.

Unfortunately, allowing this much time to phase out fossil fuels necessitates a third massive measure: removing about 1 trillion tons of carbon dioxide from the atmosphere by the end of the century. The scale of this undertaking must not be underestimated: 1 trillion tons is the total amount of carbon dioxide emitted by burning fossil fuels between 1970 and 2010 (or, equivalently, between 1750 and 1970). The cost of doing so is unknown, but certainly enormous, and the feasibility is questionable.
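To get a sense of that scale, a quick back-of-the-envelope calculation helps. The 50-year removal window and the roughly 40 Gt per year figure for current global carbon dioxide emissions are assumptions added here for illustration, not numbers from the paper:

```python
# Back-of-the-envelope scale check for removing 1 trillion tons of CO2.
# The removal window and the ~40 Gt/yr current-emissions figure are
# assumptions for illustration only.
total_removal_gt = 1000.0           # 1 trillion tons = 1,000 Gt of CO2
years_available = 2100 - 2050       # assumed window for scaled-up removal
current_emissions_gt_per_yr = 40.0  # approximate current global emissions

avg_rate = total_removal_gt / years_available
print(f"Average removal rate: {avg_rate:.0f} Gt CO2/yr")
print(f"Share of today's emissions: {avg_rate / current_emissions_gt_per_yr:.0%}")
# -> 20 Gt/yr: removing roughly half of today's annual emissions,
#    every year, for half a century.
```

In other words, removal at this scale would have to offset roughly half of today’s annual emissions every year for decades, which is why the feasibility question is so serious.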

Given this challenge, Xu and Ramanathan present another option for avoiding dangerous global warming, which they call “Target-1.5C.” In this scenario, we reduce HFCs, methane, black carbon and ozone precursors at the same rate as in Target-WB2C, but we scale down global carbon dioxide emissions far more rapidly: we have to start lowering global emissions by 2020 and get them to zero by 2050. Doing so would obviate the need for massive carbon dioxide removal and limit peak warming somewhat more effectively than in the Target-WB2C case, as shown here:

Either way, the imperative is clear, but the task is daunting. We need to rein in emissions of HFCs, methane, black carbon and ozone precursors immediately and phase out fossil fuel combustion as quickly as possible, aiming for zero net emissions by mid-century. Meanwhile, it is only prudent to undertake serious research on carbon dioxide removal, because we very well may need it to avoid climate catastrophe.

Unfortunately, the clear message from Xu and Ramanathan was seriously muddled by “Emission budgets and pathways consistent with limiting warming to 1.5°C,” published a few days later by Richard Millar and coauthors in Nature Geoscience. While Millar et al. undertook a similar analysis and reached a similar conclusion, the press coverage of their article has been extremely misleading (aided by very opaque writing, even for climate scientists). The original coverage was then spun by the climate denier echo chamber until it bore little resemblance to the paper’s actual conclusions, prompting the author team to issue a clarification and two of the authors, Myles Allen and Richard Millar, to push back in an op-ed of their own.

Part of the problem with the coverage of the Millar et al. paper arose from the lead author framing his results as showing that “we have a little more breathing space than previously thought to achieve the 1.5C limit.” Previously thought by whom? To be fair to Millar, the paper explains that the additional “breathing space” is relative to a previously published estimate that the remaining carbon budget from 2015 would amount to only seven years of current global emission levels, whereas their analysis suggests that a budget of 20 years of current emissions (i.e., equal to the cumulative emissions of a linear reduction to zero over 40 years) would provide a two-thirds chance of limiting warming to 1.5°C. So Millar’s conclusion that holding global warming to 1.5°C was “not yet a geophysical impossibility” represents more “space” only for those who previously thought it was impossible. This key point was lost on, or more likely intentionally ignored by, the climate denial spin machine.
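The parenthetical arithmetic deserves spelling out: if emissions fall linearly from today’s rate to zero over T years, cumulative emissions trace out a triangle whose area equals T/2 years of current emissions. A minimal sketch:

```python
# Carbon budget of a linear phase-out: the area of a triangle.
# Cumulative emissions = current_rate * years_to_zero / 2.
def budget_in_years_of_current_emissions(years_to_zero: float) -> float:
    """Cumulative emissions of a linear ramp-down to zero, expressed
    as multiples of one year of current emissions."""
    return years_to_zero / 2.0

# Millar et al.'s budget of 20 years of current emissions is exactly a
# 40-year linear decline to zero (roughly 2015 to 2055).
print(budget_in_years_of_current_emissions(40))  # -> 20.0

# The earlier, smaller estimate of a 7-year budget corresponds to a
# linear decline to zero in just 14 years.
print(budget_in_years_of_current_emissions(14))  # -> 7.0
```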

Millar et al.’s conclusion is still slightly more optimistic than Xu and Ramanathan’s, which put the odds of holding global warming to 1.5°C at a little under 50 percent for the same carbon budget. This difference appears to arise primarily from two factors. First, Millar et al. use a median climate sensitivity (the amount of global warming expected eventually from a doubling of the amount of carbon dioxide in the atmosphere) of 2.6°C, slightly lower than the 3°C median sensitivity used by Xu and Ramanathan. Second, and probably more important, is the starting point Millar et al. adopt for assessing what is required to limit global warming to 1.5°C. They begin by estimating that global warming from preindustrial times to 2015 amounted to 0.9°C, and focus their analysis on determining the carbon budget that would limit additional warming to 0.6°C. Their results depend sensitively on this starting point: other researchers estimate that 2016 was 1.2°C above preindustrial levels, which would cut the carbon budget by more than half (because some additional warming is already baked in).
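The arithmetic behind that sensitivity is straightforward because warming scales roughly linearly with cumulative carbon dioxide emissions, so the remaining budget scales with the remaining warming headroom. In the rough sketch below, the 0.1°C allowance for already-committed warming is an assumed illustrative value, not a figure from either paper:

```python
# Rough sketch: the remaining carbon budget scales with the remaining
# warming headroom, given the near-linear relationship between warming
# and cumulative CO2 emissions. The proportional model and the 0.1 C
# committed-warming allowance are illustrative assumptions.
TARGET = 1.5  # degrees C above preindustrial levels

def headroom(warming_to_date: float, committed: float = 0.0) -> float:
    """Warming still allowed before the target, after deducting
    already-baked-in ('committed') warming."""
    return TARGET - warming_to_date - committed

base = headroom(0.9)                # Millar et al.: 0.9 C by 2015
alt = headroom(1.2, committed=0.1)  # alternative: 1.2 C by 2016

print(f"Remaining budget ratio: {alt / base:.0%}")
# -> 33%: the budget shrinks by more than half, consistent with the
#    "cut by more than half" point in the text above.
```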

The climate denial spin machine is not interested in these facts. For example, The Daily Caller simply proclaimed “Another Major Study Confirms the IPCC’s Models Were Wrong.” Millar et al. did nothing of the sort. The study was not aimed at assessing the accuracy of climate models, and the authors have repeatedly said that this interpretation of their results is flat-out wrong. On the other hand, data scientist Zeke Hausfather has provided a detailed and up-to-date comparison of climate models and observed global warming on CarbonBrief. His conclusion, as shown in the figure below, is clear: the models are remarkably accurate.

So what have we learned from this episode?

  • We still have a chance to hold global warming to 1.5°C if we decarbonize the global economy by 2050.
  • Given how close we already are to surpassing the 1.5°C guardrail, small differences in the definition of “preindustrial levels” and the dataset used to assess global warming to date can lead to large differences in estimates of our remaining carbon budget.
  • Confusing messages can easily drown out clear ones, particularly with the help of the climate denier echo chamber.
