20 March 2013
How Much Will The Planet Warm If We Double CO2?
Posted by Dan Satterfield
The Yale Forum on Climate Change and The Media has an excellent post up by Zeke Hausfather about climate sensitivity. It’s basically a “how we know what we know” piece on how much the planet will warm if we double the carbon dioxide levels in the Earth’s atmosphere. We are rapidly on our way to doing this, and well before the century is out. It’s well worth a read.
Guest Post from the Yale Forum on Climate Change and The Media:
Climate ‘skeptics’ downplay the sensitivity of Earth’s climate to increased CO2 emissions and concentrations, and so might some policy makers. In the end, it’s the emissions and concentrations that most matter rather than uncertainties about climate sensitivity.
Climate sensitivity is suddenly a hot topic.
Some commenters skeptical of the severity of projected climate change have recently seized on two sources to argue that the climate may be less sensitive than many scientists say and the impacts of climate change therefore less serious: A yet-to-be-published study from Norwegian researchers, and remarks by James Annan, a climate scientist with the Japan Agency for Marine-Earth Science and Technology (JAMSTEC).
While those skeptics significantly overstate their case, a look at recent developments in estimates of climate sensitivity may help provide a better estimate of future warming. These estimates are critical: climate sensitivity will be one of the main factors determining how much warming the world experiences during the 21st century.
Climate sensitivity is an important and often poorly understood concept. Put simply, it is usually defined as the amount of global surface warming that will occur when atmospheric CO2 concentrations double. These estimates have proven remarkably stable over time, generally falling in the range of 1.5 to 4.5 degrees C per doubling of CO2.* Using its established terminology, IPCC in its Fourth Assessment Report slightly narrowed this range, arguing that climate sensitivity was “likely” between 2 C and 4.5 C, and that it was “very likely” more than 1.5 C.
The wide range of estimates of climate sensitivity is attributable to uncertainties about the magnitude of climate feedbacks (e.g., water vapor, clouds, and albedo). Those estimates also reflect uncertainties involving changes in temperature and forcing in the distant past. But based on the radiative properties of CO2, there is broad agreement that, all things being equal, a doubling of CO2 will yield a temperature increase of a bit more than 1 C if feedbacks are ignored. However, it is known from estimates of past climate changes and from atmospheric physics-based models that Earth’s climate is more sensitive than that. A prime example: the vast ice ages triggered by small perturbations in orbital forcing could not have occurred without strong feedbacks.
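That no-feedback figure follows from the logarithmic relationship between CO2 concentration and radiative forcing. Here is a back-of-the-envelope sketch (the 5.35 W/m² forcing coefficient and the roughly 3.2 W/m² per degree Planck response are standard textbook approximations, not values taken from this piece):

```python
import math

def co2_forcing(c_new, c_old):
    """Approximate radiative forcing (W/m^2) from a change in CO2,
    using the standard logarithmic fit: dF = 5.35 * ln(C/C0)."""
    return 5.35 * math.log(c_new / c_old)

# Doubling CO2, e.g. from ~280 ppm pre-industrial to 560 ppm:
forcing = co2_forcing(560.0, 280.0)   # ~3.7 W/m^2

# With no feedbacks, the planet sheds ~3.2 W/m^2 of extra energy
# per degree of warming (the Planck response), so equilibrium is:
no_feedback_warming = forcing / 3.2   # a bit more than 1 C

print(f"Forcing from doubling: {forcing:.2f} W/m^2")
print(f"No-feedback warming: {no_feedback_warming:.2f} C")
```

The result, roughly 1.2 C, is the "bit more than 1 C" baseline on which the feedbacks operate.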
Water Vapor: Major GHG and Major Feedback
Water vapor is responsible for the major feedback, increasing sensitivity from 1 C to somewhere between 2 and 4.5 C. Water vapor is itself a powerful greenhouse gas, and the amount of water vapor in the atmosphere is in part determined by the temperature of the air. As the world warms, the absolute amount of water vapor in the atmosphere will increase and therefore so too will the greenhouse effect.
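The physical basis for that feedback is the near-exponential dependence of saturation vapor pressure on temperature. A quick illustration using the Magnus approximation (the coefficients are standard values assumed here, not drawn from the article):

```python
import math

def saturation_vapor_pressure(t_celsius):
    """Saturation vapor pressure over water (hPa), via the common
    Magnus approximation with standard coefficients."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

# How much more water vapor can saturated air hold per degree of
# warming, near a typical surface temperature of ~15 C?
e_now = saturation_vapor_pressure(15.0)
e_warmer = saturation_vapor_pressure(16.0)
increase = (e_warmer / e_now - 1) * 100

print(f"~{increase:.1f}% more water vapor per 1 C of warming")
```

The answer, on the order of 6 to 7 percent per degree, is why a warmer atmosphere carries more of this potent greenhouse gas and amplifies the initial warming.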
That increased atmospheric water vapor will also affect cloud cover, though the impacts of changes in cloud cover on climate sensitivity are much more uncertain. What is clear is that a warming world will also be a world with less ice and snow cover. With less ice and snow reflecting the Sun’s rays, Earth’s albedo will decrease, with a predictable impact: more warming.
There are several different ways to estimate climate sensitivity:
- Examining Earth’s temperature response during the last millennium, glacial periods in the past, or periods even further back in geological time, such as the Paleocene Eocene Thermal Maximum;
- Looking at recent temperature measurements and data from satellites;
- Examining the response of Earth’s climate to major volcanic eruptions; and
- Using global climate models to test the response of a doubling of CO2 concentrations.
These methods produce generally comparable results, as shown in the figure below.
The grey area shows IPCC’s estimated sensitivity range of 2 C to 4.5 C. Different approaches tend to obtain slightly different mean estimates. Those based on instrumental temperature records (e.g., thermometer measurements over the past 150 years or so) have a mean sensitivity of around 2.5 C, while climate models average closer to 3.5 C.
The ‘Sting’ of the Long Tail of Sensitivity
Much of the recent discussion of climate sensitivity in online forums and in peer-reviewed literature focuses on two areas: cutting off the so-called “long tail” of low-probability/high climate sensitivities (e.g., above 6 C or so), and reconciling the recent slowdown in observed surface warming with predictions from global climate models.
Being able to rule out low-probability/high-sensitivity outcomes is important for a number of reasons. For one, the non-linear relationship between warming and economic harm means that the most extreme damages would occur in very high-sensitivity cases (as Harvard economist Marty Weitzman puts it, “the sting is in the long tail” of climate sensitivity). Being able to better rule out low probability/high climate sensitivities can change assessments of the potential economic damages resulting from climate change. Much of the recent work arguing against very high-sensitivity estimates has been done by James Annan and Jules Hargreaves.
The relatively slow rate of warming over the past decade has lowered some estimates of climate sensitivity based on surface temperature records. While temperatures have remained within the envelope of estimates from climate models, they have at times approached the 5 percent to 95 percent confidence intervals, as shown in the figure below.
Figure from Ed Hawkins at the University of Reading (UK).
However, reasonably comprehensive global temperature records exist only since around 1850, and sensitivity estimates derived from surface temperature records can be overly sensitive to decadal variability. To illustrate that latter point, in the Norwegian study referred to earlier, an estimate of sensitivity using temperature data up to the year 2000 resulted in a relatively high sensitivity of 3.9 C per doubling. Adding in just a single decade of data, from 2000 to 2010, significantly reduces the estimate of sensitivity to 1.9 C.
There’s an important lesson there: The fact that the results are so sensitive to relatively short periods of time should provide a cautionary tale against taking single numbers at face value. If the current decade turns out to be hotter than the first decade of this century, some sensitivity estimates based on surface temperature records may end up being much higher.
So what about climate sensitivity? We are left going back to the IPCC synthesis: that it is “likely” between 2 C and 4.5 C per doubling of CO2 concentrations, and “very likely” more than 1.5 C. While different researchers have different best estimates (James Annan, for example, says his best estimate is 2.5 C), remaining uncertainties mean that the range cannot yet be narrowed much further.
Ultimately, from the perspective of policy makers and the general public, the impacts of climate change, and the required mitigation and adaptation efforts, are largely the same whether sensitivity is 2 C or 4 C per doubling of CO2 concentrations, so long as carbon dioxide emissions keep rising quickly.
Just how warm the world will be in 2100 depends more on how much carbon is emitted into the atmosphere, and what might be done about it, than on what the precise climate sensitivity ends up being. A world with a relatively low climate sensitivity — say in the range of 2 C — but with high emissions and with atmospheric concentrations three to four times those of pre-industrial levels is still probably a far different planet than the one we humans have become accustomed to. And it’s likely not one we would find nearly so hospitable.
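That closing point can be made concrete: equilibrium warming scales with the logarithm of the concentration ratio, so even a low sensitivity implies substantial warming at high concentrations. A rough sketch, assuming only the standard logarithmic forcing–concentration relationship:

```python
import math

def equilibrium_warming(sensitivity_per_doubling, conc_ratio):
    """Equilibrium warming (C) for a given climate sensitivity and a
    given ratio of CO2 concentration to pre-industrial, assuming the
    standard logarithmic forcing-concentration relationship."""
    return sensitivity_per_doubling * math.log2(conc_ratio)

# Even a low sensitivity of 2 C per doubling implies large warming
# if concentrations reach 3-4x pre-industrial levels:
for ratio in (2, 3, 4):
    print(f"{ratio}x pre-industrial: "
          f"{equilibrium_warming(2.0, ratio):.1f} C of warming")
```

At 2 C per doubling, tripling or quadrupling concentrations still yields roughly 3 to 4 C of warming, which is the article’s "far different planet."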
Editor’s Note: This piece uses Centigrade rather than Fahrenheit temperature units. The C units in this feature translate to the following F units: 1.5 C = 2.7 F; 4.5 C = 8.1 F; 2 C = 3.6 F; 3.5 C = 6.3 F; 6 C = 10.8 F; and 3.9 C = 7.02 F.
“If the current decade turns out to be hotter than the first decade of this century, some sensitivity estimates based on surface temperature records may end up being much higher.”
And if the rest of this decade is flat, as the latest Met Office decadal forecast suggested is more plausible than their previous decadal forecasts, sensitivity will presumably be lower?
i.e., what value of sensitivity, if we take the values described here?
It will be interesting to see the impacts of the next several years of observed temps…
The estimates for climate sensitivity have changed very little over the past decades, and any estimate that disagrees with the ice core sensitivities certainly has to be suspect. Zeke makes just that point in the piece. It’s also worth pointing out that it is very likely we will see a decade of falling temperatures in this century as we warm. Looking at a decadal swing in temperature in relation to climate sensitivity is pretty worthless.
” the Norwegian study referred to earlier, an estimate of sensitivity using temperature data up to the year 2000 resulted in a relatively high sensitivity of 3.9 C per doubling. Adding in just a single decade of data, from 2000 to 2010, significantly reduces the estimate of sensitivity to 1.9 C”
That is a complete misconception. The Skeie et al. study referred to (as originally submitted) estimates, using data extending to 2010, that climate sensitivity is most likely to be 1.7 deg. C. Using the same data but ending in 2000, the corresponding estimate is 2.0 deg. C. You can see this if you look at where the peaks of the top two graphs in slide 6 lie at http://www.uib.no/People/ngfhd/EarthClim/Calendar/Oslo-2012/ECS_Olavsgard.pdf . Further, I estimate that the 2.0 deg. C figure would have been only 1.8 or 1.9 deg. C had a noninformative prior, rather than a uniform prior, been used in the Bayesian estimation methodology.
The 1.9 and 3.9 (should be 3.8?) deg. C figures are mean values, not most likely estimates. A mean is not a suitable measure for the central value of skewed distributions. The increase from 1.9 to 3.9 deg. C almost entirely reflects the PDF based on data only up to 2000 being far worse constrained than when using data up to 2010. Since climate sensitivity PDFs, for good physical/mathematical reasons, normally have tightly constrained lower tails but much worse constrained upper tails (particularly when using a uniform prior, which distorts the PDF), as the PDF becomes worse constrained its mean will progressively increase even if the most likely (best-fit: the mode) estimate – indeed the median estimate – remains unchanged.
If you look at the second graph down on the LHS of slide 9, which uses three ocean heat content estimates rather than one, with data ending in 2000, improving how well the climate sensitivity estimate is constrained, you will see that the mean comes down from 3.8 to 2.2 deg. C even though the most likely estimate does not reduce much.
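The commenter’s point about means of skewed distributions can be illustrated with a toy example: two hypothetical lognormal sensitivity PDFs (not the actual Skeie et al. PDFs) whose most likely value is held fixed while the upper tail becomes less constrained.

```python
import math

def lognormal_mode_and_mean(mu, sigma):
    """Mode and mean of a lognormal distribution (closed-form)."""
    mode = math.exp(mu - sigma**2)
    mean = math.exp(mu + sigma**2 / 2)
    return mode, mean

# A tightly constrained PDF (small sigma) vs. one with a fatter
# upper tail (large sigma); mu is chosen so the mode stays at 2.0 C.
for sigma in (0.3, 0.8):
    mu = math.log(2.0) + sigma**2
    mode, mean = lognormal_mode_and_mean(mu, sigma)
    print(f"sigma={sigma}: mode={mode:.2f} C, mean={mean:.2f} C")
```

With the fatter tail, the mean climbs well above the unchanged mode, which is exactly why a worse-constrained PDF can report a much higher mean sensitivity without its best estimate moving at all.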