Monday, December 30, 2013

This idea might get some US politicians to do their jobs

This sounds like a good idea.  Who knows?  Hitting science-denying politicians where it hurts most (their public image) might do wonders for their ability to comprehend the science.

You can find out more about this plan (plus plenty of other information) at

Thursday, December 19, 2013

A pause in global warming? What pause?!?

Much has been made of the pause in global surface temperatures since 1998.  Factors advanced to explain that pause include a change in ENSO, a decline in solar output, and an increase in aerosols (e.g. Foster and Rahmstorf 2011).  One previously neglected factor is artificial cooling in the global surface temperature data sets themselves.  Neither the HadCRUT4 nor the NOAA temperature data cover the polar regions.  As the polar regions are the fastest warming areas on Earth (UAH Arctic trend since November 1978: +0.44ºC per decade, global trend: +0.14ºC per decade), excluding those regions introduces an artificial cooling bias into the data.

Anomaly map of HadCRUT4, showing the gaps in data coverage (in white)
Both NOAA and GISS are artificially cooled by an outdated ocean temperature data set (HadSST2) that doesn't account for the change from ship-based temperature measurements to buoy-based temperature measurements that occurred within the last 10 years (Kennedy et al. 2011).  Ship-based measurements are warmer than buoy-based measurements, meaning that the change from one to the other will create an artificial cooling trend that doesn't really exist unless the data is adjusted to account for the change.  While there's no sign as yet that GISS and NOAA will be updating their data to include the new HadSST3 that compensates for the change in how ocean temperatures are measured, a recent paper by Cowtan and Way (2013) proposes a method to correct the coverage gaps in HadCRUT4 data.

Cowtan and Way (2013) converted UAH satellite temperature data, which measures air temperature 1,000 meters above the Earth's surface, to surface temperature data (measured 2 meters above the surface) to fill in the blanks in the HadCRUT4 coverage map.  The way they did this was to take the difference between UAH data and existing surface temperature stations and interpolate the differences in surface data gaps via a method called "kriging."  Kriging is a well-established statistical technique and is used by NASA GISS as well as the Berkeley Earth team for their global temperature data sets, although those teams interpolate directly from existing surface stations to cover the gaps.  Cowtan and Way's method should be more precise, as it uses satellite temperature data that covers the gaps to calculate the surface temperature in the gaps.
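The kriging idea can be sketched with a toy 1-D example.  Everything here (the covariance model, the length scale, the station values) is illustrative, not the actual Cowtan and Way setup:

```python
import numpy as np

def simple_krige(x_obs, y_obs, x_new, length_scale=10.0, variance=1.0):
    """Interpolate y at x_new from observations via simple kriging
    with an exponential covariance model (a minimal 1-D sketch)."""
    def cov(a, b):
        d = np.abs(a[:, None] - b[None, :])
        return variance * np.exp(-d / length_scale)

    K = cov(x_obs, x_obs) + 1e-9 * np.eye(len(x_obs))  # jitter for stability
    k = cov(x_obs, x_new)
    weights = np.linalg.solve(K, k)                    # kriging weights
    return weights.T @ y_obs

# Stations at known longitudes report (satellite - surface) differences;
# krige the difference field into the data gap, then subtract it from the
# satellite reading there to estimate the missing surface temperature.
lons = np.array([0.0, 5.0, 12.0, 30.0, 38.0])     # hypothetical station positions
diff = np.array([0.3, 0.25, 0.2, -0.1, -0.15])    # illustrative differences, ºC
gap_lons = np.array([20.0])                        # location with no surface data
est_diff = simple_krige(lons, diff, gap_lons)
satellite_in_gap = -0.5                            # hypothetical UAH anomaly, ºC
surface_estimate = satellite_in_gap - est_diff[0]
```

Note the design choice this sketch mirrors: kriging the *differences* rather than the temperatures themselves means the result reproduces the surface record exactly wherever stations exist, and only leans on the satellite data inside the gaps.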

Covering the gaps in HadCRUT4 data has a large impact on temperature trends, especially the trends since 1998.  The trend in HadCRUT4 data since 1998 is +0.01976 ± 0.05555ºC per decade (trend ± 1σ standard error).  The trend since 1998 in the coverage-corrected version (I'll call it the "Cowtan-Way" data set) is +0.10508 ± 0.05458ºC per decade, 5.3x faster.  Once changes in ENSO, aerosols, and solar output are factored out of the Cowtan-Way data, the rate of rise since 1998 increases to +0.1880 ± 0.02765ºC per decade, 9.5x faster than the trend in HadCRUT4.

What does this mean for the "pause"?  Quite simply, there really is no pause.  The apparent "pause" is an artifact: the combined effect of poor coverage of the polar regions, cooling from a shift in ENSO, an increase in aerosols, and a decrease in solar output.  While the rate of increase in the Cowtan-Way data has been slower since 1998, an increase of +0.10508 ± 0.05458ºC per decade is still an increase, not a pause.  Once the data are adjusted for ENSO, aerosols, and solar output, the rate of rise due to greenhouse gases for Jan. 1998-Dec. 2012 is +0.1880 ± 0.02765ºC per decade, which is slightly faster than the overall rate of rise due to greenhouse gases since 1979 (+0.1841 ± 0.010282ºC per decade).  The other thing this shows is the over-sized effect small changes can have on short-term trends, as small changes in temperature can produce very different trends.  This effect is the consequence (and danger) of using short time periods.  Trends become far more stable as the length of the time period increases.
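The instability of short-window trends is easy to demonstrate with synthetic data.  The trend and noise values below are illustrative, not taken from any real data set:

```python
import numpy as np

rng = np.random.default_rng(42)
true_trend = 0.017          # ºC per year (~0.17 ºC/decade, illustrative)
noise_sd = 0.1              # interannual noise, ºC

def fitted_trends(n_years, n_trials=2000):
    """OLS trend estimates (ºC/decade) over many noisy realizations."""
    t = np.arange(n_years)
    trends = []
    for _ in range(n_trials):
        y = true_trend * t + rng.normal(0.0, noise_sd, n_years)
        slope = np.polyfit(t, y, 1)[0]   # fitted slope, ºC/year
        trends.append(slope * 10.0)      # convert to per-decade
    return np.array(trends)

short = fitted_trends(10)    # 10-year windows
long_ = fitted_trends(30)    # 30-year windows
print(f"10-yr trend spread (1 sigma): {short.std():.3f} ºC/decade")
print(f"30-yr trend spread (1 sigma): {long_.std():.3f} ºC/decade")
```

The underlying warming rate is identical in every realization; only the noise differs.  Yet the 10-year fitted trends scatter several times more widely than the 30-year ones, which is exactly why a trend starting at the 1998 El Niño spike is so easy to misread.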

The next time someone talks about a "pause" in global temperatures, the proper response is really "What pause?"  Global warming hasn't paused.  If anything, the rate of warming due to greenhouse gas concentrations has increased slightly since 1998.

Wednesday, December 18, 2013

It's cold in my backyard! Does that mean there's no global warming?

Since October, deniers in the US have been pointing to the colder-than-normal weather in the US to claim that global warming has stopped.  In a twist that should blow their minds, NOAA/NCDC and GISS have both announced that November 2013 was the hottest November globally in recorded history.

How can global temperatures set record highs if the US is so cold?  A temperature anomaly map from GISS answers that question nicely:

Map available from NASA GISS
Notice anything?  Yep.  Canada and the USA (particularly the Great Lakes region and Northeastern US), along with the Antarctic Peninsula, are pretty much the only parts of the entire planet that were below average in November.  The rest of the Earth was average to well above average (check out Siberia!).  That's why November 2013 was the hottest November for the entire planet since at least 1880.

The same global anomaly pattern appears in satellite data as well.  Lower tropospheric UAH data, which measures air temperatures 1,000 meters above the surface, shows the following for November:

Again, eastern North America is colder than average while Eastern Europe/Western Asia, Siberia, and East Antarctica are far warmer than average.   Overlaying UAH data with GISS surface data shows the same pattern of temperature rise since 1978:

This shows the importance of using global data to monitor global warming.  The weather in your backyard, country, or even region may be below normal—but that doesn't mean that the entire planet is also below normal.

IPCC models versus actual temperatures

One of the dominant memes among climate deniers is that climate models are inaccurate.  While that's true, particularly since 1998 (see Fyfe et al. 2013), it doesn't mean that global warming isn't happening, or that global warming is due to a natural cycle rather than CO2 as many deniers claim.  For those leaps of logic to be true, the entire field of radiative physics, 152 years of experiments, and 40+ years of satellite observations would all have to be wrong.  Nor does it mean that climate isn't as sensitive to changes in radiative forcing as multiple studies have shown it to be (e.g. Paleosens 2013).  What it means is far more complex.

To illustrate this complexity, I compared IPCC AR5 climate models with surface temperatures (GISS).  The AR5 models were run with four scenarios, labeled RCP 2.6, RCP 4.5, RCP 6.0, and RCP 8.5.  Data for each scenario, along with global temperature data, AMO, PDO, etc., are available at Climate Explorer.  The RCP scenarios span the range of plausible futures, from a < 2ºC rise by AD 2100 (RCP 2.6, assuming strong mitigation) up to a 4ºC rise by AD 2100 (RCP 8.5, business as usual).  One note: all trends reported in this article compensate for autocorrelation and are given as trend ± 1σ standard error.
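For readers curious what "compensate for autocorrelation" involves, here is a minimal sketch using the common AR(1) effective-sample-size correction.  The synthetic monthly series is illustrative only:

```python
import numpy as np

def trend_with_ar1_se(y):
    """OLS trend and standard error, with the SE inflated for lag-1
    autocorrelation in the residuals (the usual AR(1) correction)."""
    n = len(y)
    t = np.arange(n, dtype=float)
    slope, intercept = np.polyfit(t, y, 1)
    resid = y - (slope * t + intercept)
    rho = np.corrcoef(resid[:-1], resid[1:])[0, 1]   # lag-1 autocorrelation
    sxx = np.sum((t - t.mean()) ** 2)
    se_ols = np.sqrt(np.sum(resid ** 2) / (n - 2) / sxx)
    inflation = np.sqrt((1 + rho) / (1 - rho))       # variance-inflation factor
    return slope, se_ols * inflation

# Illustrative monthly anomalies: linear trend plus AR(1) noise
rng = np.random.default_rng(0)
n = 180                                  # 15 years of monthly data
noise = np.zeros(n)
for i in range(1, n):
    noise[i] = 0.6 * noise[i - 1] + rng.normal(0, 0.08)
y = 0.0015 * np.arange(n) + noise        # ~0.18 ºC/decade in monthly units
slope, se = trend_with_ar1_se(y)
print(f"trend: {slope * 120:.3f} +/- {se * 120:.3f} ºC/decade")
```

Without the inflation factor, the naive OLS error bar on autocorrelated climate data would be far too small, making trends look more certain than they are.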

Comparing the RCP 2.6 model results with GISS temperature data (temperature anomalies relative to the 1981-2010 average) shows that GISS temperatures have trended toward the bottom of the expected range over the past 10 years after being well within that range before.

 This result holds true for the RCP 8.5 results as well.

In each case, actual global surface temperatures have trended toward the bottom of the predicted range, but still within it.  Even so, there is quite a mismatch: the multi-model average predicts a +0.2541 ± 0.02196ºC per decade rise in global surface temperature (combined average of all 65 RCP 2.6 model runs) between January 1998 and December 2012, whereas GISS surface temperatures show a +0.06192 ± 0.04637ºC per decade rise (January 1998 to October 2013).  So what caused that mismatch between the multi-model average and actual surface temperatures?  To understand that, we must first look at how models are constructed, tested, and then used.

The models were tested for their ability to reproduce twentieth-century climate given actual inputs such as solar output, ENSO, aerosols, and greenhouse gas levels (for an overview, see the excellent primer by the National Academy of Sciences).  Those that successfully replicate past climate are then used to predict conditions after AD 2000, predicting not only global temperature but the inputs (greenhouse gas emissions, ENSO, solar variation, aerosol concentrations, etc.) as well.  If the models are overpredicting global temperatures, there are several sources of error to examine.

First, climate models treat inputs such as aerosol concentrations, drops in solar output, and swings in ENSO as random events.  Random events cancel out when calculating a combined average of all models, as I did to get the average rate of rise.  So to fairly compare surface temperatures to the average of a series of climate models, you must first factor out those random events.  Factoring out changes in ENSO, solar output, and aerosols (as per Foster and Rahmstorf 2011) showed that the surface temperature rise from January 1998 to March 2013 was +0.1498 ± 0.03151ºC per decade, still quite a bit lower than the IPCC RCP 2.6 average but far higher than the rate without those random factors removed.  The adjusted GISS data still trend toward the bottom of the expected range of the models, albeit not as low as the unadjusted data, indicating that while ENSO, solar output, and aerosol concentrations are part of the reason for the mismatch between modeled and actual temperature rise, they're not the only reasons.
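The "factoring out" step amounts to a multiple regression of temperature on time plus the exogenous factors, in the spirit of Foster and Rahmstorf (2011).  The series below are synthetic stand-ins for the real indices (MEI, TSI, aerosol optical depth), with made-up response coefficients:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 180                                    # 15 years of monthly data
t = np.arange(n, dtype=float)

# Hypothetical exogenous series (stand-ins for real ENSO/solar/aerosol indices)
enso = rng.normal(0, 1.0, n)
solar = np.sin(2 * np.pi * t / 132)        # ~11-year cycle
aerosol = rng.normal(0, 1.0, n)

# Synthetic temperature: trend + assumed responses + weather noise
true_trend = 0.0015                        # ºC/month (~0.18 ºC/decade)
temp = (true_trend * t + 0.08 * enso + 0.05 * solar
        - 0.03 * aerosol + rng.normal(0, 0.05, n))

# Regress temperature on time AND the exogenous factors together;
# the time coefficient is the "adjusted" trend with their influence removed.
X = np.column_stack([np.ones(n), t, enso, solar, aerosol])
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
raw_trend = np.polyfit(t, temp, 1)[0] * 120   # ºC/decade, factors included
adj_trend = coef[1] * 120                      # ºC/decade, factors removed
print(f"raw trend:      {raw_trend:.3f} ºC/decade")
print(f"adjusted trend: {adj_trend:.3f} ºC/decade")
```

Because the regression knows the factor time series, it can separate their ups and downs from the underlying trend, recovering the greenhouse-driven rate even when ENSO or aerosols happen to tilt the raw numbers.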

Second, expected changes in greenhouse gas emissions could be too high.  This has been an issue for some time.  In 2000, Dr. James Hansen noted that predicted changes in greenhouse gas emissions in IPCC models were off.  His example was that the IPCC's methane concentration value was off by a factor of 3.

There is a third possibility that has been largely ignored.  All of our surface temperature data sets underestimate the true surface temperature of the Earth.  Two of the data sets (HadCRUT4 and NCDC) do not cover the polar regions or much of Africa.  With the Arctic being the fastest warming region on Earth according to UAH satellite data, that omission creates a cooling bias in those data sets.

HadCRUT4 coverage map.  The areas in white are not included in the data.
GISS uses kriging to fill in the blanks on the map and therefore covers the entire globe.  However, GISS still uses an outdated ocean temperature data set (HadSST2) in its global data set.  Over the past decade, the way ocean surface temperature is measured has changed from ship-based measurements to buoy-based measurements.  Ship-based measurements are known to be warmer than buoy-based measurements (Kennedy et al. 2011).  HadSST2 doesn't account for that difference, creating an artificial cooling trend in ocean data over the past 10 years, which in turn creates an artificial cooling influence in GISS global data.  HadCRUT4 uses the updated HadSST3 data set, which does account for that difference.
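A toy simulation shows how a platform switch manufactures a cooling trend out of nothing.  The 0.12ºC ship-buoy offset and the smooth linear transition are assumptions chosen for illustration, not the actual measured values:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 120                               # 10 years of monthly ocean anomalies
t = np.arange(n)
true_anomaly = np.full(n, 0.0)        # flat ocean temperature, by construction

# Ships read warm relative to buoys (0.12 ºC here, an illustrative offset);
# the observing fleet shifts linearly from all-ship to all-buoy.
ship_bias = 0.12
buoy_fraction = t / (n - 1)
measured = (true_anomaly + (1 - buoy_fraction) * ship_bias
            + rng.normal(0, 0.05, n))

trend = np.polyfit(t, measured, 1)[0] * 120   # fitted trend, ºC/decade
print(f"spurious trend from the platform switch: {trend:.3f} ºC/decade")
```

The true series is perfectly flat, yet the fitted trend comes out clearly negative, roughly the size of the assumed ship-buoy offset spread over the decade.  Correcting for the offset (as HadSST3 does) removes this phantom cooling.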

To correct the coverage problem in HadCRUT4, Cowtan and Way (2013) recently used UAH satellite data to fill in the coverage gaps in HadCRUT4 data.  Their method was to apply kriging to the difference between surface and satellite data.  Their new data set matches surface data where surface data exists; where it does not, they used the interpolated differences to convert the satellite data to estimated surface data.

Filling in the gaps has a dramatic effect on the short-term trend.  Before, the January 1998-December 2012 trend in HadCRUT4 data was +0.01976 ± 0.05555ºC per decade.  After, the trend for the same time period is +0.10508 ± 0.05458ºC per decade.

Comparing the coverage-corrected HadCRUT4 to the IPCC models shows that while coverage made a noticeable difference, even the corrected HadCRUT4 trends toward the bottom of the IPCC range.

Adjusting the coverage-corrected HadCRUT4 to account for ENSO, aerosols, and solar output makes an even larger difference in the trend since 1998: +0.1880 ± 0.02765ºC per decade.  That places surface temperatures well within the range of the IPCC models, with only temperatures in 2011 and 2012 trending toward the bottom of the IPCC range.  And the rate of rise is much closer to the IPCC multi-model average of  +0.2541ºC per decade.

These analyses indicate that ENSO, aerosols, and solar output, combined with cooling biases in the surface temperature data, are major reasons for the mismatch since 1998 between surface temperature data and IPCC models.  However, they also indicate that those reasons are not the entire answer.  The final piece of the puzzle comes from a recent paper by Kosaka and Xie (2013).  They re-ran the IPCC AR5 models and found that the model output matched the slow-down in surface temperatures since 1998 when the models were given the actual values of ENSO, aerosols, solar output, and greenhouse gas concentrations.  James Hansen's concern about overestimated increases in greenhouse gas concentrations from 2000 may still hold true today.

Looking at the full puzzle, then, the apparent mismatch between IPCC models and surface temperatures since 1998 comes down to 1) the models' inability to predict random changes in ENSO, aerosols, and solar output, 2) artificial cooling in the surface temperature data due to poor coverage and/or changes in how surface data is collected, and 3) overestimates of greenhouse gas concentrations.  Does this mean that we shouldn't be worried about global warming because the models are wrong?  No.  All this hullabaloo over the climate models is really just arguing over how fast warming will occur over the next 100 years.

Even our current climate sensitivity value (0.8ºC/W/m2) is for a 100-year window.  However, warming won't magically stop in AD 2100.  The effects of the warming that we created will be felt for centuries, with long-term feedbacks boosting climate sensitivity to ~1.6ºC/W/m2.  For instance, the last time CO2 levels were as high as today's was the mid-Miocene, when global temperatures were ~3ºC warmer than today and sea levels 25 meters higher (Tripati et al. 2009).  That is where we're headed with global warming, and we'll likely warm even further, as that is the prediction for today's greenhouse gas levels and we're adding over 2 ppmv each year.

The consequences of that much warming will be devastating for agriculture and for the ability of humans to live in many regions of the planet.  Sea levels will continue to rise for at least another 2,000 years (Levermann et al. 2013), making many coastal areas uninhabitable.  One of the few studies to even consider warming past AD 2100 found that most of the planet could become uninhabitable due to heat stress by AD 2300 (Sherwood and Huber 2010), particularly on a business-as-usual emissions path.
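The back-of-envelope arithmetic behind those sensitivity figures can be sketched using the standard simplified CO2 forcing expression, ΔF = 5.35·ln(C/C0).  The CO2 concentrations are approximate round numbers for 2013:

```python
import math

C0 = 280.0            # pre-industrial CO2, ppmv
C = 400.0             # roughly today's level, ppmv
forcing = 5.35 * math.log(C / C0)          # radiative forcing, W/m2

fast_sensitivity = 0.8                     # ºC per W/m2 (century scale)
long_sensitivity = 1.6                     # ºC per W/m2 (with slow feedbacks)

print(f"forcing: {forcing:.2f} W/m2")
print(f"century-scale warming: {fast_sensitivity * forcing:.1f} ºC")
print(f"long-term warming:     {long_sensitivity * forcing:.1f} ºC")
```

Note how the long-term number lands right around the ~3ºC mid-Miocene figure cited above: the same CO2 level, given centuries for the slow feedbacks to play out, commits us to roughly Miocene-like temperatures.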

Sorry to get all gloom-and-doom, but the reality is that if we don't get our act together and soon, we have basically doomed future generations to living on a planet that will be very hostile to human life.  And I care too much about my children and their future to allow that to happen without raising a ruckus about it.