Monday, December 30, 2013

This idea might get some US politicians to do their jobs

This sounds like a good idea.  Who knows?  Hitting science-denying politicians where it hurts most (their public image) might do wonders for their ability to comprehend the science.


You can find out more about this plan (plus plenty of other information) at http://climatenamechange.org.

Thursday, December 19, 2013

A pause in global warming? What pause?!?

Much has been made of the pause in global surface temperatures since 1998.  Among the factors advanced to explain that pause are a change in ENSO, a decline in solar output, and an increase in aerosols (e.g., Foster and Rahmstorf 2011).  One previously neglected factor is artificial cooling in the global surface temperature data sets themselves.  Neither the HadCRUT4 nor the NOAA temperature data cover the polar regions.  As the polar regions are the fastest-warming areas on Earth (UAH Arctic trend since November 1978: +0.44ºC per decade, global trend: +0.14ºC per decade), excluding those regions introduces an artificial cooling bias in the data.

Anomaly map of HadCRUT4, showing the gaps in data coverage (in white)
Both NOAA and GISS are artificially cooled by an outdated ocean temperature data set (HadSST2) that doesn't account for the change from ship-based to buoy-based temperature measurements that occurred within the last 10 years (Kennedy et al. 2011).  Ship-based measurements read warmer than buoy-based measurements, so the switch from one to the other creates an artificial cooling trend unless the data are adjusted to account for it.  While there's no sign as yet that GISS and NOAA will update their data to use the new HadSST3, which compensates for the change in how ocean temperatures are measured, a recent paper by Cowtan and Way (2013) proposes a method to correct the coverage gaps in HadCRUT4 data.

Cowtan and Way (2013) converted UAH satellite temperature data, which measures air temperature 1,000 meters above the Earth's surface, to surface temperature data (measured 2 meters above the surface) to fill in the blanks in the HadCRUT4 coverage map.  They took the difference between UAH data and existing surface temperature stations and interpolated those differences across the gaps in surface coverage via a method called "kriging."  Kriging is a well-established geostatistical technique and is used by NASA GISS as well as the Berkeley Earth team for their global temperature data sets, although those teams interpolate directly from existing surface stations to cover the gaps.  Cowtan and Way's method should be more precise, as it uses satellite temperature data that actually covers the gaps to calculate the surface temperature within them.
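For readers curious what kriging actually looks like, below is a minimal sketch in R using the gstat package on simulated data.  This is an illustration of the general technique, not Cowtan and Way's actual code: the station locations, the differences, and the exponential variogram model are all invented for the example, and longitude/latitude are treated as planar coordinates for simplicity.

library(sp)
library(gstat)
set.seed(42)
# Simulated "satellite minus surface" differences at stations with coverage
obs <- data.frame(lon = runif(150, -180, 180), lat = runif(150, -60, 85),
                  diff = rnorm(150, 0, 0.3))
coordinates(obs) <- ~lon + lat
# Grid cells with no surface stations (e.g., the Arctic gap)
gaps <- expand.grid(lon = seq(-180, 175, 5), lat = seq(70, 90, 5))
coordinates(gaps) <- ~lon + lat
# Fit a variogram to the observed differences, then krige onto the gap cells
vg <- variogram(diff ~ 1, obs)
vfit <- fit.variogram(vg, vgm("Exp"))
kriged <- krige(diff ~ 1, obs, newdata = gaps, model = vfit)
# Reconstructed surface anomaly in a gap = satellite anomaly - kriged difference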

Covering the gaps in HadCRUT4 data has a large impact on temperature trends, especially the trends since 1998.  The trend in HadCRUT4 data since 1998 is +0.01976 ± 0.05555ºC per decade (trend ± 1σ standard error).  The trend since 1998 in the coverage-corrected version (I'll call it the "Cowtan-Way" data set) is +0.10508 ± 0.05458ºC per decade, 5.3x faster.  Once changes in ENSO, aerosols, and solar output are factored out of the Cowtan-Way data, the rate of rise since 1998 increases to +0.1880 ± 0.02765ºC per decade, 9.5x faster than the trend in HadCRUT4.


What does this mean for the "pause"?  Quite simply, there really is no pause.  The apparent "pause" is really just an artifact—the artificial effect of poor coverage of polar regions, combined with cooling from a shift in ENSO, an increase in aerosols, and a decrease in solar output.  While the rate of increase has been slower since 1998, an increase of +0.10508 ± 0.05458ºC per decade is still an increase, not a pause.  Once the data is adjusted for ENSO, aerosols, and solar output, the rate of rise due to greenhouse gases for the Jan. 1998-Dec. 2012 period is +0.1880 ± 0.02765ºC per decade—slightly faster than the overall rate of rise due to greenhouse gases since 1979 (+0.1841 ± 0.010282ºC per decade).  The other lesson here is the outsized effect small changes can have on short-term trends, as small changes in temperature can produce very different trends.  That effect is the consequence (and danger) of using short time periods.  Trends become far more stable as the length of the time period increases.

The next time someone talks about a "pause" in global temperatures, the proper response is really "What pause?"  Global warming hasn't paused.  If anything, the rate of warming due to greenhouse gas concentrations has increased slightly since 1998.

Wednesday, December 18, 2013

It's cold in my backyard! Does that mean there's no global warming?

Since October, deniers in the US have been pointing to the colder-than-normal weather in the US to claim that global warming has stopped.  In a twist that should blow their minds, NOAA/NCDC and GISS have both announced that November 2013 was the hottest November globally in recorded history.


How can global temperatures set record highs if the US is so cold?  A temperature anomaly map from GISS answers that question nicely:

Map available from NASA GISS
Notice anything?  Yep.  Canada and the USA (particularly the Great Lakes region and Northeastern US), along with the Antarctic Peninsula, are pretty much the only parts of the entire planet that were below average in November.  The rest of the Earth was average to well above average (check out Siberia!).  That's why November 2013 was the hottest November for the entire planet since at least 1880.

The same global anomaly pattern appears in satellite data as well.  Lower tropospheric UAH data, which measures air temperatures 1,000 meters above the surface, shows the following for November:


Again, eastern North America is colder than average while Eastern Europe/Western Asia, Siberia, and East Antarctica are far warmer than average.   Overlaying UAH data with GISS surface data shows the same pattern of temperature rise since 1978:


This shows the importance of using global data to monitor global warming.  The weather in your backyard, country, or even region may be below normal—but that doesn't mean the entire planet is also below normal.

IPCC models versus actual temperatures

One of the dominant memes among climate deniers is that climate models are inaccurate.  While the models have indeed overestimated recent warming, particularly since 1998 (see Fyfe et al. 2013), that fact doesn't mean that global warming isn't happening or that global warming is due to a natural cycle rather than CO2, as many deniers claim.  For those leaps of logic to be true, the entire field of radiative physics, 152 years of experiments, and 40+ years of satellite observations would all have to be wrong.  Nor does it mean that climate isn't as sensitive to changes in radiative forcing as multiple studies have shown it to be (e.g., Paleosens 2013).  What it means is far more complex.

To illustrate this complexity, I compared IPCC AR5 climate models with surface temperatures (GISS).  The AR5 models were run with four scenarios, labeled RCP 2.6, RCP 4.5, RCP 6.0, and RCP 8.5.  Data for each scenario, along with global temperature data, AMO, PDO, etc., are available at Climate Explorer.  The RCP scenarios span the range of plausible emissions pathways, from a <2ºC rise by AD 2100 (RCP 2.6, assuming strong mitigation) up to a ~4ºC rise by AD 2100 (RCP 8.5, business as usual).  One note: All trends reported in this article compensate for autocorrelation and are given as trend ± 1σ standard error.
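As an aside, here is a minimal sketch in R of how such an autocorrelation-corrected trend can be computed, using simulated data (the trend, noise level, and AR coefficient below are made up for illustration):

library(nlme)
set.seed(1)
time <- seq(1979, 2013, by = 1/12)
temp <- 0.015 * (time - 1979) +
        as.numeric(arima.sim(list(ar = 0.6), n = length(time), sd = 0.1))
# Ordinary lm() would understate the standard error of the slope here;
# gls() with an ARMA error structure corrects for the autocorrelation
mod <- gls(temp ~ time, correlation = corARMA(p = 1, q = 1))
summary(mod)$tTable  # slope (ºC per year) and its autocorrelation-corrected standard error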

Comparing the RCP 2.6 model results with GISS temperature data (temperature anomalies relative to the 1981-2010 average) shows that GISS temperatures have trended toward the bottom of the expected range over the past 10 years after being well within the expected range before.


 This result holds true for the RCP 8.5 results as well.


In each case, actual global surface temperatures have trended toward the bottom of the predicted range—but still within the predicted range.  However, there is quite a mismatch: the multi-model average predicts a +0.2541 ± 0.02196ºC per decade rise in global surface temperature (combined average of all 65 RCP 2.6 models) between January 1998 and December 2012, whereas GISS surface temperatures show a +0.06192 ± 0.04637ºC per decade rise (January 1998 to October 2013).  So what caused that mismatch between the multi-model average and actual surface temperatures?  To understand that, we must first look at how models are constructed, tested, and then used.

The models were tested for their ability to reproduce twentieth-century climate given actual inputs such as solar output, ENSO, aerosols, and greenhouse gas levels (see the excellent overview by the National Academy of Sciences).  Those that successfully replicate past climate are then used to predict conditions after AD 2000, predicting not only global temperature but the inputs (greenhouse gas emissions, ENSO, solar variation, aerosol concentrations, etc.) as well.  If the models are overpredicting global temperatures, there are several sources of error to examine.

First, climate models treat changes in inputs such as aerosols, solar output, and ENSO as random events.  Random events cancel out when calculating a combined average of all models, as I did to get the average rate of rise.  So to fairly compare surface temperatures to the average of a series of climate models, you must first factor out those random events.  Factoring out changes in ENSO, solar output, and aerosols (as per Foster and Rahmstorf 2011) shows that the surface temperature rise from January 1998 to March 2013 was +0.1498 ± 0.03151ºC per decade, still quite a bit lower than the IPCC RCP 2.6 average but far higher than the rate without those random factors removed.  The adjusted GISS data still trend toward the bottom of the expected range of the models, albeit not as low as the unadjusted data, indicating that while ENSO, solar output, and aerosol concentrations are part of the reason for the mismatch between modeled and actual temperature rise, they're not the only reasons.
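For the statistically inclined, the core of that adjustment can be sketched in a couple of lines of R.  This assumes the Climate data frame used later in this blog (with Time, GISS, MEI, TSI, and AOD columns); the published method also fits optimal lags for each factor, which are omitted here for brevity:

# Regress temperature on time plus ENSO (MEI), solar (TSI), and aerosol (AOD)
# covariates; the Time coefficient is then the variability-adjusted trend
fr <- lm(GISS ~ Time + MEI + TSI + AOD, data = Climate)
coef(summary(fr))["Time", ]  # adjusted trend (ºC per year) and its standard error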



Second, expected changes in greenhouse gas emissions could be too high.  This has been an issue for some time.  In 2000, Dr. James Hansen noted that the changes in greenhouse gas concentrations predicted by IPCC models ran too high; his example was that the IPCC's methane concentration value was off by a factor of 3.

There is a third possibility that has been largely ignored: all of our surface temperature data sets underestimate the true surface temperature of the Earth.  Two of the data sets (HadCRUT4 and NCDC) do not cover the polar regions or much of Africa.  With the Arctic the fastest-warming region on Earth according to UAH satellite data, that omission creates a cooling bias in those data sets.

HadCRUT4 coverage map.  The areas in white are not included in the data.
GISS uses kriging to fill in the blanks on the map and therefore covers the entire globe.  However, GISS still uses an outdated ocean temperature data set (HadSST2) in its global data set.  Over the past decade, the way ocean surface temperature is measured has changed from ship-based measurements to buoy-based measurements.  Ship-based measurements are known to read warmer than buoy-based measurements (Kennedy et al. 2011).  HadSST2 doesn't account for that difference, creating an artificial cooling trend in ocean data over the past 10 years, which in turn creates an artificial cooling influence in GISS global data.  HadCRUT4 uses the updated HadSST3 data set, which does account for that difference.
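A toy simulation shows how an uncorrected ship-to-buoy transition biases a trend.  The 0.12ºC offset and the phase-in schedule below are assumptions for illustration only, not the values from Kennedy et al. (2011):

set.seed(7)
years <- 1995:2013
true_anom <- 0.015 * (years - 1995)          # true warming of 0.015ºC per year
buoy_frac <- pmin(1, 0.05 * (years - 1995))  # buoys gradually replace warmer-reading ships
observed <- true_anom - 0.12 * buoy_frac + rnorm(length(years), 0, 0.02)
coef(lm(observed ~ years))["years"]          # recovered trend is biased low vs the true 0.015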

To correct the coverage problem in HadCRUT4, Cowtan and Way (2013) recently used UAH satellite data to fill in the coverage gaps in HadCRUT4 data.  Their method was to use kriging on the difference between surface and satellite data.  Their new data set matches surface data where surface data exists.  Where it does not, they used the interpolated differences to convert existing satellite data to surface data.

Filling in the gaps has a dramatic effect on the short-term trend.  Before, the January 1998-December 2012 trend in HadCRUT4 data was +0.01976 ± 0.05555ºC per decade.  After, the trend for the same time period is +0.10508 ± 0.05458ºC per decade.

From http://www-users.york.ac.uk/~kdc3/papers/coverage2013/background.html
Comparing the coverage-corrected HadCRUT4 to the IPCC models shows that while coverage made a noticeable difference, even the corrected HadCRUT4 trends toward the bottom of the IPCC range.


Adjusting the coverage-corrected HadCRUT4 to account for ENSO, aerosols, and solar output makes an even larger difference in the trend since 1998: +0.1880 ± 0.02765ºC per decade.  That places surface temperatures well within the range of the IPCC models, with only temperatures in 2011 and 2012 trending toward the bottom of the IPCC range.  And the rate of rise is much closer to the IPCC multi-model average of  +0.2541ºC per decade.


These analyses indicate that ENSO, aerosols, and solar output, combined with cooling biases in the surface temperature data, are major reasons for the mismatch since 1998 between surface temperature data and IPCC models.  However, they also indicate that those factors are not the entire answer.  The final piece of the puzzle comes from a recent paper by Kosaka and Xie (2013).  They re-ran the IPCC AR5 models and found that the output matched the slowdown in surface temperatures since 1998 when the models were given the actual values of ENSO, aerosols, solar output, and greenhouse gas concentrations.  James Hansen's concern about overestimated increases in greenhouse gas concentrations from 2000 may still hold true today.

Looking at the full puzzle, then, the apparent mismatch between IPCC models and surface temperatures since 1998 comes down to 1) the models' inability to predict random changes in ENSO, aerosols, and solar output, 2) artificial cooling in the surface temperature data due to poor coverage and/or changes in how surface data is collected, and 3) overestimates of greenhouse gas concentrations.  Does this mean that we shouldn't be worried about global warming because the models are wrong?  No.  All this hullabaloo over the climate models is really just arguing over how fast warming will occur over the next 100 years.  Even our current climate sensitivity value (0.8ºC per W/m2) is for a 100-year window.  However, warming won't magically stop in AD 2100.  The effects of the warming that we created will be felt for centuries, with long-term feedbacks boosting climate sensitivity to ~1.6ºC per W/m2.  For instance, the last time CO2 levels were as high as today's was the mid-Miocene, when global temperatures were ~3ºC warmer than today and sea levels 25 meters higher (Tripati et al. 2009).  That is where we're headed with global warming—and we'll likely warm even further, as that is the prediction for today's greenhouse gas levels and we're adding over 2 ppmv each year.  The consequences of that much warming will be devastating for agriculture and for the ability of humans to live in many regions of the planet.  Sea levels will continue to rise for at least another 2,000 years (Levermann et al. 2013), making many coastal areas uninhabitable.  One of the few studies to even consider warming past AD 2100 found that most of the planet could become uninhabitable due to heat stress by AD 2300 (Sherwood and Huber 2010), particularly on a business-as-usual emissions path.

Sorry to get all gloom-and-doom, but the reality is that if we don't get our act together and soon, we have basically doomed future generations to living on a planet that will be very hostile to human life.  And I care too much about my children and their future to allow that to happen without raising a ruckus about it.

Sunday, October 27, 2013

Enough hockey sticks for a team

One of the persistent denier myths is that the Hockey Stick (usually meaning Mann, M. E., R. S. Bradley, and M. K. Hughes. 1999. Northern Hemisphere Temperatures During the Past Millennium: Inferences, Uncertainties, and Limitations. Geophysical Research Letters 26:759-762) has been discredited.  Not only is that myth false, but Mann et al. (1999) has been validated by the publication of numerous hockey stick graphs since 1999.  Here is a brief list of the ones I know:

Crowley, T. J. 2000. Causes of Climate Change Over the Past 1000 Years. Science 289:270-277: Used both his own and Mann et al. (1999)'s hockey sticks to examine the cause of temperature changes over the past 1,000 years.  Found that natural forcings could not explain twentieth century warming without the effect of greenhouse gases.

Huang, S., H. N. Pollack, and P. Shen. 2000. Temperature Trends Over the Past Five Centuries Reconstructed from Borehole Temperatures. Nature 403:756-758: Reconstructed global average temperatures since AD 1500 using temperature data from 616 boreholes from around the globe.

Bertrand, C., M. Loutre, M. Crucifix, and A. Berger. 2002. Climate of the Last Millennium: A Sensitivity Study. Tellus 54A:221-244: Reconstructed solar output, volcanic activity, land use changes, and greenhouse gas concentrations since AD 1000, then computed the expected temperature changes due to those forcings.  Compared the computed temperature changes with two independent temperature reconstructions.

Esper, J., E. R. Cook, and F. H. Schweingruber. 2002. Low-frequency Signals in Long Tree-ring Chronologies for Reconstructing Past Temperature Variability. Science 295:2250-2253: Reconstructed Northern Hemisphere temperatures between AD 800 and AD 2000 using tree ring chronologies.

Cronin, T. M., G. S. Dwyer, T. Kamiya, S. Schwede, and D. A. Willard. 2003. Medieval Warm Period, Little Ice Age and 20th Century Temperature Variability from Chesapeake Bay. Global and Planetary Change 36: 17-29: Reconstructed temperatures between 200 BC and AD 2000 around Chesapeake Bay, USA, using sediment core records.

Pollack, H. N. and J. E. Smerdon. 2004. Borehole Climate Reconstructions: Spatial Structure and Hemispheric Averages. Journal of Geophysical Research 109:D11106: Reconstructed global average temperatures since AD 1500 using temperature data from 695 boreholes from around the globe.

Esper, J., R. J. S. Wilson, D. C. Frank, A. Moberg, H. Wanner, and J. Luterbacher. 2005. Climate: Past Ranges and Future Changes. Quaternary Science Reviews 24:2164-2166: Compared and averaged five independent reconstructions of Northern Hemisphere temperatures from AD 1000 to AD 2000.

Moberg, A., D. M. Sonechkin, K. Holmgren, N. M. Datsenko, and W. Karlen. 2005. Highly Variable Northern Hemisphere Temperatures Reconstructed from Low- and High-resolution Proxy Data. Nature 433:613-617: Combined tree ring proxies with glacial ice cores, stalagmite, and lake sediment proxies to reconstruct Northern Hemisphere temperatures from AD 1 to AD 2000.

Oerlemans, J. 2005. Extracting a Climate Signal from 169 Glacier Records. Science 308:675-677: Reconstructed global temperatures from AD 1500 to AD 2000 using 169 glacial ice proxies from around the globe.

Rutherford, S., M. E. Mann, T. J. Osborn, R. S. Bradley, K. R. Briffa, M. K. Hughes, and P. D. Jones. 2005. Proxy-based Northern Hemisphere Surface Temperature Reconstructions: Sensitivity to Method, Predictor Network, Target Season, and Target Domain. Journal of Climate 18:2308-2329: Compared two multi-proxy temperature reconstructions and tested the results of each reconstruction for sensitivity to type of statistics used, proxy characteristics, seasonal variation, and geographic location.  Concluded that the reconstructions were robust to various sources of error.

D'Arrigo, R., R. Wilson, and G. Jacoby. 2006. On the Long-term Context for Late Twentieth Century Warming. Journal of Geophysical Research 111:D03103: Reconstructed Northern Hemisphere temperatures between AD 700 and AD 2000 from multiple tree ring proxies using a new statistical technique called Regional Curve Standardization.  Concluded that their new technique was superior to the older technique used by previous reconstructions.

Osborn, T. J. and K. R. Briffa. 2006. The Spatial Extent of 20th-century Warmth in the Context of the Past 1200 Years. Science 311:841-844: Used 14 regional temperature reconstructions between AD 800 and AD 2000 to compare the spatial extent of changes in Northern Hemisphere temperatures.  Found that twentieth century warming was more widespread than any other temperature change of the past 1,200 years.

Hegerl, G. C., T. J. Crowley, M. Allen, W. T. Hyde, H. N. Pollack, J. Smerdon, and E. Zorita. 2007. Detection of Human Influence on a New, Validated 1500-year Temperature Reconstruction. Journal of Climate 20:650-666: Combined borehole temperatures and tree ring proxies to reconstruct Northern Hemisphere temperatures over the past 1,450 years.  Introduced a new calibration technique between proxy temperatures and instrumental temperatures.

Juckes, M. N., M. R. Allen, K. R. Briffa, J. Esper, G. C. Hegerl, A. Moberg, T. J. Osborn, and S. L. Weber. 2007. Millennial Temperature Reconstruction Intercomparison and Evaluation. Climate of the Past 3:591-609: Combined multiple older reconstructions into a meta-analysis.  Also used existing proxies to calculate a new Northern Hemisphere temperature reconstruction.

Wahl, E. R. and C. M. Ammann. 2007. Robustness of the Mann, Bradley, Hughes Reconstruction of Northern Hemisphere Surface Temperatures: Examination of Criticisms Based on the Nature and Processing of Proxy Climate Evidence. Climatic Change 85:33-69: Used the tree ring proxies, glacial proxies, and borehole proxies used by Mann et al. (1998, 1999) to recalculate Northern Hemisphere temperatures since AD 800.  Refuted the McIntyre and McKitrick criticisms and showed that those criticisms were based on flawed statistical techniques.

Wilson, R., R. D'Arrigo, B. Buckley, U. Büntgen, J. Esper, D. Frank, B. Luckman, S. Payette, R. Vose, and D. Youngblut. 2007. A Matter of Divergence: Tracking Recent Warming at Hemispheric Scales Using Tree Ring Data. Journal of Geophysical Research 112:D17103: Reconstructed Northern Hemisphere temperatures from AD 1750 to AD 2000 using tree ring proxies that did not show a divergence problem after AD 1960.

Mann, M. E., Z. Zhang, M. K. Hughes, R. S. Bradley, S. K. Miller, S. Rutherford, and F. Ni. 2008. Proxy-based Reconstructions of Hemispheric and Global Surface Temperature Variations over the Past Two Millennia. Proceedings of the National Academy of Sciences 105:13252-13257: Reconstructed global temperatures between AD 200 and AD 2000 using 1,209 independent proxies ranging from tree rings to boreholes to sediment cores to stalagmite cores to Greenland and Antarctic ice cores.

Kaufman, D. S., D. P. Schneider, N. P. McKay, C. M. Ammann, R. S. Bradley, K. R. Briffa, G. H. Miller, B. L. Otto-Bliesner, J. T. Overpeck, B. M. Vinther, and Arctic Lakes 2k Project Members. 2009. Recent Warming Reverses Long-term Arctic Cooling. Science 325:1236-1239: Used tree rings, lake sediment cores, and glacial ice cores to reconstruct Arctic temperatures between 1 BC and 2000 AD.

von Storch, H., E. Zorita, and F. González-Rouco. 2009. Assessment of Three Temperature Reconstruction Methods in the Virtual Reality of a Climate Simulation. International Journal of Earth Science 98:67-82: Tested three different temperature reconstruction techniques to show that the Composite plus Scaling method was better than the other two methods.

Frank, D., J. Esper, E. Zorita, and R. Wilson. 2010. A Noodle, Hockey Stick, and Spaghetti Plate: A Perspective on High-resolution Paleoclimatology. Wiley Interdisciplinary Reviews: Climate Change 1:507-516: A brief history of proxy temperature reconstructions, as well as analysis of the main questions remaining in temperature reconstructions.

Kellerhals, T., S. Brütsch, M. Sigl, S. Knüsel, H. W. Gäggeler, and M. Schwikowski. 2010. Ammonium Concentration in Ice Cores: A New Proxy for Regional Reconstruction? Journal of Geophysical Research 115:D16123: Used ammonium concentration in a glacial ice core to reconstruct tropical South American temperatures over the past 1,600 years.

Ljungqvist, F. C. 2010. A New Reconstruction of Temperature Variability in the Extra-tropical Northern Hemisphere During the Last Two Millennia. Geografiska Annaler: Series A Physical Geography 92:339-351: Reconstructed extra-tropical Northern Hemisphere temperatures from AD 1 to AD 2000 using historical records, sediment cores, tree rings, and stalagmites.

Thibodeau, B., A. de Vernal, C. Hillaire-Marcel, and A. Mucci. 2010. Twentieth Century Warming in Deep Waters of the Gulf of St. Lawrence: A Unique Feature of the Last Millennium. Geophysical Research Letters 37:L17604: Reconstructed temperatures at the bottom of the Gulf of St. Lawrence since AD 1000 via sediment cores.

Tingley, M. P. and P. Huybers. 2010. A Bayesian Algorithm for Reconstructing Climate Anomalies in Space and Time. Part I: Development and Application to Paleoclimate Reconstruction Problems. Journal of Climate 23:2759-2781.

Tingley, M. P. and P. Huybers. 2010. A Bayesian Algorithm for Reconstructing Climate Anomalies in Space and Time. Part II: Comparison with the Regularized Expectation Maximum Algorithm. Journal of Climate 23:2782-2800.  Both Tingley and Huybers papers revolved around the same reconstruction, in which they derived and used a Bayesian approach to reconstruct North American temperatures.

Büntgen, U., W. Tegel, K. Nicolussi, M. McCormick, D. Frank, V. Trouet, J. O. Kaplan, F. Herzig, K. Heussner, H. Wanner, J. Luterbacher, and J. Esper. 2011. 2500 Years of European Climate Variability and Human Susceptibility. Science 331:578-582:  Used tree ring proxies to reconstruct Central European temperatures between 500 BC and AD 2000.

Kemp, A. C., B. P. Horton, J. P. Donnelly, M. E. Mann, M. Vermeer, and S. Rahmstorf. 2011. Climate Related Sea-level Variations Over the Past Two Millennia. Proceedings of the National Academy of Sciences 108:11017-11022: Reconstructed sea levels off North Carolina, USA, from 100 BC to AD 2000 using sediment cores.  They also showed that sea levels changed with global temperature for at least the past millennium.

Kinnard, C., C. M. Zdanowicz, D. A. Fisher, E. Isaksson, A. de Vernal, and L. G. Thompson. 2011. Reconstructed Changes in Arctic Sea Ice Over the Past 1,450 Years. Nature 479:509-512: Used multiple proxies to reconstruct late summer Arctic sea ice between AD 561 and AD 1995, using instrumental data to extend their record to AD 2000.

Martín-Chivelet, J., M. B. Muñoz-García, R. L. Edwards, M. J. Turrero, and A. L. Ortega. 2011. Land Surface Temperature Changes in Northern Iberia Since 4000 yr BP, Based on δ13C of Speleothems. Global and Planetary Change 77:1-12: Reconstructed temperatures in the Iberian Peninsula from 2000 BC to AD 2000 using stalagmites.

Spielhagen, R. F., K. Werner, S. A. Sørensen, K. Zamelczyk, E. Kandiano, G. Budeus, K. Husum, T. M. Marchitto, and M. Hald. 2011. Enhanced Modern Heat Transfer to the Arctic by Warm Atlantic Water. Science 331:450-453: Reconstructed marine temperatures in the Fram Strait from 100 BC to AD 2000 using sediment cores.

Esper et al. 2012: Used tree ring proxies to reconstruct Northern Scandinavian temperatures 100 BC to AD 2000.  May have solved the post-AD 1960 tree ring divergence problem.

Ljungqvist et al. 2012: Used a network of 120 tree ring proxies, ice core proxies, pollen records, sediment cores, and historical documents to reconstruct Northern Hemisphere temperatures between AD 800 and AD 2000, with emphasis on proxies recording the Medieval Warm Period.

Melvin, T. M., H. Grudd, and K. R. Briffa. 2012. Potential Bias in 'Updating' Tree-ring Chronologies Using Regional Curve Standardisation: Re-processing 1500 Years of Torneträsk Density and Ring-width Data. The Holocene 23:364-373: Reanalyzed tree ring data for the Torneträsk region of northern Sweden.

Abram, N. J., R. Mulvaney, E. W. Wolff, J. Triest, S. Kipfstuhl, L. D. Trusel, F. Vimeux, L. Fleet, and C. Arrowsmith. 2013. Acceleration of Snow Melt in an Antarctic Peninsula Ice Core During the Twentieth Century. Nature Geoscience 6:404-411: Reconstructed snow melt records and temperatures in the Antarctic Peninsula since AD 1000 using ice core records.

Marcott, S. A., J. D. Shakun, P. U. Clark, and A. C. Mix. 2013. A Reconstruction of Regional and Global Temperature for the Past 11,300 Years. Science 339:1198-1201: Reconstructed global temperatures over the past 11,000 years using sediment cores.  Data ended at AD 1940.

PAGES 2k Consortium. 2013. Continental-scale Temperature Variability During the Past Two Millennia. Nature Geoscience 6:339-346: Used multiple proxies (tree rings, sediment cores, ice cores, stalagmites, pollen, etc) to reconstruct regional and global temperatures since AD 1.

Rohde, R., R. A. Muller, R. Jacobsen, E. Muller, S. Perlmutter, A. Rosenfeld, J. Wurtele, D. Groom, and C. Wickham. 2013. A New Estimate of the Average Earth Surface Land Temperature Spanning 1753 to 2011. Geoinformatics and Geostatistics: An Overview 1:1-7: Used proxy and instrumental records to reconstruct global land temperatures from AD 1753 to AD 2011.

Wilson, R., K. Anchukaitis, K. R. Briffa, U. Büntgen, E. Cook, R. D'Arrigo, N. Davi, J. Esper, D. Frank, B. Gunnarson, G. Hegerl, S. Helama, S. Klesse, P. J. Krusic, H. W. Linderholm, V. Myglan, T. J. Osborn, M. Rydval, L. Schneider, A. Schurer, G. Wiles, P. Zhang, and E. Zorita. 2016. Last Millennium Northern Hemisphere Summer Temperatures from Tree Rings: Part I: The Long Term Context. Quaternary Science Reviews 134:1-18: Introduces and details the new N-TREND2015 temperature reconstruction using 54 proxy records.

The proper response to someone who asserts that the Hockey Stick has been falsified is to ask "Which one?"  As for what most of the temperature reconstructions show, the data from Marcott et al. (2013) combined with 30-year smoothed HadCRUT4 data is fairly representative:


Update:  I've posted two lengthy responses rebutting "Anonymous" in the comments.  Quite frankly, none of "his" numerous claims stand up to scrutiny.  Part 1, Part 2.

Sunday, October 20, 2013

How to spot outliers in regression analysis

Much of the debate over the possible pause in surface temperatures since 1998 really hinges on 1998 being an outlier.  And not only an outlier but an influential data point, which means that its very presence changes the overall regression trend.  In this post, I'll show how to identify outliers, high-leverage data points, and influential data points.

First, some basic definitions.  An outlier is any data point that falls outside the normal range for that data set, usually defined as more than 2 standard deviations from the average.  In regression analyses, an outlier is any data point whose residual falls outside the normal range.  High-leverage data points lie at extreme values of the independent variables, where few other data points are found, regardless of whether they change the overall trend.  An influential data point is an outlier with high leverage that alters the overall trend.

Now for the analysis, starting with the basics.  First, create the regression model, using the subset argument to limit the time period.

Model=lm(UAH~Time, data=Climate, subset=Time>=1998)
summary(Model)
Call:
lm(formula = UAH ~ Time, data = Climate, subset = Time >=
    1998)

Residuals:
     Min       1Q   Median       3Q      Max
-0.47575 -0.11244  0.01165  0.09604  0.53415

Coefficients:
              Estimate Std. Error t value Pr(>|t|)  
(Intercept) -11.176493   5.631428  -1.985   0.0487 *
Time          0.005656   0.002808   2.015   0.0454 *
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.1741 on 186 degrees of freedom
Multiple R-squared: 0.02135,    Adjusted R-squared: 0.01609
F-statistic: 4.059 on 1 and 186 DF,  p-value: 0.04539
par(mfrow=c(2,2))  # Create a 2 x 2 plot matrix
plot(Model)  #Default diagnostic plots 



The default diagnostic plots reveal a good fit except for several months during 1998, which is especially obvious in the Residuals vs Fitted plot.  Digging deeper requires more tools than the defaults offer, which the car (Companion to Applied Regression) package provides.
install.packages("car")
library(car)
influencePlot(Model, id.method="identify", main="Influence Plot", sub="Circle size is proportional to Cook's Distance")


Standardized residuals above +2 or below -2 are outliers.  Points with hat-values above 0.025 and standardized residuals between -2 and +2 are high-leverage points.  Data points with hat-values above 0.025 and standardized residuals outside ±2 are influential points that significantly alter the overall trend.  According to this analysis, there are multiple outliers, with two influential points and one borderline.  In R, the code I gave will allow you to identify the points directly by clicking on them.  The two influential points are rows 2989 and 2990, which can then be pulled out of the main data frame.
Climate[2989,]
     Year Month Time GISS  UAH HadCRUT4   NCDC HadSST3  RSS  PDO  AMO
2989 1998     1 1998  0.6 0.47    0.488 0.5967   0.419 0.55 0.83 0.15
       MEI Sunspots      TSI   AOD Arctic.Ice Antarctic.Ice    CO2
2989 2.483     31.9 1365.913 0.004      14.67          4.46 365.18
Climate[2990,]
     Year Month    Time GISS  UAH HadCRUT4   NCDC HadSST3   RSS  PDO   AMO
2990 1998     2 1998.08 0.86 0.65    0.754 0.8501   0.478 0.736 1.56 0.311
       MEI Sunspots      TSI    AOD Arctic.Ice Antarctic.Ice    CO2
2990 2.777     40.3 1365.808 0.0037       15.7          2.99 365.98
The borderline point is row 2992.
Climate[2992,]
     Year Month    Time GISS  UAH HadCRUT4   NCDC HadSST3   RSS  PDO   AMO
2992 1998     4 1998.25 0.62 0.66    0.621 0.7371   0.489 0.857 1.27 0.315
       MEI Sunspots      TSI    AOD Arctic.Ice Antarctic.Ice    CO2
2992 2.673     53.4 1365.928 0.0031      14.84          6.85 368.61
Now that those points are identified, we can determine how much they influence the trend by rerunning the analysis while excluding those points.
Climate.ex = Climate[c(-2989, -2990, -2992),]
Model.2 = lm(UAH~Time, data=Climate.ex, subset=Time>=1998)
summary(Model.2)
Call:
lm(formula = UAH ~ Time, data = Climate.ex, subset = Time >= 1998)

Residuals:
     Min       1Q   Median       3Q      Max
-0.45049 -0.11412  0.01253  0.08884  0.46394

Coefficients:
              Estimate Std. Error t value Pr(>|t|)   
(Intercept) -17.174211   5.429297  -3.163  0.00183 **
Time          0.008642   0.002707   3.193  0.00166 **
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.1639 on 183 degrees of freedom
Multiple R-squared: 0.05277,    Adjusted R-squared: 0.0476
F-statistic:  10.2 on 1 and 183 DF,  p-value: 0.001658

Excluding just those three points raises the estimated trend from 0.005656ºC per year to 0.008642ºC per year, showing that those three outliers artificially tilted the trend toward less warming.  That demonstrates the main problem with starting or ending a regression period on extreme outliers.

A sharp observer may ask why I did not use a generalized least squares (gls) analysis from the nlme package to factor out autocorrelation for my examples.  The reason is simply that the tools in the car package are not built to handle gls analyses.  Additionally, autocorrelation does not really matter for identifying outliers, high-leverage points, and influential points.

Friday, October 18, 2013

A primer on the greenhouse effect

Looking back at my first posts, I realized that I had neglected to explain the greenhouse effect.  This post is intended to rectify that omission.

First, a basic principle: All energy that enters or leaves the Earth's atmosphere must be in the form of radiation.  And yes, that includes heat.  Space is essentially a vacuum, so the planet cannot lose heat via conduction or convection.  The general process is as follows.

1) Energy from the sun (including ultraviolet and visible radiation) enters the atmosphere.

2) About 30% of the sun's radiation is reflected back into space by aerosols in the air (produced by volcanoes and coal-fired power plants) or via snow, ice, and other light-colored surfaces.  Of the 70% that reaches the ground, most is visible light as most ultraviolet is absorbed by the ozone layer.

3) When visible light reaches the ground, the energy is absorbed by the surfaces on the ground.  That absorbed energy is reradiated as infrared radiation (also known as heat).  This is how your car heats up on a sunny day, especially in the summer, as the dashboard, steering wheel, and seats all absorb visible light and reradiate that energy as heat.

4) The rate at which the absorbed energy is reradiated is determined by temperature gradients.  The greater the gradient, the faster heat is lost, as we all have experienced.  You lose far more heat when you go outside without a jacket at 10ºC (50ºF) than at 25ºC (77ºF) because the gradient between your skin temperature and the air is far greater at an air temperature of 10ºC than at 25ºC.

5) Infrared is absorbed by triatomic molecules (H2O, CO2, N2O, O3) and larger molecules (CH4) in the atmosphere, which then reradiate that energy in all directions, including back toward the Earth's surface.  Diatomic molecules (N2, O2) are essentially transparent to infrared radiation and so play no role in the greenhouse effect.

6) The infrared that is reradiated back to the surface warms the Earth by decreasing the concentration gradient between the surface of the Earth (including the surface of the oceans) and the air just above the surface.  This slows the rate at which heat is lost from the surface, causing the surface to retain more of the heat and raising the temperature.

7) Eventually, the infrared makes it back out of the atmosphere into space.

While the term "greenhouse effect" is better known, I personally think a better analogy is the "blanket effect," since the greenhouse effect keeps the Earth warm in much the same way a blanket keeps a person warm.  A blanket does not produce any heat on its own but merely absorbs a person's radiated heat and reradiates it in all directions, including back toward the skin.  That decreases the temperature gradient between the skin and the surrounding air, causing less heat to escape from the skin and more to be retained inside the body.

For the Earth to maintain thermodynamic equilibrium, the amount of infrared lost to space must equal the amount of solar energy absorbed.  If there is an imbalance, the Earth will warm if the imbalance is positive or cool if the imbalance is negative.
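To put numbers on that balance, here is a quick back-of-the-envelope calculation in R of the Earth's equilibrium emission temperature from the Stefan-Boltzmann law, using standard textbook values:

S <- 1361        # incoming solar radiation at the top of the atmosphere, W/m2
albedo <- 0.30   # ~30% reflected back to space (step 2 above)
sigma <- 5.67e-8 # Stefan-Boltzmann constant, W/m2/K4
T_eq <- ((S * (1 - albedo)) / (4 * sigma))^0.25
T_eq - 273.15    # about -18ºC; the ~33ºC warmer observed surface is the greenhouse effect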

Detailed diagram of the greenhouse effect from Trenberth et al. 2009

Tuesday, October 8, 2013

On the failure of US scientific education.

Listening to the public discourse in the US, one cannot help but think that basic science education in this country has failed.  Oh, sure, we have good science teachers (and bad), and textbooks filled with knowledge, but as a nation we have utterly failed to grasp the most fundamental lessons of science.  And I think that reflects poorly on scientists and science educators (myself included).

The first lesson we have failed to impart is the scientific facts themselves.  "Fact" in science means data as revealed through experiments and observations.  In essence, we have failed to teach the data.  It's much easier and faster to present the theories as in the textbook with a few supporting facts, especially given the limited time to cover any one topic in most general education science courses.  And for most topics (e.g., the sliding filament theory of muscle contraction, optimal foraging theory, germ theory, the general theory of relativity), that is sufficient.  However, for evolution and climate change, that approach is insufficient.

The reason is simple.  There is a lot of misinformation about the basic facts of both climate change and evolution.  People honestly believe that CO2 is not a greenhouse gas, that adding more CO2 won't affect climate, that volcanoes produce more CO2 than human technology, that humans coexisted with dinosaurs, that all geologic strata were laid down in one calendar year, that evolution cannot happen, and that rates of radiometric decay are variable.  Combating that sort of misinformation requires starting at the basic facts, even if it means reviewing in detail facts discovered centuries earlier, e.g., superposition (1669), faunal succession (1799), the greenhouse effect (1820s), index fossils (1830s), the laws of thermodynamics (1824-1912), CO2 as a greenhouse gas (1861), the Stefan-Boltzmann law (1870s), etc.

The second lesson that, in my opinion, we've failed to impart is that nothing happens by magic.  There is always a physical cause.  I'm most familiar with magical thinking about the current global warming.  One common example is the claim that global warming is due to natural cycles.  What makes that claim "magical" thinking?  First, citing "natural cycles" without specifying exactly which natural cycle is the cause simply means that you don't really have a cause.  Second, there's no evidence that natural cycles are sufficient to cause the current global warming, and there are multiple published papers showing that they aren't (e.g., Meehl et al. 2004).  You cannot just wish that evidence away.

Another example of magical thinking in the global warming "debate" is the claim that global warming is due to water vapor.  Why is this "magical" thinking?  Well, there's the fact that water vapor is controlled by air temperature and therefore cannot control air temperature by itself (remember the Clausius-Clapeyron relation?).  Then there's the fact that if water vapor is causing global warming, you must explain why water vapor suddenly started warming the planet around AD 1900, after a 5,000-year cooling trend.  Just citing water vapor without stating what caused it to suddenly warm the planet is pure magical thinking, as everything must have a physical cause.

As for the evolution "debate", magical thinking abounds, from claims that a 1-year, worldwide flood could magically change the rate of radiometric decay, to the claim that the geologic column is due to a single flood, to claims that information theory disproves evolution.  The Talk Origins website has an extensive catalog debunking various creationist claims.  The claim about radiometric decay is particularly ludicrous in light of the amount of heat produced.  The average rate of heat from radiometric decay that reaches the Earth's surface today is 47 trillion Joules/second (Davies and Davies 2010).  Accelerating that by a factor of 1 billion would mean an average of 47 sextillion Joules/second of heat—more than enough heat to vaporize the oceans and melt the planet.  As for the geologic column–flood claim, there are several rock types scattered throughout the geologic column that are laid down slowly and only in quiet water (e.g., shales) and therefore could not have been formed by a flood.  The information theory claim has been debunked multiple times (e.g., here, here, and here), mainly because neither Shannon information theory nor Kolmogorov-Chaitin theory truly applies to living organisms.
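For anyone who wants to check that heat figure, the arithmetic is a one-liner:

heat_today <- 47e12  # radiogenic heat reaching the surface, J/s (Davies and Davies 2010)
heat_today * 1e9     # 4.7e+22, i.e. 47 sextillion J/s if decay ran a billion times faster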

Last and most glaringly, we've failed to teach critical thinking.  Critical thinking is the ability to ask "Does this {new discovery, data, opinion, etc.} make sense in light of what we already know about this subject?"  What we mostly teach in science class is simply rote memorization—we teach theories and facts but don't teach students how to tie them together.  Are there exceptions to this generalization?  Certainly.  But they are unfortunately the exception rather than the rule.  And it's that lack of critical thought that magnifies both the deficiencies in teaching the basic facts and the magical thinking.

As for how to correct these issues, I suggest a two-pronged approach.  First, rather than rote memorization, I have started presenting facts, then asking students to evaluate those facts based on their prior knowledge, and then to draw conclusions based on the total body of knowledge.  When covering evolution (I haven't reached that section yet), I will spend more time laying out step-by-step the discoveries that led to our current understanding of the geologic column before diving into natural selection and the Hardy-Weinberg theorem.  For global warming at the end of the ecology section, I've already started rewriting my lecture to include more of the basic facts and concepts (e.g., the laws of thermodynamics) and the history of the discovery of the greenhouse effect and the gases that produce it.  Yes, this approach takes more time and effort, but I believe I'll have better-informed students at the end.  Ideally, this process would begin in elementary school rather than the first year of college, but better late than never.

Second, we simply need more scientists to get involved in explaining the basics to the general public and countering the misinformation coming from the climate science denier and creationist camps.  I know most scientists are more comfortable hiding in laboratories and behind computer screens, but it's really the only way we're going to change the course of public debate in the US.

Wednesday, September 18, 2013

Rates of change

One commonly misunderstood aspect of how the current global warming differs from past episodes is the rate of warming.  In this post, I'll show how the rate over the past 30 years stacks up against some of the better-known rates from geologic history.

Past 30 years (1983-2013) rate ± standard error:
UAH: +0.015379 ± 0.003783ºC per year
GISS: +0.015505 ± 0.002491ºC per year
NCDC: +0.014454 ± 0.002489ºC per year
HadCRUT4: +0.014896 ± 0.002824ºC per year

Depending on the data set, the rate over the last 30 years ranges from 0.014454ºC per year up to 0.015505ºC per year.  When I average the four data sets together and then calculate the rate, the result is +0.014692 ± 0.003070ºC per year for the last 30 years.

For the geologic rates, let's start with the most recent and work backwards in time.

Over the 5,000 years since the end of the Holocene Climatic Optimum, the Earth slowly cooled by 0.7ºC (Marcott et al. 2013).  That's an average rate of  -0.00014ºC per year.  The current rate is 104x faster than the rate over the previous 5,000 years.

At the end of the last ice age, a process lasting from 22,000 to 11,000 years BP, global temperatures rose an average of 3.5ºC over 8,000 years (Shakun et al. 2012).  That translates to an average of 0.0004375ºC per year, which means the average rate of change over the last 30 years is 33.5x faster than at the end of the last ice age.

During the Paleocene-Eocene Thermal Maximum (55 million years BP), global temperatures rose an average of 6.5ºC over 19,000 years (Cui et al. 2011).  That warming was triggered by a release of an average of 6.2 billion metric tons of CO2 per year.  In contrast, humans released the equivalent of 34.8 billion metric tons of CO2 (9.5 billion metric tons of carbon) in 2011 alone (Le Quéré et al. 2012).  Not only is the rate of CO2 release greater, the rate of change in temperature over the last 30 years is 43x greater than it was then.

Less well known but a subject of active research is the End-Triassic mass extinction roughly 201 million years ago, a time when 50% of species went extinct due to a massive disruption of the carbon cycle and climate.  While I cannot find an estimate of the temperature change, the amount of carbon released has been estimated at 12 trillion metric tons over a period of 10,000 to 20,000 years (Ruhl et al. 2011), which averages out to between 0.6 billion and 1.2 billion metric tons per year, far lower than the 9.5 billion metric tons of carbon human activities released in 2011 alone.  The cause of that carbon release is disputed, with Ruhl et al. favoring methane clathrates whereas Blackburn et al. (2013) ascribed it to the massive volcanic eruptions triggered as Pangea broke apart.
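To make those comparisons concrete, here is the arithmetic in R using the rates quoted above (small differences from the rounded multiples in the text are due to rounding):

current <- 0.014692                   # ºC per year, 1983-2013 average of the four data sets
past <- c(holocene = 0.7 / 5000,      # Marcott et al. 2013
          deglaciation = 3.5 / 8000,  # Shakun et al. 2012
          petm = 6.5 / 19000)         # Cui et al. 2011
round(current / past, 1)              # ~104.9x, ~33.6x, ~42.9x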

The lesson from examining geologic history is quite clear.  The current rate of temperature change is simply far faster than anything the planet has experienced in the past.  This graph of Holocene temperatures sums up that point nicely:

The only thing standing between us today and the major ecosystem changes, species extinctions, etc. noted in the geologic record is time, as the current rate of change is more than fast enough to trigger such events if it continues.  Add in the fact that the Milankovitch cycles, which have been the primary drivers of the ice ages, are 6,000 years into a 29,000-year cooling phase (Imbrie and Imbrie 1980), and it's not hard to see that the current warming runs counter to the natural cycle and is occurring too fast to be natural.

Sunday, September 15, 2013

Arctic temperature vs sea ice extent

One seemingly persistent myth about the Arctic is that there is no correlation between Arctic air temperature and sea ice extent.  At first glance, that myth appears to be true, as Arctic temperatures have noticeably risen whereas sea ice extent shows little overall change.

The correlation appears even worse when plotting sea ice extent versus temperature directly.


Taking a closer look, however, reveals the reason for the apparently poor correlation: the large seasonal cycle in the sea ice extent data.  That cycle obscures the overall trend in the sea ice data—and the correlation of that trend with the trend in Arctic tropospheric temperature.  Once the cycle is removed via a 12-month moving average, the trend in sea ice extent and its negative correlation with temperature are clearly revealed.


The direct comparison shows that the decline in Arctic sea ice extent has accelerated as Arctic tropospheric temperature increased.
Extent = 11.6846713 − 0.9177708x − 0.2355697x², where x = temperature

The fit for that correlation is quite strong: R² = 0.7865, p < 2.2 × 10⁻¹⁶.  Now, does that correlation prove that higher Arctic tropospheric temperatures caused the sea ice decline?  No.  Correlation by itself does not imply causation.  It could be warmer ocean currents and/or changes in wind patterns.  However, that argument doesn't address what caused the ocean currents to warm and the wind patterns to change—and the answer is that increasing air temperatures warm the oceans and alter wind patterns.  No matter what, the physical cause of the ice melt ultimately traces back to warmer air temperatures.  When you have a correlation backed by a physical mechanism, that DOES imply causation.
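For reference, the quadratic fit above can be reproduced with a single lm() call.  This assumes a data frame (here called arctic, a hypothetical name) holding the 12-month-smoothed extent and the UAH Arctic temperature anomalies plotted above:

fit <- lm(Extent ~ Temp + I(Temp^2), data = arctic)  # quadratic in temperature
summary(fit)  # coefficients as in the equation above; R-squared ~0.79 for these data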

Tuesday, September 10, 2013

Is it a recovery or not?

One of the current rumors circulating in climate change denier circles is that Arctic sea ice is recovering, with a record ice gain, and that Arctic ice in August 2013 was 60% higher than in August 2012 and the highest in "years."  Let's examine those claims.

First, here's a graph showing the 12-month moving average of Arctic sea ice from January 1979 to August 2013:


Not much to say there.  So far, the 12-month moving average shows no sign of any recovery.  Arctic sea ice extent remains far below the 1979 start point or even where it was before 2005.  However, the claim is that the ice gain since September 2012 set a record.  Normally, Arctic ice extent reaches the yearly minimum in September at the end of summer, with a maximum the following March at the end of winter.  The ice gain is the difference between those months.


March extent has been declining linearly, by an average of 36,581 km² per year, whereas September extent has been declining by an average of 87,112 km² per year and accelerating by a further 7,786 km² per year each year.  As a result of the accelerating losses in September, the yearly difference between March and September extents has been growing at an accelerating rate.

Why was the "gain" between September 2012 and March 2013 a record?  Simple.  September 2012 was the lowest Arctic sea ice extent on record.  That left plenty of open water to refreeze during the winter, leading to a record "gain" by March 2013.  That denier claim that the Arctic has experienced a record "gain" of ice is just a cynical attempt to distract from the real story of continued decline in Arctic sea ice.  In reality, both March and September sea ice extents continue to decline—and the "record gain" is actually a symptom of that decline, not a sign of recovery.

The second claim, that Arctic sea ice was 60% higher in August 2013 than in August 2012, is partially true—the ice was higher in August 2013.  But not by 60%.  Average ice extent in August 2013 was 6.05 million km².  Ice extent in August 2012 was 4.72 million km².  That is a difference of 1.33 million km², which means that the ice in August 2013 was 28.2% higher than in August 2012 (1.33/4.72 × 100).  As for the claim that August extent is the highest in years, here's a graph of August sea ice extent since 1979.

The last time August sea ice extent was higher than August 2013's extent of 6.05 million square kilometers?  August 2009, with 6.13 million square kilometers.  And even though August 2013 is supposedly the "highest in years", it's the 6th lowest August ice extent since 1979.  Not much to go on if you're trying to claim that Arctic sea ice is recovering.

Finally, sea ice extent is not the full story in the Arctic.  A larger but less noticeable change is that the sea ice has become much thinner.


The ice has lost nearly 12,000 km³ of volume since 1979.  Even if ice extent does grow, all it means is that the surface is covered with thinner, less stable ice than it used to be.  And thinner ice is more susceptible to melting when conditions are right.

Lastly, there's a little concept in statistics called "regression toward the mean."  In simple terms, data tend to cluster around the average, so extreme outliers tend to be followed by data points closer to the average.  How is that relevant to Arctic sea ice extent?  September 2012 was quite a bit below the trend line.  It's an outlier.  Therefore, we'd expect the monthly average for September 2013 to be closer to the trend line.  Extrapolating from the trend line gives an expected September 2013 extent of 4.15 million km².  That represents a 16% increase over the September 2012 average (3.58 million km²) even if the overall trend remains exactly the same.  Figuring in the 95% confidence interval for the trend (4.66 million km²) shows that the average September 2013 sea ice could show an increase of 30% even if the trend remains exactly the same.  And this doesn't even consider the standard deviation of the data around the trend (± 1.32 million km²), which means that September 2013 ice extent could be as much as 53% higher (up to 5.47 million km²) than in 2012, even if the current trend, standard error, and standard deviation are unchanged.
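That expectation can be computed directly from the trend line.  A minimal sketch, assuming a data frame (here called sept, a hypothetical name) of yearly mean September extents in million km²:

trend <- lm(Extent ~ Year, data = sept)
predict(trend, newdata = data.frame(Year = 2013), interval = "confidence")
# 'fit' is the expected 2013 extent (~4.15 above); 'upr' is the 95% bound (~4.66)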



The main message from the Arctic is simple: The ice continues to melt, regardless of what tabloid articles and deniers say.

Monday, September 9, 2013

Solar influence on climate change

The degree to which the sun impacts climate change is hotly debated, mostly in climate change denier circles, with claims that the current warming is due to the sun.  That claim, however, ignores the actual science.  There have been multiple research studies published since 1998 that show that the sun has very little to do with the current global warming episode.  Solanki et al. (2004), Usoskin et al. (2005), and Scafetta and West (2006) used reconstructions of solar activity to show that the sun contributed little to warming since the 1970s.  Lockwood and Fröhlich (2007) showed that trends in solar output and activity since 1988 are opposite what would be required for the sun to cause global warming.  Meehl et al. (2004), Ammann et al. (2007), and Huber and Knutti (2011) used climate models to show that solar output and other natural climate forces could not replicate the observed temperature trend without adding anthropogenic carbon dioxide.  Lean and Rind (2008) and Foster and Rahmstorf (2011) used regression to show that the sun has been a cooling influence since 1979.

Most of the above studies were done using total solar irradiance (commonly denoted as "S").  However, as the Stefan-Boltzmann law shows, it's the amount of solar energy actually absorbed by the Earth that counts.  Approximately 30% of incoming solar radiation is reflected directly back into space by clouds, aerosols, snow and ice, and other components of Earth's albedo.  That amount fluctuates due to unpredictable events like volcanic eruptions and increases in the amount of coal burned around the planet.  The remaining ~70% makes it to the ground, where it is absorbed and re-emitted as infrared radiation.  Now a new study examines how the amount of solar energy reaching the ground has changed since AD 1900.  Wang and Dickinson (2013) used daily temperature range (DTR) as a measure of sunlight reaching the ground.  They found good correlations between DTR and direct solar energy measurements, with correlation coefficients above 0.7 in many areas.
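
To put rough numbers on that 30/70 split, here's a back-of-the-envelope calculation in R.  The irradiance and albedo values are commonly cited figures, not numbers from the paper, so treat this as an illustration only:

S      <- 1361          # total solar irradiance at the top of the atmosphere, W/m2
albedo <- 0.30          # fraction of incoming sunlight reflected back to space
sigma  <- 5.670374e-8   # Stefan-Boltzmann constant, W m-2 K-4

absorbed <- S * (1 - albedo) / 4     # absorbed energy averaged over the sphere: ~238 W/m2
T_eff    <- (absorbed / sigma)^0.25  # effective radiating temperature: ~255 K
c(absorbed = absorbed, T_eff = T_eff)

Plugging the absorbed energy into the Stefan-Boltzmann law gives an effective radiating temperature of about 255 K, which is why changes in absorbed energy, rather than top-of-atmosphere irradiance, are what matter for surface temperature.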

Figure 1 from Wang and Dickinson (2013).

They used DTR to calculate changes in the amount of sunlight reaching the ground for each station since AD 1900, then computed regional and global averages.  Their Figure 6A shows the impact changes in sunlight have had on global temperatures.

Figure 6A from Wang and Dickinson (2013)

Of particular note is that solar energy reaching the ground peaked between 1930 and 1960, declined until the early 1980s, and has recovered slightly since.  That pattern corresponds with estimates of sulfur aerosol emissions from coal-burning power plants (Smith et al. 2011), which peaked in the early 1980s and have declined since due to various Clean Air Acts around the world.

Figure 2 from Smith et al. (2011)

However, the pattern in sunlight reaching the ground does not correspond with global temperature, especially after 1970.  If global warming were due to the sun, we'd expect a close correlation between solar energy at ground level and global average temperature.  Instead, while ground-level solar energy has remained low since 1980, with little or no increase, global temperatures have risen quite noticeably.  This mismatch shows once again that changes in the sun have very little to do with the global warming trend we've experienced since the 1970s, corroborating all the other research that finds little to no solar influence on the current warming.


Friday, August 30, 2013

Revisiting the question of "Has global warming stopped since 1998?"—again.

Let me be blunt: There is little evidence that global warming stopped in 1998 or any year thereafter.  Most of the evidence we have, from the energy imbalance to total heat content to ocean heat content, shows that global warming continues, as I previously explained here, here, and here.  The only piece of evidence that appears to show that global warming has stopped is that the trend in surface temperature data is not statistically significant in recent years.  That evidence, however, is ambiguous at best.  No significant trend could mean that warming continues but short-term variation in the data masks the trend, that there's no warming at all, or that there's a cooling trend but not enough data for it to be significant.  There's no real way to tell unless you either a) add enough data for short-term variation to cancel out or b) use statistical techniques to factor out the known natural variation.

In this article, I expand on my previous analyses of surface temperature, this time using GISS, NCDC, HadCRUT4, and UAH.  Previously, I had focused on only GISS or UAH.  I first adjusted all temperature anomalies to the 1981-2010 baseline and used the technique of Foster and Rahmstorf (2011) to factor out short-term variation due to ENSO, solar cycles, and aerosols between January 1979 and March 2013.  For each start year between 1990 and 2000, I estimated the amount of autocorrelation in the temperature data before and after adjusting for short-term variation using auto.arima in the forecast package for R, then estimated the autocorrelation-corrected trend using generalized least squares from the nlme package.  I plotted the trends ± 95% confidence intervals for raw and adjusted data to compare how the trends change depending on start year.
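
For those who want to try something similar, here's a minimal sketch of the procedure in R.  The data frame and column names (temps, anom, time, mei, aod, tsi) are hypothetical stand-ins for the actual temperature, ENSO, aerosol, and solar series, and the lagged predictors used by Foster and Rahmstorf (2011) are omitted for brevity:

library(forecast)  # auto.arima()
library(nlme)      # gls() and corARMA()

# Step 1: factor out short-term variation, Foster-and-Rahmstorf style
fit_fr <- lm(anom ~ time + mei + aod + tsi, data = temps)
temps$adj <- coef(fit_fr)["(Intercept)"] + coef(fit_fr)["time"] * temps$time +
             residuals(fit_fr)  # keep the trend and residuals, drop the ENSO/aerosol/solar terms

# Step 2: estimate the autocorrelation structure, then the corrected trend for one start year
dat <- subset(temps, time >= 1998)
ord <- arimaorder(auto.arima(dat$adj, d = 0, seasonal = FALSE))
p <- unname(ord["p"]); q <- unname(ord["q"])
fit <- if (p + q > 0) {
  gls(adj ~ time, data = dat, correlation = corARMA(p = p, q = q))
} else {
  gls(adj ~ time, data = dat)  # no ARMA terms needed
}
summary(fit)$tTable["time", ]  # trend in ºC per year with an autocorrelation-corrected standard error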

First, here's the raw data since 1990:

Global temperature data before adjustments for ENSO, aerosols, and the solar cycle


Not much to say that hasn't already been written.  All the data show an increase since the 1990s, with an apparent flattening of the trend since 1998 and, in the three surface datasets, a slight cooling since 2005.  After factoring out ENSO, the solar cycle, and changes in aerosols, the four datasets look like this:

Global temperature data after adjusting for ENSO, aerosols, and the solar cycle.
While variation remains, the overall trend in each data set is much easier to see after short-term variation is factored out.  Now the apparent flattening doesn't begin until after 2010.  Of course, saying that temperatures have been flat for the past 3 years doesn't have quite the power of saying that they've been flat since 1998.  Plotting regression trends with start years between 1990 and 2000 reveals that the trend in the adjusted data is both higher and more stable than the trend in the unadjusted data.


The trend in adjusted GISS data decreases only slightly as the start year advances, and the change is not statistically significant, as shown by the overlap between the 95% confidence intervals.  The decline in the trend in unadjusted data is much larger, although that change is also not statistically significant, as the 95% confidence intervals again overlap.  While the trend since 1997 is not significantly different from zero in the unadjusted data, the trend in the adjusted data remains statistically significant throughout.  The same pattern appears in the other three data sets.

NCDC:


HadCRUT4:



UAH:


While the unadjusted data generally show a decline in the rate of warming depending on the start point, the decline is not statistically significant due to large standard errors and confidence intervals.  The trend in the adjusted data generally declines as well in the surface data, but interestingly not in the UAH satellite data.  With less variation in the adjusted data, the standard errors of the adjusted trends are far smaller and the trends are significant regardless of start point.  The differences between the trends in adjusted versus unadjusted data clearly show the influence of short-term variation, as represented by ENSO, aerosols, and the solar cycle, on the trend in unadjusted temperature data, especially as the time period gets shorter.
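
Extending the earlier sketch across start years reproduces the kind of trend-versus-start-year comparison plotted above.  Again, the data frame and column names are hypothetical, and corAR1() is a simplified stand-in for the full ARMA error structure:

library(nlme)

start_years <- 1990:2000
trend_tab <- t(sapply(start_years, function(y) {
  dat <- subset(temps, time >= y)
  fit <- gls(adj ~ time, data = dat, correlation = corAR1())
  est <- summary(fit)$tTable["time", c("Value", "Std.Error")]
  c(start = y,
    trend = unname(est["Value"]),
    lower = unname(est["Value"] - 1.96 * est["Std.Error"]),
    upper = unname(est["Value"] + 1.96 * est["Std.Error"]))
}))
trend_tab  # trend in ºC per year with approximate 95% confidence limits for each start year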

There have been numerous claims that IPCC climate models are wrong because they don't match surface temperature data since 1998 (e.g. Fyfe et al. 2013).  While true, the mismatch is not because CO2 doesn't affect global temperature or because anthropogenic climate change theory is wrong, as many deniers claim.  For either of those to be wrong, much of our understanding of atmospheric physics would have to be wrong as well, to say nothing of 152 years' worth of experimental results.  The reason most climate models don't match is that they generally account for the long-term influence of CO2 but do not accurately predict short-term variability (e.g. Rahmstorf et al. 2012).  Few climate models can accurately reproduce the timing of past ENSO events, much less predict the timing of future events; none predict changes in aerosols or solar output.  Short-term variation also cancels out in longer model runs or when averaging multiple models together.  In addition, the models overestimate the amount of change in greenhouse gas emissions and the resultant change in climate forcing (see Spencer Weart's interview with James Hansen in 2000).  It's therefore not surprising that the IPCC models do not match surface temperatures since 1998.  Once climate models are corrected for actual short-term variability and climate forcings, they replicate the slow-down in warming since 1998 quite well and show that ENSO is responsible for most of the slow-down (e.g. Kosaka and Xie 2013).

Two comments on Fyfe et al. (2013).  First, starting a temperature trend at a known outlier is bogus, as it artificially lowers the trend.  You can clearly see in the trend plots above that trends starting at 1998 are far lower than trends starting at any other year.  Starting at 1998 guarantees that the difference between observed and predicted trends will appear larger than it would starting at any other year.  Second, the data set they used, HadCRUT4, is known to show less warming than other data sets since AD 2000 because HadCRUT4 has poor geographic coverage of the polar regions, which happen to be the fastest-warming regions of the planet.  In short, Fyfe et al.'s choice of temperature data set and start point effectively biased their analysis toward finding that model trends were higher than observed trends, whether or not that is actually the case.  For that reason, I'm surprised that Fyfe et al. (2013) got published.  Their analysis would have been much stronger had they included other data sets (e.g. UAH, GISS, NCDC) and other start points.

What can be concluded about the slow-down in surface warming since 1998?  First, much of the slow-down is due to short-term variation in the climate system.  A predominantly cooling phase of ENSO, an increase in aerosols, and a deep solar minimum have combined to mask the long-term warming trend.  Second, the warming trend continues unabated, as shown once ENSO and other known variation are factored out.  This means that once the short-term variation swings back toward warming phases, we'll see faster warming than we otherwise would expect.  This isn't surprising, as physical data such as the total heat content of the Earth continue to rise and the energy imbalance continues, as I've written before.  Third, just because the IPCC models are wrong does not mean the entire science is wrong.  The models don't incorporate all that we know about climate and are therefore imperfect reflections of the science.  Lastly, beware those who proclaim that global warming has stopped since 1998 based on the linear trend.  If those people were telling the full story, they'd have to admit that the rate of warming since 1998 is statistically indistinguishable from the rates of earlier years, as shown by the overlapping 95% confidence intervals.