December 2013 was the 346th consecutive month in which the global land and ocean average
surface temperature exceeded the 20th century monthly average; February 1985 was the
last time the mean temperature fell below this value. Even given these and other extraordinary
statistics, public acceptance of human-induced climate change, and confidence in the
supporting science, has declined since 2007. Uncertainty as to whether
observed climate changes are due to human activity or are part of fluctuations in natural systems
remains a major stumbling block to effective adaptation action and risk management.
Previous approaches to attributing change include qualitative expert assessment, as
used in IPCC reports, and ‘fingerprinting’ methods based on global
climate models. Here we develop an alternative approach that provides a rigorous
probabilistic statistical assessment of the link between observed climate changes and
human activities in a way that can inform formal climate risk assessment. We construct
and validate a time series model of anomalous global temperatures to June 2010, using
rates of greenhouse gas (GHG) emissions, as well as other causal factors including solar
radiation, volcanic forcing and the El Niño Southern Oscillation. When the effect of GHGs
is removed, bootstrap simulation of the model reveals that there is less than a one in
one hundred thousand chance of observing an unbroken sequence of 304 months (our
analysis extends to June 2010) with mean surface temperature exceeding the 20th century
average. We also show that one would expect a far greater number of short periods of
falling global temperatures (as observed since 1998) if climate change were not occurring.
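The run-length test described above can be sketched in a few lines: fit a regression of temperature anomalies on a GHG covariate, resample the residuals (a residual bootstrap), rebuild the series with the GHG term removed, and count how often an unbroken warm run of the observed length appears by chance. This is a minimal illustration on synthetic data only; the covariates, model form, and parameter values below are hypothetical and stand in for the paper's fuller causal model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic monthly anomaly series (NOT the paper's data):
# anomalies driven by a hypothetical GHG trend plus noise.
n_months = 304
ghg = np.linspace(0.0, 1.0, n_months)       # stand-in GHG forcing index
anomaly = 0.6 * ghg + rng.normal(0.0, 0.15, n_months)

# Fit a simple linear model anomaly ~ ghg (a stand-in for the full
# model, which also includes solar, volcanic and ENSO terms).
X = np.column_stack([np.ones(n_months), ghg])
beta, *_ = np.linalg.lstsq(X, anomaly, rcond=None)
residuals = anomaly - X @ beta

def longest_warm_run(series, threshold=0.0):
    """Length of the longest unbroken run of values above `threshold`."""
    best = run = 0
    for v in series:
        run = run + 1 if v > threshold else 0
        best = max(best, run)
    return best

# Residual bootstrap with the GHG term removed: how often does a
# 304-month warm run occur from internal variability alone?
n_boot = 2000
hits = 0
for _ in range(n_boot):
    sim = beta[0] + rng.choice(residuals, n_months, replace=True)
    if longest_warm_run(sim) >= n_months:
        hits += 1

print(f"bootstrap frequency of a {n_months}-month warm run: {hits / n_boot}")
```

With the GHG term removed, the simulated series fluctuates around its intercept, so an unbroken 304-month exceedance is vanishingly rare; in practice one reports an upper bound on the probability when no bootstrap replicate produces such a run.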
This approach to assessing probabilities of human influence on global temperature could
be transferred to other climate variables and extremes, allowing enhanced formal risk
assessment of climate change.