Modelling Current Temperature Trends

Current trends in Northern Hemisphere and Central England temperatures are estimated using a variety of statistical signal extraction and filtering techniques, and their extrapolations are compared with the predictions from coupled atmosphere-ocean general circulation models. Earlier warming epochs are also analysed and compared with the current warming trend, suggesting that long-run patterns of temperature trends should be considered alongside the current emphasis on global warming.


Introduction
Global warming and climate change are currently topics of major importance, particularly with the publication of the Stern Review on the Economics of Climate Change (Stern, 2007) and the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (IPCC, 2007). Although climate modelling and long-run temperature prediction are typically carried out using large-scale computer simulations (see, for example, Cubasch and Meehl, 2001, and Covey et al., 2003), an interesting statistical problem is to estimate recent and current trend levels of temperature using statistical signal extraction and filtering techniques. We emphasise that this approach is purely statistical and avoids the need to identify potential covariates; as such, it has the benefit of providing independent results that may be useful in informing quantitative debate on global warming. Section 2 thus sets out the trend estimation techniques employed in the paper, while Section 3 presents the results obtained by applying them to the available instrumental records of Northern Hemisphere and Central England temperatures. Section 4 offers some comments and discussion.

Trend Estimation Techniques
Although there are many ways of estimating trends, we focus here on the 'classic' signal extraction and filtering approach of estimating the unobserved trend component of a temperature time series using just the temperature record itself. Even within this framework, a number of alternative techniques are available, and we concentrate here on just three: a parametric stochastic trend model, a nonparametric local trend fit, and a low-pass filter. Such techniques are discussed in, for example, Mills (2003) and Pollock (2007), where the links between them are established. Specific applications to temperature trends may be found in Harvey and Mills (2003) and Mills (2006, 2007a, 2007b).
The general framework is to decompose the series of T observed temperature values, say Z_t, into unobserved trend and noise components, S_t and N_t respectively, such that Z_t = S_t + N_t. The stochastic trend model used here is the 'smooth trend', in which the trend component evolves as a random walk with the 'slope', β_t, itself following a random walk:

S_t = S_{t-1} + β_{t-1}
β_t = β_{t-1} + ν_t

The slope innovation ν_t and the noise component N_t are assumed to be independent zero-mean white noises, i.e., temporally independent sequences with constant variances. (The conventional stochastic trend model also has an innovation in the equation for S_t but, on estimation, its variance was found to be close to zero and so it was omitted. Its exclusion produces a very smooth evolution of the trend, hence the nomenclature 'smooth trend'.) The unknown variance of ν_t and the components S_t, β_t and N_t may be estimated by casting the model into state space form and employing the Kalman filter. An estimate of the variance of ν_t may then be obtained using the EM algorithm, an iterative method of obtaining maximum likelihood estimates of the unknown elements of the state and error system matrices of the state space form. Estimates of the components are then calculated using a smoothing procedure, which first estimates the components recursively for t = 1, ..., T and then runs a second 'backward' recursion over t = T, ..., 1 to obtain the smoothed estimates. Full details of the estimation procedure may be found in Koopman et al (2006).
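The forward and backward recursions just described can be sketched in a few lines of code. The following is a minimal illustration, not the authors' implementation: the two innovation variances are taken as given rather than estimated by the EM algorithm, and the (diffuse-style) initialisation is our own choice.

```python
import numpy as np

def smooth_trend(z, var_noise, var_slope):
    """Kalman filter and fixed-interval (RTS) smoother for the smooth trend
    model: S_t = S_{t-1} + beta_{t-1}, beta_t = beta_{t-1} + nu_t,
    Z_t = S_t + N_t. Returns smoothed trend S_t and slope beta_t."""
    T = len(z)
    F = np.array([[1.0, 1.0], [0.0, 1.0]])  # state transition for (S, beta)
    Q = np.diag([0.0, var_slope])           # only the slope is stochastic
    H = np.array([1.0, 0.0])                # Z_t observes S_t plus noise
    x, P = np.zeros(2), np.eye(2) * 1e7     # vague ('diffuse') initial state
    xf = np.zeros((T, 2)); Pf = np.zeros((T, 2, 2))
    xp = np.zeros((T, 2)); Pp = np.zeros((T, 2, 2))
    for t in range(T):                      # forward recursion, t = 1,...,T
        x = F @ x; P = F @ P @ F.T + Q      # one-step prediction
        xp[t], Pp[t] = x, P
        v = z[t] - H @ x                    # innovation
        s = H @ P @ H + var_noise           # innovation variance
        k = P @ H / s                       # Kalman gain
        x = x + k * v
        P = P - np.outer(k, H @ P)
        xf[t], Pf[t] = x, P
    xs = xf.copy()                          # backward recursion, t = T,...,1
    for t in range(T - 2, -1, -1):
        J = Pf[t] @ F.T @ np.linalg.inv(Pp[t + 1])
        xs[t] = xf[t] + J @ (xs[t + 1] - xp[t + 1])
    return xs[:, 0], xs[:, 1]
```

Applied to a series with an essentially linear trend, the smoothed slope settles near the true rate of increase; in practice the variances would be supplied by the EM step described above.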
The nonparametric trend is a local cubic polynomial fitted using a Gaussian kernel. At each point in time τ the parameters of the cubic polynomial are chosen to solve the weighted least squares problem of minimising

Σ_{t=1}^{T} [Z_t − a_0 − a_1(t − τ) − a_2(t − τ)^2 − a_3(t − τ)^3]^2 K((t − τ)/b)

and the kernel function is the Gaussian

K(u) = (2π)^{−1/2} exp(−u^2/2)

The trend estimate Ŝ_τ(b) is then the fitted intercept â_0 at each τ. Here b > 0 is a smoothing parameter known as the bandwidth: the larger b is, the greater the degree of averaging and the smoother the estimated trend function will be. Thus with the Gaussian kernel, b plays the role of a 'standard deviation'. Although a variety of kernels are available, only small differences tend to be found using alternatives: the Gaussian seems to perform as well as any other and is easily interpretable.
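The local fit can be illustrated directly (a sketch under our own naming conventions, not the paper's code): at each τ a cubic in (t − τ) is fitted by weighted least squares with Gaussian weights, and the trend estimate is the fitted intercept.

```python
import numpy as np

def local_cubic(z, b):
    """Local cubic polynomial trend with a Gaussian kernel of bandwidth b.
    At each tau, fit Z_t ~ a0 + a1*u + a2*u^2 + a3*u^3 with u = t - tau by
    weighted least squares; the trend estimate at tau is the intercept a0."""
    T = len(z)
    t = np.arange(T, dtype=float)
    trend = np.empty(T)
    for tau in range(T):
        u = t - tau
        w = np.exp(-0.5 * (u / b) ** 2)        # Gaussian kernel weights
        X = np.vander(u, 4, increasing=True)   # columns [1, u, u^2, u^3]
        sw = np.sqrt(w)
        coef, *_ = np.linalg.lstsq(X * sw[:, None], z * sw, rcond=None)
        trend[tau] = coef[0]                   # fitted value at t = tau
    return trend
```

Because the same weighted fit is performed at every τ, including the end-points, no separate boundary treatment is needed, although the kernel weights are necessarily truncated there.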
A cubic polynomial is chosen because odd orders of the local polynomial lead to smaller asymptotic biases in Ŝ_τ(b), caused by having to truncate the kernel weights at the start and end of the sample, than even orders. Choosing a cubic polynomial was found to produce the best trade-off between complexity and smoothness: the case of a local linear trend has been considered in Mills (2007a). The bandwidth is chosen through cross-validation, which uses the idea of 'leave-one-out' prediction by minimising the criterion

CV(b) = Σ_{τ=1}^{T} [Z_τ − Ŝ_τ^{(τ)}(b)]^2

where Ŝ_τ^{(τ)}(b) is the estimate of the trend at τ based on the truncated sample Z_1, ..., Z_{τ−1}, Z_{τ+1}, ..., Z_T, i.e., with observation τ removed. Pollock (2007) shows that the local polynomial approach can be expressed as a set of global moving average coefficients applied to the complete record of Z_t and so can be regarded as a form of filtering technique.
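The leave-one-out criterion can be evaluated directly, if inefficiently, by refitting the local cubic with each observation's kernel weight set to zero in turn. A sketch (our own construction, with the bandwidth then chosen as the grid minimiser):

```python
import numpy as np

def cv_score(z, b):
    """Leave-one-out cross-validation criterion CV(b) for the local cubic
    Gaussian-kernel trend: sum of squared prediction errors when each
    observation tau is dropped from its own fit."""
    T = len(z)
    t = np.arange(T, dtype=float)
    sse = 0.0
    for tau in range(T):
        u = t - tau
        w = np.exp(-0.5 * (u / b) ** 2)
        w[tau] = 0.0                       # remove observation tau
        X = np.vander(u, 4, increasing=True)
        sw = np.sqrt(w)
        coef, *_ = np.linalg.lstsq(X * sw[:, None], z * sw, rcond=None)
        sse += (z[tau] - coef[0]) ** 2     # out-of-sample error at tau
    return sse

def choose_bandwidth(z, candidates):
    """Bandwidth minimising CV(b) over a grid of candidate values."""
    return min(candidates, key=lambda b: cv_score(z, b))
```

Each evaluation of CV(b) requires T weighted regressions, so in practice the criterion is minimised over a modest grid of candidate bandwidths.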
The low-pass filter is set up to pass only those frequencies that can be considered part of a long-run, slowly varying trend, and these can be identified by analysing the spectral density of the series. Although various low-pass trend filters are available (for example, the Butterworth filters discussed by Pollock, 2000), we use here the Hodrick and Prescott (1997) trend filter, which is obtained as the solution to the problem of minimising

Σ_{t=1}^{T} (Z_t − S_t)^2 + ξ Σ_{t=2}^{T−1} [(S_{t+1} − S_t) − (S_t − S_{t−1})]^2

where ξ is a Lagrangean multiplier that has the interpretation of a smoothness parameter, so that the higher the value of ξ, the smoother the trend. The problem is thus one of minimising the variation in the noise component subject to a smoothness condition that penalises acceleration in the trend, and it has a long history of use in actuarial science and, more recently, in macroeconomics. The first order conditions may be written as

Z_t = S_t + ξ(S_{t−2} − 4S_{t−1} + 6S_t − 4S_{t+1} + S_{t+2})

and these can be solved, with suitable modifications for the end-points of the sample, to obtain the trend sequence S_t (for details, see Mills, 2003). This filter is related to the smooth trend model: the implied filter is identical to that of the smooth trend model if ξ is set to the 'noise-to-signal' variance ratio, i.e., the ratio of the variance of N_t to the variance of ν_t. More generally, ξ can be set by choosing the frequency at which the gain of the implied frequency response function equals 0.5. This yields ξ = 1/(4(1 − cos λ)^2), where λ is the frequency for which the gain is 0.5 (see Mills, 2006).
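A minimal sketch of the filter (our own construction, not the paper's code): the first order conditions stack into the linear system (I + ξD'D)S = Z, where D is the (T−2)×T second-difference matrix, and the end-point modifications arise automatically from the shorter first and last rows of D'D.

```python
import numpy as np

def hp_trend(z, xi):
    """Hodrick-Prescott trend: minimise sum (Z_t - S_t)^2 plus xi times the
    sum of squared second differences of S_t. Solves (I + xi*D'D) S = Z."""
    T = len(z)
    D = np.zeros((T - 2, T))
    for i in range(T - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]   # second-difference operator
    return np.linalg.solve(np.eye(T) + xi * D.T @ D, z)

def xi_from_lambda(lam):
    """Smoothing parameter for which the trend filter's gain is 0.5 at
    frequency lam, assuming the gain convention 1/(1 + 4*xi*(1-cos lam)^2)."""
    return 1.0 / (4.0 * (1.0 - np.cos(lam)) ** 2)
```

A useful check on the construction is that a purely linear series has zero second differences, so the filter returns it unchanged for any ξ.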

Estimating the Trend Component of Northern Hemisphere and Central England Temperatures
We focus on two well-known temperature records. The first is the Northern Hemisphere instrumental temperature record, which is available in annual form from 1856 to 2005. This series is shown in Figure 1, with the three trend fits superimposed. The free parameters in each of the fits are as follows. In the stochastic smooth trend model, the estimated variances of N_t and ν_t are 0.1064² and 0.00361², implying a noise-to-signal variance ratio of 869. For the low-pass filter, examination of the spectral density of the series suggests setting λ = π/16 ≈ 0.2, so that ξ = 3020, somewhat higher than the value implied by the smooth stochastic trend model. The estimated low-pass trend will thus be a little smoother, and this is indeed seen in Figure 1. For the local cubic trend, cross-validation produced a bandwidth estimate of b = 12.
All three trends show very similar evolutions: roughly constant until around 1910, then a trend increase up to 1948 followed by a small decline until the late 1960s, whereupon there has been a pronounced warming trend, with trend temperatures at the end of the period being almost one degree warmer than 150 years ago. As can be seen from Figure 2, the trend slopes are all approximately constant, at around 0.03 °C per annum. This implies that, at this current rate of trend increase, Northern Hemisphere temperatures will be some 3 °C higher by this time next century. Being parametric, the stochastic trend model provides a standard error for both the current trend level and slope: these are 0.05 and 0.01 respectively. Forecasted trends will also have standard errors. Although the forecasted trend in 2105 will be around 3.6 °C (above the 1961-1990 mean), it will have a standard error of 2.3 °C attached to it, indicating the imprecision that necessarily accompanies such long-run forecasts: a 70% prediction interval runs approximately from 1.3 to 5.9 °C. This long-range trend forecast is very much in line with the projections made by the Met Office Hadley Centre coupled atmosphere-ocean general circulation models, HadCM2 and HadCM3, under a 'business as usual' scenario that assumes mid-range economic growth but no measures to reduce greenhouse-gas emissions.¹

For the U.K., a longer instrumental temperature record is available, as the Central England temperature (CET) series begins in 1659. This is plotted in Figure 3 along with the three estimated trends. Updating Harvey and Mills (2003), the estimated variances of the stochastic smooth trend model imply a noise-to-signal ratio of 1043, while the low-pass filter sets λ = 0.023 and hence ξ = 10,000. Both filters are thus extremely smooth. For the local cubic polynomial, b is set by cross-validation at 35. Although the trend pattern over the last 150 years is very similar to that for the NH
series, this longer series reveals other interesting features, notably a pronounced cooling pattern in the latter half of the 17th century, followed by a recovery until 1750 and then a period of stable trend temperatures before the onset of the 20th-century warming trend (the cool period before the current warming trend is often referred to as the 'Little Ice Age'). Here the trends differ rather more from each other, although the stochastic trend and low-pass filter are very close at the end of the record. The current stochastic trend level is 10.38 °C, with a current slope of 0.04 °C per annum.

¹ See http://www.metoffice.gov.uk/research/hadleycentre/models/modeldata.html.
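The long-range forecast standard errors quoted above can be reproduced approximately from the smooth trend model itself: extrapolating gives Ŝ_{T+h} = Ŝ_T + h·β̂_T, with forecast error variance accumulating from the level and slope uncertainty and from the future slope innovations. A back-of-envelope check for the NH forecast (our own decomposition, which ignores the covariance between the smoothed level and slope estimates):

```python
import math

h = 100                            # forecast horizon: 2005 -> 2105
level, slope = 0.64, 0.03          # 2005 NH trend level (anomaly) and slope
se_level, se_slope = 0.05, 0.01    # their standard errors, as quoted above
var_nu = 0.00361 ** 2              # slope innovation variance

# S_{T+h} = S_T + h*beta_T + sum_{m=1}^{h-1} (h-m)*nu_{T+m}, so the future
# innovations contribute var_nu * sum of squared weights to the variance
forecast = level + h * slope
innovation_var = var_nu * sum(m ** 2 for m in range(1, h))
se_forecast = math.sqrt(se_level ** 2 + (h * se_slope) ** 2 + innovation_var)
# forecast = 3.64 and se_forecast ≈ 2.3, matching the values in the text
```

Taking roughly one standard error either side of the point forecast reproduces the quoted interval of approximately 1.3 to 5.9 °C.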

Discussion
An obvious response to such calculations is to invoke the 'perils of extrapolation' argument and to ask the question: 'have there been past warming patterns similar to that being observed currently that have not led to a continuing upward trend in temperatures?' The NH temperature record does appear to be currently undergoing an unprecedented warming trend, so that there are no earlier periods of similarity with which to compare our current trend estimates. The CET series, however, is almost 200 years longer, and examining Figure 3 reveals that there are (at least) two earlier periods that display behaviour similar to the current warming trend. Figure 4 therefore compares the low-pass trend fitted to the complete record with similar trends fitted to the records ending in 1736 and 1834 respectively (the other trend fits are very similar). Prior to 1736, trend temperatures increased by 1.5 °C in the 46 years since 1690, while the 24 years from 1810 saw trend temperatures increase by 0.75 °C. By comparison, the current warming period has seen trend temperatures increase by almost 1 °C in forty years. Both the earlier trends have 'current' slopes, estimated to be 0.047 and 0.045 °C per annum respectively, in excess of the 2005 slope of 0.040 °C, and both periods contain temperature extremes that are comparable to those reached in the last decade. As can be seen from Figure 4, both trends quickly reversed themselves after these two dates, which were, of course, before serious industrialisation had occurred.
Thus the recent warming trend in the CET series is by no means unprecedented. While we are not suggesting that the current warming trend will necessarily be quickly reversed, this statistical exercise reveals that examining temperature records over a longer time frame may offer a different perspective on global warming than that which is commonly expressed. Indeed, examining much longer records of temperature reconstructions from proxy data reveals a very different picture of climate change than does focusing on just the last 150 years or so of temperature observations, with several historical epochs experiencing temperatures at least as warm as those being encountered today: see, for example, Mills (2004, 2007b) for trend modelling of long-run temperature reconstructions. At the very least, proponents of continuing global warming and climate change would perhaps be wise not to make the recent warming trend in recorded temperatures a central plank in their argument.
Figure 2 zooms in on the last decade from 1995: the current (2005) trend temperature (as an anomaly from the 1961-1990 mean) is estimated to be 0.64 °C by the stochastic trend model, 0.65 °C by the low-pass filter, and 0.66 °C by the local cubic polynomial. These all stand at record highs, having increased by almost 0.3 °C over the last decade.

Figure 4: Annual Central England temperatures (CET), 1659-2005, in degrees C, with low-pass trends superimposed.

The current CET trend level and slope have standard errors of 0.20 and 0.02 respectively. The forecasted CET trend in 2105 is 14.08 °C (an increase of 3.7 °C) with a standard error of 3.43 °C, again in line with the trend forecast from the NH series and with the forecasts from the HadCM2 and HadCM3 general circulation models.