
The analytical results of Cochrane (1991) imply that tests for the distinction between deterministic and stochastic trends in the data can have arbitrarily low power. However, it is interesting to note that when the true data generation process contains a deterministic trend, the Durbin-Hausman tests can detect this much more easily than DF tests. This is evident from Monte Carlo studies and from empirical tests on data from the physical sciences that have a deterministic trend (Alexander and Rendell, 1995).

There is a large literature on empirical studies of the data generation processes that govern price (and log price) data from financial markets. The overwhelming conclusion from the research on whether non-stationary behaviour is governed by trend-stationary or difference-stationary processes is that financial market data have stochastic rather than deterministic trends. That is, the model (11.7) is more appropriate than (11.9) to apply to either prices or log prices of financial assets.


11.2 Univariate Time Series Models

This section describes the basic building blocks for modelling stationary univariate time series. In a univariate time series model the variable is explained only by its own lagged values, that is, with autoregressive (AR) terms as explanatory variables in its representation. If the process is stochastic and stationary, the errors can be linear combinations of white noise at different lags, so the moving average (MA) part of the model refers to the structure of the error term. General stationary time series have both autoregressive and moving average parts in their representation, but before discussing these autoregressive moving average (ARMA) models, more detailed descriptions of AR models and MA models are presented separately.

11.2.1 AR Models

Autoregressive models of time series are representations of a time series by functions of its own lags. For example, the autoregressive model of order 1, the AR(1) model, is

$y_t = c + \alpha y_{t-1} + \varepsilon_t$, (11.12)

where $\varepsilon_t \sim$ i.i.d.$(0, \sigma^2)$. The constant term models a trend in the series either upwards ($c > 0$) or downwards ($c < 0$). The lag coefficient $\alpha$ determines the stability of the process. If $|\alpha| > 1$ the time series will explode, that is, $y_t \to \pm\infty$ as $t \to \infty$. The special case $\alpha = 1$ gives the random walk model of §11.1.3, and it is only when $|\alpha| < 1$ that the process defined by (11.12) will be stationary. Using the lag operator (§11.1.1), an equivalent representation of (11.12) is

$(1 - \alpha L)y_t = c + \varepsilon_t$.
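For intuition, here is a minimal simulation sketch (not from the text; it assumes Python with NumPy and Gaussian errors, and the helper name `simulate_ar1` is illustrative) showing how the lag coefficient $\alpha$ in (11.12) governs the behaviour of the process:

```python
import numpy as np

def simulate_ar1(c, alpha, n, sigma=1.0, seed=0):
    """Simulate y_t = c + alpha * y_{t-1} + eps_t with eps_t ~ N(0, sigma^2)."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(0.0, sigma, n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = c + alpha * y[t - 1] + eps[t]
    return y

# |alpha| < 1: stationary; alpha = 1: random walk; |alpha| > 1: explosive
for alpha in (0.5, 1.0, 1.05):
    path = simulate_ar1(c=0.0, alpha=alpha, n=500)
    print(f"alpha = {alpha:4.2f}: final value = {path[-1]:14.2f}")
```

With $\alpha = 0.5$ the path keeps returning to its mean, with $\alpha = 1$ it wanders without an attractor, and with $\alpha = 1.05$ it explodes, exactly as the stability discussion above describes.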



Suppose $|\alpha| < 1$. Since

$(1 - \alpha L)^{-1} = 1 + \alpha L + \alpha^2 L^2 + \alpha^3 L^3 + \cdots,$

another form of the AR(1) model is

$y_t = (1 - \alpha L)^{-1}(c + \varepsilon_t) = (1 - \alpha L)^{-1}c + (1 - \alpha L)^{-1}\varepsilon_t$

or, since $(1 - \alpha L)^{-1}\varepsilon_t = \varepsilon_t + \alpha\varepsilon_{t-1} + \alpha^2\varepsilon_{t-2} + \alpha^3\varepsilon_{t-3} + \cdots$,

$y_t = c/(1 - \alpha) + \varepsilon_t + \alpha\varepsilon_{t-1} + \alpha^2\varepsilon_{t-2} + \alpha^3\varepsilon_{t-3} + \cdots.$ (11.13)

Taking expectations and variances of (11.13) gives

$E(y_t) = c/(1 - \alpha)$ and $V(y_t) = \sigma^2/(1 - \alpha^2)$. (11.14)

The autocovariances of the AR(1) model are independent of the constant term:6

$\operatorname{cov}(y_t, y_{t-s}) = \alpha^s\sigma^2/(1 - \alpha^2)$. (11.15)

To see this, use (11.13) to write

$\operatorname{cov}(y_t, y_{t-s}) = E[(\varepsilon_t + \alpha\varepsilon_{t-1} + \alpha^2\varepsilon_{t-2} + \alpha^3\varepsilon_{t-3} + \cdots)(\varepsilon_{t-s} + \alpha\varepsilon_{t-s-1} + \alpha^2\varepsilon_{t-s-2} + \alpha^3\varepsilon_{t-s-3} + \cdots)]$

and then use the fact that $E(\varepsilon_t\varepsilon_{t-s}) = 0$ unless $s = 0$ and $E(\varepsilon_t^2) = \sigma^2$: the surviving terms sum to $\alpha^s\sigma^2(1 + \alpha^2 + \alpha^4 + \cdots) = \alpha^s\sigma^2/(1 - \alpha^2)$.
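As a numerical check of (11.14) and (11.15), one can compare sample moments of a long simulated path with the theoretical values. The following sketch is my own construction (it assumes NumPy and $\alpha = 0.8$, $\sigma = 1$):

```python
import numpy as np

alpha, sigma, n = 0.8, 1.0, 200_000
rng = np.random.default_rng(42)
eps = rng.normal(0.0, sigma, n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = alpha * y[t - 1] + eps[t]   # c = 0; c does not affect the autocovariances

d = y - y.mean()
for s in range(4):
    sample = np.mean(d[s:] * d[:n - s])            # sample autocovariance at lag s
    theory = alpha**s * sigma**2 / (1 - alpha**2)  # equation (11.15); s = 0 gives (11.14)
    print(f"lag {s}: sample {sample:.3f}  theory {theory:.3f}")
```

The lag-0 line checks the variance in (11.14); the remaining lags check (11.15).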

Higher-order autoregressive models have more complex properties. For example, the AR(2) model,

$y_t = c + \alpha_1 y_{t-1} + \alpha_2 y_{t-2} + \varepsilon_t$, (11.16)

may be written using the lag operator as

$(1 - \alpha_1 L - \alpha_2 L^2)y_t = c + \varepsilon_t$.

This model is only stable if the characteristic polynomial of the AR(2) model,

$1 - \alpha_1 x - \alpha_2 x^2$,

has roots that lie outside the unit circle in the complex plane. This is the generalization of the stationarity condition $|\alpha| < 1$ for series represented by an AR(1) model. So, for example, if a time series has the AR(2) representation

$y_t = c + y_{t-1} - y_{t-2} + \varepsilon_t$,

then it is non-stationary because the roots of $1 - x + x^2$ are $(1 \pm i\sqrt{3})/2$ and these have modulus 1, so they lie on the unit circle.
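This root condition is easy to check numerically. Here is a small sketch of my own (the helper name `ar_is_stationary` is illustrative; it assumes NumPy, whose `roots` routine expects coefficients ordered from the highest power down):

```python
import numpy as np

def ar_is_stationary(*alphas):
    """True if all roots of 1 - a1*x - ... - ap*x^p lie outside the unit circle."""
    coeffs = [-a for a in alphas[::-1]] + [1.0]   # [-ap, ..., -a1, 1], highest power first
    roots = np.roots(coeffs)
    # strict inequality with a small tolerance guards against roots on the circle
    return bool(np.all(np.abs(roots) > 1.0 + 1e-9)), roots

for alphas in [(0.5,), (1.0,), (1.0, -1.0)]:  # stable AR(1), random walk, the AR(2) above
    ok, roots = ar_is_stationary(*alphas)
    print(alphas, "stationary" if ok else "non-stationary",
          "|roots| =", np.round(np.abs(roots), 3))
```

For the AR(2) example above, $(\alpha_1, \alpha_2) = (1, -1)$, both roots have modulus 1, so the check correctly reports non-stationarity.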

6 So they are the same as for the AR(1) model with no constant term, already given in (11.4).



It is straightforward to calculate the mean and variance of a stationary AR(2) process, but the covariance calculations are more lengthy.7 For the AR(p) model,

$y_t = c + \alpha_1 y_{t-1} + \alpha_2 y_{t-2} + \cdots + \alpha_p y_{t-p} + \varepsilon_t$. (11.17)

It follows immediately that $E(y_t) = c/(1 - \alpha_1 - \alpha_2 - \cdots - \alpha_p)$ and

$V(y_t) = \gamma_0 = \alpha_1\gamma_1 + \alpha_2\gamma_2 + \cdots + \alpha_p\gamma_p + \sigma^2$, (11.18)

where $\gamma_s = \operatorname{cov}(y_t, y_{t-s})$. For $s > 0$ the autocovariances follow the same $p$th-order difference equation as the process itself, that is,

$\operatorname{cov}(y_t, y_{t-s}) = \gamma_s = \alpha_1\gamma_{s-1} + \alpha_2\gamma_{s-2} + \cdots + \alpha_p\gamma_{s-p}$. (11.19)

Equations (11.18) and (11.19) can be solved to give the variance and autocovariances of a stationary AR(p) process in terms of the model parameters $\alpha_1, \alpha_2, \ldots, \alpha_p$ and $\sigma^2$.
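As an illustration (my own sketch, not the book's), (11.18) and (11.19) for $s = 0, 1, \ldots, p$ form a linear system in $\gamma_0, \ldots, \gamma_p$ once one uses $\gamma_{-s} = \gamma_s$, and NumPy can solve it directly:

```python
import numpy as np

def ar_autocovariances(alphas, sigma2=1.0):
    """Solve (11.18)-(11.19) for gamma_0, ..., gamma_p of a stationary AR(p)."""
    p = len(alphas)
    A = np.zeros((p + 1, p + 1))
    b = np.zeros(p + 1)
    b[0] = sigma2                       # the sigma^2 term in (11.18)
    for s in range(p + 1):
        A[s, s] += 1.0                  # gamma_s on the left-hand side
        for i, a in enumerate(alphas, start=1):
            A[s, abs(s - i)] -= a       # alpha_i * gamma_{|s-i|} moved across
    return np.linalg.solve(A, b)

# AR(1) check against (11.14)-(11.15): gamma_0 = 1/(1 - 0.64), gamma_1 = 0.8 * gamma_0
print(ar_autocovariances([0.8]))        # approx [2.778, 2.222]
print(ar_autocovariances([0.5, 0.2]))   # a stationary AR(2) example
```

The AR(1) case reproduces (11.14) and (11.15), which is a useful sanity check on the general solver.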

The representation (11.13) shows that an AR(1) model is equivalent to an infinite moving average of white noise. The inversion of an AR(1) into an MA(∞) holds for any value of $\alpha$; it is for stationarity that the condition $|\alpha| < 1$ must apply. Similar remarks apply to the general AR(p) model: it is invertible into an infinite moving average representation for any values of the coefficients, but constraints on the coefficients are necessary for stationarity.
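To see the MA(∞) equivalence concretely, the following sketch (an assumed setup, not from the text) reproduces an AR(1) value from the truncated expansion (11.13):

```python
import numpy as np

alpha, c, n, k = 0.6, 0.5, 50, 40       # k = truncation point of the MA expansion
rng = np.random.default_rng(1)
eps = rng.normal(size=n + k)

# AR(1) recursion, started from the unconditional mean c / (1 - alpha)
y = np.full(n + k, c / (1 - alpha))
for t in range(1, n + k):
    y[t] = c + alpha * y[t - 1] + eps[t]

# Truncated MA(infinity) form (11.13): c/(1-alpha) + sum_j alpha^j * eps_{t-j}
t = n + k - 1
ma = c / (1 - alpha) + sum(alpha**j * eps[t - j] for j in range(k))
print(y[t], ma)   # agree to within roughly alpha**k, here about 1e-9
```

The agreement improves geometrically in the truncation point $k$ when $|\alpha| < 1$, which is exactly why stationarity requires that condition.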

Note that for the AR(1) model the stationarity condition $|\alpha| < 1$ could have been expressed (less succinctly) in a similar form to the stationarity condition for an AR(2). That is, the root of $1 - \alpha x$ must lie outside the unit circle. In general, the stationarity condition for the AR(p) model is that the roots of $1 - \alpha_1 x - \alpha_2 x^2 - \cdots - \alpha_p x^p$ must lie outside the unit circle. Series that have AR(p) representations containing one or more roots of $1 - \alpha_1 x - \alpha_2 x^2 - \cdots - \alpha_p x^p$ with modulus 1 are non-stationary.8


11.2.2 MA Models

Moving average models have already been encountered in Chapter 3, where they were applied to squared returns and to cross products of returns, with equal or exponentially declining weights. In this way they are used to estimate the unconditional variance or covariance of a time series of returns. In time series analysis a basic building block for models of stationary series is again a moving average, but this time of a white noise process. The first-order moving average model, MA(1), is

7 The interested reader is referred to Hamilton (1994), where it is shown that autocovariances follow the same second-order difference equation as the model itself.

"The term unit root refers to this stationarity condition.


