




(1 - \alpha_1 L - \alpha_2 L^2 - \cdots - \alpha_r L^r) X_t = e_t \qquad (13.3)

so that

X_t = \frac{e_t}{1 - \alpha_1 L - \alpha_2 L^2 - \cdots - \alpha_r L^r} = \frac{e_t}{(1 - \pi_1 L)(1 - \pi_2 L) \cdots (1 - \pi_r L)}

where π₁, π₂, ..., π_r are the roots of the equation

z^r - \alpha_1 z^{r-1} - \alpha_2 z^{r-2} - \cdots - \alpha_r = 0

The condition that the expansion of (13.3) is valid and the variance of X_t is finite is that |π_i| < 1 for all i.

To find the acvf, we could expand (13.2), but the expressions are messy. An alternative procedure is to assume that the process is stationary and see what the ρ(k) are.
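The expansion condition can be checked numerically. The sketch below uses the AR(2) coefficients α₁ = 1.0, α₂ = -0.5 from the example later in this section: it computes the moduli of the roots π₁, π₂ and the implied MA(∞) weights, which die out geometrically exactly when |π_i| < 1.

```python
import numpy as np

# Coefficients of the AR(2) example used later in this section.
a1, a2 = 1.0, -0.5

# pi_1, pi_2 are the roots of z^2 - a1*z - a2 = 0; the expansion of (13.3)
# is valid when every |pi_i| < 1.
roots = np.roots([1.0, -a1, -a2])
print(np.abs(roots))  # both moduli equal sqrt(0.5), so the condition holds

# The implied MA(infinity) weights psi_j (X_t = sum_j psi_j * e_{t-j}) satisfy
# psi_0 = 1, psi_1 = a1, psi_j = a1*psi_{j-1} + a2*psi_{j-2}; with |pi_i| < 1
# they die out, which is what makes var(X_t) finite.
psi = [1.0, a1]
for _ in range(2, 30):
    psi.append(a1 * psi[-1] + a2 * psi[-2])
print(abs(psi[-1]))
```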

E. E. Slutsky, "The Summation of Random Causes as the Source of Cyclic Processes," Econometrica, Vol. 5, 1937, pp. 105-146.

[The last condition is derived from the fact that β₂ = π₁π₂, the product of the roots. The first two conditions are derived from the fact that if β₁² - 4β₂ > 0, then (β₁² - 4β₂) < (2 + β₁)² or (β₁² - 4β₂) < (2 - β₁)².]

Moving-average processes arise in econometrics mostly through trend-elimination methods. One procedure often used for trend elimination is successive differencing of the time series X_t. If we have

X_t = \alpha_0 + \alpha_1 t + \alpha_2 t^2 + e_t

where e_t is a purely random process, successive differencing of X_t will eliminate the trend, but the resulting series is a moving-average process that can show a cycle. Thus the trend-eliminated series can show a cycle even when there was none in the original series. This phenomenon of spurious cycles is known as the Slutsky effect.
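A small simulation makes the Slutsky effect concrete (a sketch; the trend coefficients are arbitrary illustrative values). Twice-differencing a quadratic trend plus white noise yields Δ²X_t = 2α₂ + e_t - 2e_{t-1} + e_{t-2}, an MA(2) whose theoretical autocorrelations are ρ(1) = -2/3 and ρ(2) = 1/6, so the detrended series oscillates even though e_t contains no cycle.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
t = np.arange(n, dtype=float)

# Quadratic trend plus a purely random process e_t (no cycle in e_t).
e = rng.standard_normal(n)
x = 1.0 + 0.5 * t + 0.01 * t**2 + e

# Second differencing eliminates the trend but induces an MA(2):
# d2x_t = 0.02 + e_t - 2*e_{t-1} + e_{t-2}.
d2x = np.diff(x, n=2)

def acf(y, k):
    """Sample autocorrelation of y at lag k."""
    y = y - y.mean()
    return float(np.dot(y[:-k], y[k:]) / np.dot(y, y))

# MA(2) weights (1, -2, 1) give rho(1) = -4/6 and rho(2) = 1/6.
print(acf(d2x, 1), acf(d2x, 2))
```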

Autoregressive (AR) Process

Suppose again that {e_t} is a purely random process with mean zero and variance σ². Then the process {X_t} given by

X_t = \alpha_1 X_{t-1} + \alpha_2 X_{t-2} + \cdots + \alpha_r X_{t-r} + e_t \qquad (13.2)

is called an autoregressive process of order r and is denoted by AR(r). Since the expression is like a multiple regression equation, it is called "regressive." However, it is a regression of X_t on its own past values. Hence it is autoregressive.

In terms of the lag operator L, the AR process (13.2) can be written as

X_t = (\alpha_1 L + \alpha_2 L^2 + \cdots + \alpha_r L^r) X_t + e_t



For the AR(2) process the roots are

\pi = \frac{\alpha_1 \pm \sqrt{\alpha_1^2 + 4\alpha_2}}{2}

This gives

α₁ + α₂ < 1
α₂ - α₁ < 1 \qquad (13.4)
|α₂| < 1

[The conditions are similar to the conditions (13.1) derived for the invertibility of the MA(2) process.]
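The equivalence between conditions (13.4) and the root condition |π_i| < 1 can be verified numerically; a sketch (the function names are mine):

```python
import numpy as np

def ar2_conditions(a1, a2):
    """Weak-stationarity conditions (13.4) for X_t = a1 X_{t-1} + a2 X_{t-2} + e_t."""
    return a1 + a2 < 1 and a2 - a1 < 1 and abs(a2) < 1

def ar2_roots_inside(a1, a2):
    """Equivalent check: the roots pi of z^2 - a1 z - a2 = 0 satisfy |pi| < 1."""
    return bool(np.all(np.abs(np.roots([1.0, -a1, -a2])) < 1))

# Agreement on stationary and non-stationary parameter pairs, including
# the example a1 = 1.0, a2 = -0.5 used below.
for a1, a2 in [(1.0, -0.5), (0.5, 0.3), (1.2, 0.3), (0.0, -1.5)]:
    assert ar2_conditions(a1, a2) == ar2_roots_inside(a1, a2)
print(ar2_conditions(1.0, -0.5))
```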

In the case of the AR(2) process we can also obtain the ρ(k) recursively using the Yule-Walker equations. We know that ρ(0) = 1 and

ρ(1) = α₁ρ(0) + α₂ρ(-1) = α₁ρ(0) + α₂ρ(1)

or

ρ(1) = \frac{\alpha_1}{1 - \alpha_2}

Thus

ρ(2) = α₁ρ(1) + α₂ρ(0) = \frac{\alpha_1^2}{1 - \alpha_2} + \alpha_2

ρ(3) = α₁ρ(2) + α₂ρ(1) = \frac{\alpha_1^3}{1 - \alpha_2} + \alpha_1 \alpha_2 + \frac{\alpha_1 \alpha_2}{1 - \alpha_2}

and so on.
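The recursion can be checked in exact arithmetic against the closed-form expressions above; a sketch with arbitrary coefficients satisfying (13.4):

```python
from fractions import Fraction as F

def rho_ar2(a1, a2, kmax):
    """rho(k), k = 0..kmax, from the Yule-Walker recursion
    rho(k) = a1*rho(k-1) + a2*rho(k-2), with rho(0) = 1, rho(1) = a1/(1 - a2)."""
    r = [F(1), a1 / (1 - a2)]
    for _ in range(2, kmax + 1):
        r.append(a1 * r[-1] + a2 * r[-2])
    return r

a1, a2 = F(1, 2), F(1, 4)   # arbitrary values satisfying (13.4)
r = rho_ar2(a1, a2, 3)

# Match the closed forms for rho(2) and rho(3) derived above.
assert r[2] == a1**2 / (1 - a2) + a2
assert r[3] == a1**3 / (1 - a2) + a1 * a2 + a1 * a2 / (1 - a2)
print(r)  # [Fraction(1, 1), Fraction(2, 3), Fraction(7, 12), Fraction(11, 24)]
```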

As an example, consider the AR(2) process

X_t = 1.0 X_{t-1} - 0.5 X_{t-2} + e_t

Here α₁ = 1.0 and α₂ = -0.5. Note that conditions (13.4) for weak stationarity are satisfied. However, since α₁² + 4α₂ < 0, the roots are complex and ρ(k) will be a sinusoidal function.

To derive the ρ(k), we multiply equation (13.2) throughout by X_{t-k}, take expectations of all the terms, and divide throughout by var(X_t), which is assumed finite. This gives us

ρ(k) = α₁ρ(k - 1) + α₂ρ(k - 2) + ⋯ + α_rρ(k - r)

Substituting k = 1, 2, ..., r and noting that ρ(k) = ρ(-k), we get equations to determine the r parameters α₁, α₂, ..., α_r. These equations are known as the Yule-Walker equations. To illustrate these procedures we will consider an AR(2) process

X_t = \alpha_1 X_{t-1} + \alpha_2 X_{t-2} + e_t

Here π₁ and π₂ are the roots of the equation

z^2 - \alpha_1 z - \alpha_2 = 0

Thus |π_i| < 1 implies the stationarity conditions (13.4).
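Conversely, given ρ(1) and ρ(2), the Yule-Walker equations ρ(1) = α₁ + α₂ρ(1) and ρ(2) = α₁ρ(1) + α₂ form two linear equations in α₁ and α₂. A sketch, using the ρ values implied by the example with α₁ = 1.0, α₂ = -0.5:

```python
import numpy as np

# Yule-Walker system for r = 2:
#   rho(1) = a1*rho(0) + a2*rho(1)
#   rho(2) = a1*rho(1) + a2*rho(0)
rho1, rho2 = 2 / 3, 1 / 6   # values implied by a1 = 1.0, a2 = -0.5

A = np.array([[1.0, rho1],
              [rho1, 1.0]])
a1, a2 = np.linalg.solve(A, np.array([rho1, rho2]))
print(a1, a2)  # recovers a1 = 1.0, a2 = -0.5
```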



ρ(2)  =  0.1666        ρ(8)  =  0.0624
ρ(3)  = -0.1666        ρ(9)  =  0.0416
ρ(4)  = -0.25          ρ(10) =  0.0104
ρ(5)  = -0.1666        ρ(11) = -0.0104
ρ(6)  = -0.0416        ρ(12) = -0.0156
ρ(7)  =  0.0416        ρ(13) = -0.0104

This method can be used whether the roots are real or complex. A plot of this correlogram is left as an exercise.
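The tabulated values can be reproduced in a few lines; a sketch using the recurrence with ρ(0) = 1 and ρ(1) = α₁/(1 - α₂):

```python
# Correlogram of X_t = 1.0 X_{t-1} - 0.5 X_{t-2} + e_t from the
# Yule-Walker recurrence rho(k) = rho(k-1) - 0.5*rho(k-2).
a1, a2 = 1.0, -0.5
rho = [1.0, a1 / (1 - a2)]            # rho(0) = 1, rho(1) = 2/3
for k in range(2, 14):
    rho.append(a1 * rho[k - 1] + a2 * rho[k - 2])

for k, r in enumerate(rho):
    print(f"rho({k:2d}) = {r:+.4f}")  # the oscillating pattern of the table
```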

Autoregressive Moving-Average (ARMA) Processes

We will now discuss models that are combinations of the AR and MA models. These are called ARMA models. An ARMA(p, q) model is defined as

X_t = \alpha_1 X_{t-1} + \cdots + \alpha_p X_{t-p} + e_t + \beta_1 e_{t-1} + \cdots + \beta_q e_{t-q}

where {e_t} is a purely random process with mean zero and variance σ². The motivation for these models is that they lead to parsimonious representations of higher-order AR(p) or MA(q) processes. Using the lag operator L, we can write this as

\phi(L) X_t = \theta(L) e_t

where φ(L) and θ(L) are polynomials of orders p and q, respectively, defined as

\phi(L) = 1 - \alpha_1 L - \alpha_2 L^2 - \cdots - \alpha_p L^p
\theta(L) = 1 + \beta_1 L + \beta_2 L^2 + \cdots + \beta_q L^q

For stationarity we require that the roots of φ(z) = 0 lie outside the unit circle. For invertibility of the MA component, we require that the roots of θ(z) = 0 lie outside the unit circle. For instance, for the ARMA(2, 2) process these conditions are given by equations (13.1) and (13.4). The acvf and acf of an ARMA model are more complicated than those of an AR or MA model.

We will derive the acf for the simplest case: the ARMA(1, 1) process

X_t = \alpha_1 X_{t-1} + e_t + \beta_1 e_{t-1}

In terms of the lag operator L this can be written as

X_t - \alpha_1 X_{t-1} = e_t + \beta_1 e_{t-1}

or

(1 - \alpha_1 L) X_t = (1 + \beta_1 L) e_t
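Although the derivation is not completed in this excerpt, the standard result for the ARMA(1, 1) acf is ρ(1) = (1 + α₁β₁)(α₁ + β₁)/(1 + β₁² + 2α₁β₁) and ρ(k) = α₁ρ(k - 1) for k ≥ 2. A simulation sketch (with arbitrary coefficients α₁ = 0.5, β₁ = 0.3 meeting the root conditions) agrees with it:

```python
import numpy as np

a1, b1 = 0.5, 0.3
rng = np.random.default_rng(1)
n = 200_000

# Simulate X_t = a1*X_{t-1} + e_t + b1*e_{t-1}.
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = a1 * x[t - 1] + e[t] + b1 * e[t - 1]

def acf(y, k):
    """Sample autocorrelation of y at lag k."""
    y = y - y.mean()
    return float(np.dot(y[:-k], y[k:]) / np.dot(y, y))

# Closed-form acf of the ARMA(1, 1) process.
rho1 = (1 + a1 * b1) * (a1 + b1) / (1 + b1**2 + 2 * a1 * b1)
print(acf(x, 1), rho1)          # sample vs. theoretical, close for large n
print(acf(x, 2), a1 * rho1)
```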

Since the roots of the AR(2) example above are complex, ρ(k) is a sinusoidal function. A convenient method to derive the ρ(k) is to use the recurrence relation (also known as the Yule-Walker relation)

ρ(k) = ρ(k - 1) - 0.5ρ(k - 2)

noting that ρ(0) = 1 and ρ(1) = α₁/(1 - α₂) = 0.6666. This yields the values of ρ(k) tabulated above.


