




X_t = (1 + βL)/(1 - αL) e_t = (1 + βL)(1 + αL + α²L² + ⋯)e_t

    = [1 + (α + β)L + α(α + β)L² + α²(α + β)L³ + ⋯]e_t

Since e_t is a pure random process with variance σ², we get

var(X_t) = [1 + (α + β)² + α²(α + β)² + ⋯]σ²

         = [1 + (α + β)²/(1 - α²)]σ² = [(1 + β² + 2αβ)/(1 - α²)]σ²

Also

cov(X_t, X_{t-1}) = [(α + β) + α(α + β)² + α³(α + β)² + ⋯]σ²

                  = [(α + β) + (α + β)²α/(1 - α²)]σ² = [(α + β)(1 + αβ)/(1 - α²)]σ²

Hence

ρ(1) = cov(X_t, X_{t-1})/var(X_t) = (α + β)(1 + αβ)/(1 + β² + 2αβ)

Successive values of ρ(k) can be obtained from the recurrence relation ρ(k) = αρ(k - 1) for k ≥ 2. For the AR(1) process, ρ(1) = α. It can be verified that ρ(1) for the ARMA(1, 1) process is greater than or less than α depending on whether β > 0 or β < 0, respectively.
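These autocorrelations are easy to check numerically. The following Python sketch (an illustration added here, not part of the text; the function name arma11_acf and the values α = 0.6, β = 0.4 are my own choices) computes ρ(1) from the expression above and then applies the recurrence ρ(k) = αρ(k - 1).

import numpy as np

def arma11_acf(alpha, beta, nlags=10):
    # Theoretical autocorrelations of X_t = alpha*X_{t-1} + e_t + beta*e_{t-1}
    rho = np.empty(nlags + 1)
    rho[0] = 1.0
    # rho(1) from the closed-form expression derived above
    rho[1] = (alpha + beta) * (1 + alpha * beta) / (1 + beta**2 + 2 * alpha * beta)
    # rho(k) = alpha * rho(k - 1) for k >= 2
    for k in range(2, nlags + 1):
        rho[k] = alpha * rho[k - 1]
    return rho

print(arma11_acf(0.6, 0.4, nlags=5))

With β > 0 the printed ρ(1) ≈ 0.76 exceeds α = 0.6, in line with the remark above.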

Autoregressive Integrated Moving-Average (ARIMA) Processes

In practice, most time series are nonstationary. One procedure that is often used to convert a nonstationary series to a stationary series is successive differencing. Let us define the operator Δ = 1 - L, so that ΔX_t = X_t - X_{t-1} and

Δ²X_t = Δ(X_t - X_{t-1}) = (X_t - X_{t-1}) - (X_{t-1} - X_{t-2})   and so on

Suppose that the differenced series Δ^d X_t is a stationary series that can be represented by an ARMA(p, q) model. Then we say that X_t can be represented by an ARIMA(p, d, q) model. The model is called an integrated model because the stationary ARMA model that is fitted to the differenced data has to be summed or "integrated" to provide a model for the nonstationary data. Actually, even if there is no need for a moving-average component in modeling X_t, the procedure of differencing X_t will produce a moving-average process (the Slutsky effect mentioned in our discussion of the MA process).
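As a small illustration of successive differencing (not from the text; the simulated random walk with drift is just an example), the Python sketch below applies the Δ operator with numpy.diff:

import numpy as np

rng = np.random.default_rng(0)
e = rng.normal(size=500)

# A nonstationary series for illustration: random walk with drift
x = 0.1 * np.arange(500) + np.cumsum(e)

dx = np.diff(x)        # first difference, X_t - X_{t-1}
d2x = np.diff(x, 2)    # second difference, if one difference is not enough

# The level series has a drifting mean; the first difference does not
print(x[:250].mean(), x[250:].mean())    # far apart
print(dx[:250].mean(), dx[250:].mean())  # both close to the drift of 0.1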

13.5 Estimation of AR, MA, and ARMA Models

The estimation of AR models is straightforward. We estimate them by ordinary least squares, minimizing Σe_t². The only problem is the choice of the degree of autoregression; this is discussed in Section 13.7. There is also a loss in the number of observations used as the lag length increases, but this is not a problem if we have a long time series.
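A minimal Python sketch of this OLS estimation, using a simulated AR(2) series (the data, the lag length, and the function name fit_ar_ols are illustrative assumptions, not from the text); note that p observations are lost when the lagged regressors are formed, which is the loss mentioned above.

import numpy as np

def fit_ar_ols(x, p):
    # Regress x_t on a constant and x_{t-1}, ..., x_{t-p} by ordinary least squares;
    # the first p observations are lost in building the lagged regressors.
    y = x[p:]
    X = np.column_stack([np.ones(len(y))] + [x[p - j:-j] for j in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef   # [intercept, a_1, ..., a_p]

# Check on a simulated AR(2) series with a_1 = 0.5, a_2 = 0.3 (illustrative values)
rng = np.random.default_rng(1)
x = np.zeros(2000)
for t in range(2, 2000):
    x[t] = 0.5 * x[t - 1] + 0.3 * x[t - 2] + rng.normal()
print(fit_ar_ols(x, p=2))   # roughly [0, 0.5, 0.3]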



Estimation of MA Models

We shall illustrate the estimation of MA models with a simple second-order MA process. Consider the MA(2) model

X_t = μ + e_t + β_1 e_{t-1} + β_2 e_{t-2}

In the case of MA models, we cannot write the error sum of squares Σe_t² as simply a function of the observed x's and the parameters, as in the case of AR models. What we can do is write down the covariance matrix of the moving-average errors and, assuming normality, use the maximum likelihood method of estimation.* An alternative procedure, suggested by Box and Jenkins,† is the grid-search procedure. In this procedure we compute e_t by successive substitution for each set of values of (β_1, β_2), given some initial values, say μ̂ = x̄ and e_0 = e_{-1} = 0. We then have, for the MA(2) model,

e_1 = x_1 - μ̂

e_2 = x_2 - μ̂ - β_1 e_1

e_t = x_t - μ̂ - β_1 e_{t-1} - β_2 e_{t-2}     for t ≥ 3

Thus successive values of e_t can be generated, and Σe_t² can be computed for each set of values of (β_1, β_2). The grid search is conducted over the admissible range of values for (β_1, β_2) given by equations (13.1), and the set of values (β_1, β_2) that minimizes Σe_t² is chosen. This grid-search procedure is, of course, not very practicable if we have many parameters in the MA process. However, in practice one usually uses a low-order MA process or a low-order MA component in an ARMA process.
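The sketch below implements this grid search in Python for illustration. The grid step, the simulated data, the crude invertibility check (standing in for equations (13.1), which are not reproduced here), and the names ma2_errors and grid_search_ma2 are my own assumptions.

import numpy as np

def ma2_errors(x, b1, b2, mu):
    # e_t = x_t - mu - b1*e_{t-1} - b2*e_{t-2}, starting from e_0 = e_{-1} = 0
    # (indices here are 0-based)
    e = np.zeros(len(x))
    for t in range(len(x)):
        e_1 = e[t - 1] if t >= 1 else 0.0
        e_2 = e[t - 2] if t >= 2 else 0.0
        e[t] = x[t] - mu - b1 * e_1 - b2 * e_2
    return e

def grid_search_ma2(x, step=0.05):
    mu = x.mean()                      # mu-hat = sample mean, as in the text
    best = None
    for b1 in np.arange(-2.0, 2.0 + step, step):
        for b2 in np.arange(-1.0, 1.0 + step, step):
            # rough invertibility region, standing in for equations (13.1)
            if not (b1 + b2 > -1 and b2 - b1 > -1 and abs(b2) < 1):
                continue
            sse = np.sum(ma2_errors(x, b1, b2, mu) ** 2)
            if best is None or sse < best[0]:
                best = (sse, b1, b2)
    return best                        # (minimum SSE, b1, b2)

# Illustrative data: an MA(2) series with beta_1 = 0.5, beta_2 = 0.3 and mean 10
rng = np.random.default_rng(2)
e = rng.normal(size=502)
x = 10 + e[2:] + 0.5 * e[1:-1] + 0.3 * e[:-2]
print(grid_search_ma2(x))              # (b1, b2) should come out near (0.5, 0.3)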

*For this see D. R. Osborn, "Maximum Likelihood Estimation of Moving Average Processes," Annals of Economic and Social Measurement, Vol. 5, 1976, pp. 75-87, and J. E. H. Davidson, "Problems with the Estimation of Moving Average Processes," Journal of Econometrics, Vol. 19, 1981, pp. 295-310.

†G. E. P. Box and G. M. Jenkins, Time Series Analysis, Forecasting and Control, rev. ed. (San Francisco: Holden-Day, 1976), Chap. 7.




Estimation of ARMA Models

We can now consider the estimation of ARMA models. Again, the problem is with the MA component: either we write down the covariance matrix for the errors in the MA component and use ML methods, or we use the grid-search procedure. We shall discuss the latter procedure. Consider an ARMA(2, 2) model

X_t = α_1 X_{t-1} + α_2 X_{t-2} + e_t + β_1 e_{t-1} + β_2 e_{t-2}

This can be written as

(1 - α_1 L - α_2 L²)X_t = e_t + β_1 e_{t-1} + β_2 e_{t-2}

or

X_t = [1/(1 - α_1 L - α_2 L²)](e_t + β_1 e_{t-1} + β_2 e_{t-2})     (13.5)

Define

Z_t = [1/(1 - α_1 L - α_2 L²)]e_t

Multiplying both sides by (1 - α_1 L - α_2 L²), we get

Z_t - α_1 Z_{t-1} - α_2 Z_{t-2} = e_t     (13.6)

Also, from (13.5) we have

X_t = Z_t + β_1 Z_{t-1} + β_2 Z_{t-2}

or

Z_t = X_t - β_1 Z_{t-1} - β_2 Z_{t-2}

The grid-search procedure is as follows. Starting with Z_0 = Z_{-1} = 0, we generate successive values of Z_t for different sets of values of (β_1, β_2) in the region given by equations (13.1) as follows:

Z_1 = X_1

Z_2 = X_2 - β_1 Z_1

Z_t = X_t - β_1 Z_{t-1} - β_2 Z_{t-2}     for t ≥ 3

We then use the generated Z_t to estimate the parameters (α_1, α_2) in (13.6) by ordinary least squares. We choose those values of (β_1, β_2) that minimize Σe_t². The corresponding values α̂_1 and α̂_2 give the estimates of α_1 and α_2.
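A compact Python sketch of this procedure for the ARMA(2, 2) case follows; the grid limits and step, and the names generate_z and fit_arma22_grid, are illustrative choices, and in practice (β_1, β_2) would be restricted to the region given by equations (13.1).

import numpy as np

def generate_z(x, b1, b2):
    # Z_t = X_t - b1*Z_{t-1} - b2*Z_{t-2}, starting from Z_0 = Z_{-1} = 0
    z = np.zeros(len(x))
    for t in range(len(x)):
        z_1 = z[t - 1] if t >= 1 else 0.0
        z_2 = z[t - 2] if t >= 2 else 0.0
        z[t] = x[t] - b1 * z_1 - b2 * z_2
    return z

def fit_arma22_grid(x, step=0.1):
    best = None
    for b1 in np.arange(-1.9, 2.0, step):
        for b2 in np.arange(-0.9, 1.0, step):
            z = generate_z(x, b1, b2)
            # OLS of Z_t on Z_{t-1} and Z_{t-2}, i.e. equation (13.6)
            y = z[2:]
            Z = np.column_stack([z[1:-1], z[:-2]])
            a, *_ = np.linalg.lstsq(Z, y, rcond=None)
            sse = np.sum((y - Z @ a) ** 2)
            if best is None or sse < best[0]:
                best = (sse, a[0], a[1], b1, b2)
    return best   # (minimum SSE, a1-hat, a2-hat, b1, b2)

Given a series x, fit_arma22_grid(x) returns the minimum Σe_t² together with the OLS estimates of (α_1, α_2) and the chosen (β_1, β_2).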

For ARIMA models the procedure described above is used after successively differencing the given series until it is stationary. We have discussed the grid-search procedure here. Given the current high-speed computers, it is possible


