




-D. F. Hendry, "Comments on the Papers by Granger-Newbold and Sargent-Sims." in New Methods in Business Cycle Research (Minneapolis: Federal Reserve Bank of Minneapolis, October 1977).

-D. F. Hendry and P. K. Trivedi, "Maximum Likelihood Estimation of Difference Equations with Moving Average Errors: A Simulation Study," The Review of Economic Studies, Vol. 39, April 1972, pp. 117-145.

One other question that remains is that of moving average (MA) and ARMA errors. Estimation with MA or ARMA errors is more complicated than with AR errors. Moreover, Hendry, and Hendry and Trivedi, argue that it is the order of the error process, rather than its particular form, that matters most. Thus, from the practical point of view, for most economic data it is sufficient to determine the order of the AR process. If a significant DW statistic is observed, the appropriate strategy is therefore to check whether the errors are generated by an AR process of order higher than AR(1), and then undertake estimation.
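A minimal sketch of that strategy, assuming Python with statsmodels (the data are simulated placeholders): fit OLS, inspect the DW statistic, then run the Breusch-Godfrey test against AR(p) alternatives of increasing order to judge the order of the error process.

```python
# Sketch: after a significant DW statistic, probe the order of the AR error
# process with the Breusch-Godfrey test before choosing a transformation.
# The data below are simulated placeholders with AR(2) errors.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import acorr_breusch_godfrey
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
u = np.zeros(n)
for t in range(2, n):                      # AR(2) errors to be detected
    u[t] = 0.5 * u[t - 1] + 0.3 * u[t - 2] + rng.normal()
y = 1.0 + 2.0 * x + u

res = sm.OLS(y, sm.add_constant(x)).fit()
print("DW:", durbin_watson(res.resid))
for p in (1, 2, 3, 4):
    lm, lm_pval, _, _ = acorr_breusch_godfrey(res, nlags=p)
    print(f"BG vs. AR({p}): LM = {lm:.2f}, p = {lm_pval:.4f}")
```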

Autocorrelation Caused by Omitted Variables

Case 2, serial correlation being caused by omitted variables, is rather difficult to tackle. It is often asserted that the source of serial correlation in the errors is that some variables that should have been included in the equation are omitted and that these omitted variables are themselves autocorrelated. However, if this is the argument for serial correlation (and it is an appealing one), we should be careful about applying the methods we have discussed so far. Suppose that the true regression equation is

$y_t = \beta_0 + \beta_1 x_t + \beta_2 x_t^2 + u_t$

and instead we estimate

$y_t = \beta_0 + \beta_1 x_t + v_t \qquad (6.20)$

Then, since $v_t = \beta_2 x_t^2 + u_t$, if $x_t$ is autocorrelated, this will produce autocorrelation in $v_t$. However, $v_t$ is no longer independent of $x_t$. Thus not only are the OLS estimators of $\beta_0$ and $\beta_1$ from (6.20) inefficient, they are inconsistent as well.
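A small simulation may make the inconsistency concrete. All numbers are hypothetical; the point is only that the slope estimate does not approach $\beta_1$ as the sample grows, because the omitted $x_t^2$ term is correlated with $x_t$ (here $x_t$ has nonzero mean, so $\operatorname{cov}(x_t, x_t^2) \ne 0$).

```python
# Sketch: OLS omitting a term correlated with the included regressor is
# inconsistent -- the bias does not shrink as n grows. Hypothetical values.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
b0, b1, b2 = 1.0, 2.0, 0.5
for n in (100, 10_000, 1_000_000):
    x = rng.normal(loc=1.0, size=n)            # E[x] = 1, so cov(x, x^2) != 0
    y = b0 + b1 * x + b2 * x**2 + rng.normal(size=n)
    res = sm.OLS(y, sm.add_constant(x)).fit()  # x^2 wrongly omitted
    print(f"n = {n:>9,}: slope = {res.params[1]:.3f}  (true beta_1 = {b1})")
```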

As yet another example, suppose that the true relation is

$y_t = \beta_1 x_t + \beta_2 z_t + u_t \qquad (6.21)$

and we estimate

$y_t = \beta_1 x_t + w_t \qquad (6.22)$

Again, if $z_t$ is autocorrelated, $w_t$ will be too. But if $z_t$ and $x_t$ are independent, the methods we have discussed earlier are applicable. Thus, to justify the methods of estimation we have discussed, we have to argue that the autocorrelated omitted variables that are producing the autocorrelation in the errors are uncorrelated with the included explanatory variables. Further, if there are any time trends in these omitted variables, they will produce not only autocorrelated errors but also heteroskedastic errors.

In equation (6.21) let us assume that the errors $u_t$ are independent with a common variance $\sigma_u^2$. However, we estimate equation (6.22) and compute the DW statistic $d$. What can we say about it? Note that since the least squares residuals are always uncorrelated with the included variables (by virtue of the normal equations), the DW statistic $d$ is determined not by the autocorrelation in $z_t$ but by the autocorrelation in $z_t^*$, which is that part of $z_t$ left unexplained by $x_t$.

Consider a regression of $z_t$ on $x_t$. Let the regression coefficient be denoted by $b$. Then $z_t = b x_t + z_t^*$, where $z_t^*$ is the residual from a regression of $z_t$ on $x_t$. Equation (6.21) can be written as

$y_t = \beta_1 x_t + \beta_2 (b x_t + z_t^*) + u_t \qquad (6.23)$

$\;\;\; = (\beta_1 + \beta_2 b) x_t + w_t$

where $w_t = \beta_2 z_t^* + u_t$.

If we estimate (6.22) by OLS and $\hat{\beta}_1$ is the OLS estimator of $\beta_1$, then $E(\hat{\beta}_1) = \beta_1 + \beta_2 b$, and the residual $\hat{w}_t$ would be estimating $\beta_2 z_t^* + u_t$. Let $\operatorname{var}(z_t^*) = \sigma_*^2$ and $\operatorname{cov}(z_t^*, z_{t-1}^*) = \rho^* \sigma_*^2$. Then, since $\operatorname{cov}(z_t^*, u_t) = 0$, we have

$\operatorname{cov}(\beta_2 z_t^* + u_t,\; \beta_2 z_{t-1}^* + u_{t-1}) = \beta_2^2 \rho^* \sigma_*^2$

$\operatorname{var}(\beta_2 z_t^* + u_t) = \beta_2^2 \sigma_*^2 + \sigma_u^2$

The first-order serial correlation in $w_t$ would therefore be

$\rho_w = \dfrac{\rho^*}{1 + \sigma_u^2 / \beta_2^2 \sigma_*^2}$

If $d$ is the DW statistic from OLS estimation of (6.22), then

$\operatorname{plim} d = 2(1 - \rho_w)$

Note that the observed serial correlation depends on the serial correlation in $z_t^*$ (not in $z_t$, the omitted variable) and on the variance ratio $\sigma_u^2 / \beta_2^2 \sigma_*^2$. If this variance ratio is large, then even if $\rho^*$ is high, $\rho_w$ can be small.
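The formula is easy to check numerically. A sketch with hypothetical parameter values, verifying both $E(\hat{\beta}_1) = \beta_1 + \beta_2 b$ and $\operatorname{plim} d = 2(1 - \rho_w)$:

```python
# Sketch: simulate (6.21), estimate (6.22), and compare the observed DW
# statistic with 2(1 - rho_w). All parameter values are hypothetical.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(2)
n, b1, b2, b = 100_000, 1.0, 0.8, 0.3
rho_star, sig_u, sig_star = 0.9, 1.0, 0.5

x = rng.normal(size=n)
zstar = np.zeros(n)                  # AR(1), independent of x by construction
for t in range(1, n):
    zstar[t] = rho_star * zstar[t - 1] + rng.normal(
        scale=sig_star * np.sqrt(1 - rho_star**2))
z = b * x + zstar                    # regression of z on x has coefficient b
u = rng.normal(scale=sig_u, size=n)
y = b1 * x + b2 * z + u              # true model (6.21)

res = sm.OLS(y, x).fit()             # estimated model (6.22), z omitted
rho_w = rho_star / (1 + sig_u**2 / (b2**2 * sig_star**2))
print("slope           :", round(res.params[0], 3),
      " vs beta_1 + beta_2*b =", b1 + b2 * b)
print("predicted plim d:", round(2 * (1 - rho_w), 3))
print("observed DW     :", round(durbin_watson(res.resid), 3))
```

With these values $\rho_w \approx 0.12$ even though $\rho^* = 0.9$: the variance ratio $\sigma_u^2 / \beta_2^2 \sigma_*^2 = 6.25$ dilutes the observed serial correlation, which is exactly the point made above.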

One can test for omitted variables using the RESET test of Ramsey or White's test outlined in Section 5.2. If the DW test statistic is significant and these tests also show significance, the appropriate strategy would be to estimate the model by some general procedure, like the one described in Section 5.4, rather than use a transformation based on the estimated first-order autocorrelation.
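As a sketch of how these checks might be run together, assuming the statsmodels implementations of the RESET and White tests (the data, with an omitted autocorrelated $x_t^2$ term, are hypothetical):

```python
# Sketch: DW, Ramsey's RESET, and White's test on a model with an omitted,
# autocorrelated term. Simulated placeholder data.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import linear_reset, het_white
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(3)
n = 500
x = np.zeros(n)
for t in range(1, n):                       # autocorrelated regressor
    x[t] = 1.0 + 0.8 * (x[t - 1] - 1.0) + rng.normal(scale=0.5)
y = 1.0 + 2.0 * x + 1.0 * x**2 + rng.normal(size=n)

res = sm.OLS(y, sm.add_constant(x)).fit()   # x^2 wrongly omitted
print("DW   :", durbin_watson(res.resid))
print("RESET:", linear_reset(res, power=3, use_f=True))
lm, lm_pval, _, _ = het_white(res.resid, res.model.exog)
print(f"White: LM = {lm:.2f}, p = {lm_pval:.4f}")
# Significant RESET/White alongside a low DW points to misspecification,
# not a "pure" AR(1) error, so an autoregressive transformation is not the fix.
```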

This formula has been derived in a more general case in M. Chaudhuri, "Autocorrelated Disturbances in the Light of Specification Analysis," Journal of Econometrics, Vol. 5, 1977, pp. 301-313.



J. D. Sargan, "Wages and Prices in the United Kingdom: A Study in Econometric Methodology," in P. E. Hart, G. Mills, and J. K. Whitaker (eds.). Econometric Analysis for National Economic Planning, Colston Papers 16 (London: Butterworth, 1964). pp. 25-54. "D. F. Hendry and G. E. Mizon, "Serial Correlation as a Convenient Simplification, Not a Nuisance: A Comment on a Study of the Demand for Money by the Bank of England," Economic Journal, Vol. 88, September 1978, pp. 549-563.

J. G. Thursby, "A Test Strategy for Discriminating Between Autocorrelation and Misspecification in Regression Analysis," Review of Economics and Statistics, Vol. 63, 1981, pp. 117-123, considers the use of the DW, Ramsey's RESET, and Sargan's tests together. This is useful for warning against autoregressive transformations based on the DW statistic, but it does not tell us what the estimation strategy should be after the tests.

Serial Correlation Due to Misspecified Dynamics

In a seminal paper published in 1964, Sargan pointed out that a significant DW statistic does not necessarily imply that we have a serial correlation problem. This point was also emphasized by Hendry and Mizon. The argument goes as follows. Consider

$y_t = \beta x_t + u_t \quad \text{with} \quad u_t = \rho u_{t-1} + e_t \qquad (6.24)$

where the $e_t$ are independent with a common variance $\sigma_e^2$. Substituting $u_t = y_t - \beta x_t$ and $u_{t-1} = y_{t-1} - \beta x_{t-1}$ into $u_t = \rho u_{t-1} + e_t$, we can write this model as

$y_t = \rho y_{t-1} + \beta x_t - \rho \beta x_{t-1} + e_t \qquad (6.25)$

Consider an alternative stable dynamic model:

$y_t = \beta_1 y_{t-1} + \beta_2 x_t + \beta_3 x_{t-1} + e_t, \qquad |\beta_1| < 1 \qquad (6.26)$

Equation (6.25) is the same as equation (6.26) with the restriction

$\beta_1 \beta_2 + \beta_3 = 0 \qquad (6.27)$

A test for $\rho = 0$ is a test for $\beta_1 = 0$ (and $\beta_3 = 0$). But before we test this, what Sargan says is that we should first test the restriction (6.27) and test for $\rho = 0$ only if the hypothesis $H_0\colon \beta_1 \beta_2 + \beta_3 = 0$ is not rejected. If this hypothesis is rejected, we do not have a serial correlation model, and the serial correlation in the errors in (6.24) is due to "misspecified dynamics," that is, the omission of the variables $y_{t-1}$ and $x_{t-1}$ from the equation.

The restriction (6.27) is nonlinear in the $\beta$'s, and hence one has to use the Wald, LR, or LM tests. If the DW test statistic is significant, a proper approach is to test the restriction (6.27) to make sure that what we have is a serial correlation model before undertaking any autoregressive transformation of the variables. In fact, Sargan suggests starting with the general model (6.26) and testing the restriction (6.27) first, before attempting any tests for serial correlation.

In this case there is, in general, no exact $F$-test as in the case of linear restrictions. What we do is linearize the restriction by a Taylor series expansion and use what is known as a Wald test, which is an asymptotic test (or use the LR or LM tests).
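A sketch of such a Wald test, computed by hand with the delta method (the data are simulated from (6.24), so the restriction holds in the population; all numbers are hypothetical):

```python
# Sketch: Wald test of the common-factor restriction (6.27),
# g(beta) = beta_1*beta_2 + beta_3 = 0, in the unrestricted model (6.26).
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(4)
n, beta, rho = 400, 2.0, 0.6
x = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = rho * u[t - 1] + rng.normal()
y = beta * x + u                                 # model (6.24)

# Unrestricted dynamic regression (6.26): y_t on y_{t-1}, x_t, x_{t-1}
X = np.column_stack([y[:-1], x[1:], x[:-1]])
res = sm.OLS(y[1:], X).fit()
b1, b2, b3 = res.params
g = b1 * b2 + b3                                 # restriction (6.27)
grad = np.array([b2, b1, 1.0])                   # gradient of g w.r.t. (b1, b2, b3)
wald = g**2 / (grad @ res.cov_params() @ grad)   # ~ chi2(1) under H0
print(f"g = {g:.3f}, Wald = {wald:.2f}, p = {chi2.sf(wald, 1):.4f}")
# Only if (6.27) is not rejected should a significant DW statistic be read
# as an AR(1) error and the test of rho = 0 (i.e., beta_1 = 0) be carried out.
```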


