




In recent years error correction models have been more popular in empirical work than partial adjustment models. The empirical evidence and the economic interpretation favor the former model. Also, one can test the partial adjustment model by testing δ = γ in (10.18).

10.7 Partial Adjustment with Adaptive Expectations

When adjustment and expectational lags are combined, we may have problems identifying their separate effects. We can see the problem by considering the partial adjustment model with adaptive expectations. We could consider the error correction model as well, but the partial adjustment model is simpler. Suppose that

K*_t = desired capital stock at the beginning of period t
S*_t = expected sales in period t

K*_t = β₀ + β₁S*_t + u_t    (10.19)

The partial adjustment model states that

K_t - K_{t-1} = δ(K*_t - K_{t-1})    0 < δ < 1

and substituting for K*_t, we get

K_t = β₀δ + (1 - δ)K_{t-1} + β₁δS*_t + δu_t    (10.20)

If we use the adaptive expectations model

S*_t - S*_{t-1} = λ(S_{t-1} - S*_{t-1})    0 < λ < 1

then, lagging equation (10.20) by one period, multiplying it by (1 - λ), subtracting this from (10.20), and simplifying, we get

K_t = β₀δλ + (1 - δ + 1 - λ)K_{t-1} - (1 - δ)(1 - λ)K_{t-2} + β₁δλS_{t-1} + v_t    (10.21)

where

v_t = δ[u_t - (1 - λ)u_{t-1}]

Now if we add the error term to the final simplified equation (10.21) rather than to (10.20), it is easy to see from (10.21) that δ and λ occur symmetrically, and hence there is some ambiguity in their estimates, which have to be obtained from the coefficients of K_{t-1} and K_{t-2}.
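As a quick check, the following SymPy sketch (a minimal illustration with variable names and numeric values of my own choosing) reproduces the coefficients of (10.21) from (10.20) and the adaptive expectations rule, and shows that interchanging δ and λ leaves the coefficients of K_{t-1}, K_{t-2}, and S_{t-1} unchanged, whereas the error term v_t is not symmetric in the two parameters.

```python
import sympy as sp

# Structural parameters and variables appearing in (10.20)
beta0, beta1, delta, lam = sp.symbols('beta0 beta1 delta lambda')
Kt1, Kt2, St1, Sstar, Sstar1, ut, ut1 = sp.symbols('Kt1 Kt2 St1 Sstar Sstar1 ut ut1')

# (10.20): K_t = beta0*delta + (1 - delta)*K_{t-1} + beta1*delta*S*_t + delta*u_t
K_t   = beta0*delta + (1 - delta)*Kt1 + beta1*delta*Sstar  + delta*ut
K_tm1 = beta0*delta + (1 - delta)*Kt2 + beta1*delta*Sstar1 + delta*ut1   # (10.20) lagged once

# Adaptive expectations: S*_t = lambda*S_{t-1} + (1 - lambda)*S*_{t-1}
K_t = K_t.subs(Sstar, lam*St1 + (1 - lam)*Sstar1)

# Koyck transformation: form K_t - (1 - lambda)*K_{t-1} and add (1 - lambda)*K_{t-1} back,
# so that rhs is the right-hand side of (10.21); the unobservable S*_{t-1} must drop out
rhs = sp.expand(K_t - (1 - lam)*K_tm1 + (1 - lam)*Kt1)
print(rhs.coeff(Sstar1))                  # 0: the expectations variable is eliminated
for var in (Kt1, Kt2, St1, ut, ut1):
    print(var, sp.factor(rhs.coeff(var)))  # 2-d-l, -(1-d)(1-l), b1*d*l, d, -d*(1-l)

# delta and lambda enter the observable coefficients symmetrically ...
for d_, l_ in [(0.3, 0.6), (0.6, 0.3)]:    # hypothetical values
    print((1 - d_) + (1 - l_), -(1 - d_)*(1 - l_), d_*l_)   # identical for both pairs
# ... but not the error term v_t = delta*[u_t - (1 - lambda)*u_{t-1}], which is what the
# distributed lag / moving-average formulation exploits.
```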

"An example of this is D. F. Hendry and T. von Ungern-Sternberg, "Liquidity and Inflation Effects on Consumers Expenditure," in A. S. Deaton (ed.). Essays in the Theory of Measurement of Consumers Behavior (Cambridge: Cambridge University Press, 1981).



Note that this ambiguity arises only if an error term is superimposed on the final simplified equation (10.21) and the equation is estimated by ordinary least squares under the assumption that the errors are serially uncorrelated. On the other hand, we can estimate equation (10.20) in its distributed lag version. For this we use the procedures for estimating adaptive expectations models in distributed lag form described in Section 10.4. Thus, if the model is estimated in the distributed lag form, there is no ambiguity in the estimates of δ and λ.

The preceding discussion of partial adjustment models with adaptive expectations illustrates the point that the specification of the error term cannot be done in a cavalier fashion. The estimation procedures, and whether particular parameters (like δ and λ in our example) are unambiguously estimable, depend on the specification of the error term at the different stages of the modeling process.

Of course, one can argue that there is no reason why the errors u_t in (10.19) should be assumed serially independent. If for some reason one starts with (10.21) and a general specification for the error term v_t, the ambiguity in the estimation of δ, the speed of adjustment, and λ, the reaction of expectations, remains. However, in this case, if equation (10.19) has some other explanatory variables, the parameters λ and δ are identified. For example, suppose that (ignoring the error term, which we will introduce at the end after all the simplifications)

K*_t = β₀ + β₁S*_t + β₂L_t

where L_t is the amount of labor hired. Then, on simplification, and adding an error term v_t at the end, we get

K_t = β₀δλ + (1 - δ + 1 - λ)K_{t-1} - (1 - δ)(1 - λ)K_{t-2} + β₁δλS_{t-1} + β₂δL_t - β₂δ(1 - λ)L_{t-1} + v_t    (10.22)

In this equation δ and λ do not occur symmetrically. Suppose that we write the equation as

K_t = α₁ + α₂K_{t-1} + α₃K_{t-2} + α₄S_{t-1} + α₅L_t + α₆L_{t-1} + v_t    (10.23)

Then

λ = 1 + α₆/α₅

But the problem is that we get two estimates of δ. From the coefficient of K_{t-1} we get

δ = 2 - α₂ - λ

and from the coefficient of K_{t-2} we get

δ = 1 + α₃/(1 - λ)
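As a numerical illustration, with made-up values of β₂, δ, and λ (not estimates from any data set), the α's implied by (10.22) can be mapped back through these formulas; with the exact α's both expressions return the same δ, though with estimated α's the two values will generally differ.

```python
# Hypothetical structural values (illustration only)
beta2, delta, lam = 1.5, 0.4, 0.7

# The alphas of (10.23) implied by (10.22)
alpha2 = (1 - delta) + (1 - lam)         # coefficient of K_{t-1}
alpha3 = -(1 - delta) * (1 - lam)        # coefficient of K_{t-2}
alpha5 = beta2 * delta                   # coefficient of L_t
alpha6 = -beta2 * delta * (1 - lam)      # coefficient of L_{t-1}

lam_hat = 1 + alpha6 / alpha5            # lambda = 1 + alpha6/alpha5
delta_1 = 2 - alpha2 - lam_hat           # delta from the K_{t-1} coefficient
delta_2 = 1 + alpha3 / (1 - lam_hat)     # delta from the K_{t-2} coefficient
print(lam_hat, delta_1, delta_2)         # 0.7, 0.4, 0.4 -- both expressions recover delta
```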

"This is equivalent to estimating (10.21) with a moving average error that depends on \, as specified in the equation for v,.




The problem is that equation (10.23) has six parameters, while our model has only five parameters: β₀, β₁, β₂, λ, and δ. Note, however, that given λ, equation (10.22) can be written as

K̃_t = β₀δλ + (1 - δ)K̃_{t-1} + β₁δλS_{t-1} + β₂δL̃_t + v_t    (10.24)

where

K̃_t = K_t - (1 - λ)K_{t-1}        L̃_t = L_t - (1 - λ)L_{t-1}

The estimation of (10.24) gives us unique estimates of δ, β₀, β₁, and β₂. Thus we can use the following two-step procedure:

1. Estimate (10.23) and get an estimate of λ.

2. Use this estimate to construct K̃_t and L̃_t, and then estimate equation (10.24) to get unique estimates of δ, β₀, β₁, and β₂.
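The following NumPy sketch illustrates the mechanics of this two-step procedure on data simulated directly from the reduced form (10.22), with made-up parameter values and an i.i.d. error v_t (so the complications caused by a moving average error, discussed above, do not arise here).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" parameters (illustration only)
beta0, beta1, beta2, delta, lam = 10.0, 0.8, 1.5, 0.4, 0.7
T = 5000

# Exogenous series: sales S_t (a random walk) and labor L_t
S = 50 + np.cumsum(rng.normal(0, 1, T))
L = 20 + rng.normal(0, 2, T)

# Generate K_t recursively from the reduced form (10.22) with i.i.d. v_t
v = rng.normal(0, 0.5, T)
K = np.zeros(T)
K[0] = K[1] = beta0 + beta1 * S[0] + beta2 * L[0]        # rough starting values
for s in range(2, T):
    K[s] = (beta0 * delta * lam
            + (1 - delta + 1 - lam) * K[s - 1]
            - (1 - delta) * (1 - lam) * K[s - 2]
            + beta1 * delta * lam * S[s - 1]
            + beta2 * delta * L[s]
            - beta2 * delta * (1 - lam) * L[s - 1]
            + v[s])

def ols(y, X):
    """Least squares coefficients of y on X (X already contains a constant column)."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Step 1: estimate (10.23) and recover lambda from the L_t and L_{t-1} coefficients
t = np.arange(2, T)
X23 = np.column_stack([np.ones(len(t)), K[t - 1], K[t - 2], S[t - 1], L[t], L[t - 1]])
a = ols(K[t], X23)                         # a[0], ..., a[5] = alpha_1, ..., alpha_6
lam_hat = 1 + a[5] / a[4]                  # lambda = 1 + alpha_6/alpha_5

# Step 2: construct K-tilde and L-tilde using lam_hat and estimate (10.24)
K_til  = K[t]     - (1 - lam_hat) * K[t - 1]
K_til1 = K[t - 1] - (1 - lam_hat) * K[t - 2]
L_til  = L[t]     - (1 - lam_hat) * L[t - 1]
X24 = np.column_stack([np.ones(len(t)), K_til1, S[t - 1], L_til])
c = ols(K_til, X24)

delta_hat = 1 - c[1]                       # coefficient of K-tilde_{t-1} is 1 - delta
beta2_hat = c[3] / delta_hat
beta1_hat = c[2] / (delta_hat * lam_hat)
beta0_hat = c[0] / (delta_hat * lam_hat)
print(lam_hat, delta_hat, beta0_hat, beta1_hat, beta2_hat)
```

With the values assumed here the printed estimates should be close to λ = 0.7, δ = 0.4, β₀ = 10, β₁ = 0.8, and β₂ = 1.5.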

An alternative search method is the following. Choose different values of λ in the interval (0, 1). For each value of λ, run the regression of K̃_t on K̃_{t-1}, S_{t-1}, and L̃_t (with K̃_t and L̃_t computed using that value of λ). The value of λ for which the residual sum of squares is minimized is the best estimate of λ, and the corresponding estimates of δ, β₀, β₁, and β₂ are the desired estimates of the parameters. Actually, we can conduct the search in two stages: first in steps of 0.1, and then in steps of 0.01 around the minimum found in the first stage. We discussed a similar procedure in Section 10.4. These are all examples where an equation that is nonlinear in the parameters can be reduced to an equation that is linear in the parameters, conditional on one of the parameters being given.
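A sketch of this two-stage search is given below; the function names are my own, and K, S, and L are assumed to be NumPy arrays such as the simulated series in the previous sketch.

```python
import numpy as np

def rss_given_lambda(K, S, L, lam_):
    """RSS and coefficients from regressing K-tilde_t on a constant, K-tilde_{t-1},
    S_{t-1}, and L-tilde_t, with the tilde variables built for a fixed lambda."""
    t = np.arange(2, len(K))
    y  = K[t]     - (1 - lam_) * K[t - 1]
    K1 = K[t - 1] - (1 - lam_) * K[t - 2]
    Lt = L[t]     - (1 - lam_) * L[t - 1]
    X = np.column_stack([np.ones(len(t)), K1, S[t - 1], Lt])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ coef) ** 2), coef

def search_lambda(K, S, L):
    """Two-stage grid search: steps of 0.1 over (0, 1), then 0.01 around the first minimum."""
    grid1 = np.arange(0.1, 1.0, 0.1)
    lam1 = min(grid1, key=lambda l: rss_given_lambda(K, S, L, l)[0])
    grid2 = np.arange(max(lam1 - 0.09, 0.01), min(lam1 + 0.09, 0.99), 0.01)
    lam_hat = min(grid2, key=lambda l: rss_given_lambda(K, S, L, l)[0])
    _, c = rss_given_lambda(K, S, L, lam_hat)
    delta_hat = 1 - c[1]
    beta0_hat = c[0] / (delta_hat * lam_hat)
    beta1_hat = c[2] / (delta_hat * lam_hat)
    beta2_hat = c[3] / delta_hat
    return lam_hat, delta_hat, beta0_hat, beta1_hat, beta2_hat

# Applied to the simulated K, S, L from the previous sketch:
# print(search_lambda(K, S, L))
```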

10.8 Alternative Distributed Lag Models: Polynomial Lags

We saw in previous sections that the adaptive expectations model (10.11) implies that the expectation x*_t is a weighted average of x_t and past values of x_t, with geometrically declining weights. We also saw that the partial adjustment model (10.16) implies that y_t is a weighted average of y*_t and past values of y*_t, again with geometrically declining weights. Since the weights on the lagged variables all sum to 1, and they are usually all positive, it is customary to compare these weights to the successive terms of a probability distribution. The weights β_i in equations like (10.7) and (10.8) are said to form a lag distribution. Geometrically declining weights correspond to the geometric distribution. As we mentioned in Section 10.3, this type of lag is also called the Koyck lag, named after L. M. Koyck, who first used it.
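For instance, with an arbitrary illustrative value of λ, the Koyck weights λ(1 - λ)^i for i = 0, 1, 2, ... are all positive, sum to 1, and have mean lag (1 - λ)/λ:

```python
import numpy as np

lam = 0.4                              # arbitrary illustrative value
i = np.arange(50)                      # lag lengths 0, 1, 2, ...
w = lam * (1 - lam) ** i               # geometric (Koyck) lag weights
print(w[:4])                           # 0.4, 0.24, 0.144, 0.0864
print(w.sum())                         # ~1 (exactly 1 as the number of lags grows)
print((i * w).sum(), (1 - lam) / lam)  # mean lag ~ (1 - lambda)/lambda = 1.5
```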



