




EXERCISES 399

8. Examine whether each of the following statements is true (T), false (F), or uncertain (U), and give a short explanation.

(a) In a simultaneous equation system, the more exogenous variables there are, the better.

(b) If the multiple correlations of the reduced-form equations are nearly 1, the OLS and 2SLS estimates of the parameters will be very close to each other.

(c) In the 2SLS method we should replace only the endogenous variables on the right-hand side of the equation by their fitted values from the reduced form. We should not replace the endogenous variable on the left-hand side by its fitted value.

(d) Which variables should be treated as exogenous and which as endogenous cannot be determined from the data.

(e) An estimation of the demand function for steel gave the price elasticity of steel as +0.3. This finding should be interpreted to mean that the price elasticity of supply is at least +0.3.

(f) Any equation can be made identified by deleting enough exogenous variables from the equation or adding enough exogenous variables to the other equations.

(g) Any variable can be endogenous in one equation and exogenous in another equation.

(h) Some simultaneous equation systems can be estimated by ordinary least squares.

(i) If the R² from the 2SLS method is negative or very low and the R² from the OLS method is high, it should be concluded that something is wrong with the specification of the model or the identification of that particular equation.

(j) The R² from the OLS method will always be higher than the R² from the 2SLS method, but this does not mean that the OLS method is better.

(k) In exactly identified equations, the choice of which variable to normalize does not matter.

(l) In exactly identified equations, we can normalize the equation with respect to an exogenous variable as well.

9. Consider the model

Q = β₀ + β₁P_w + β₂Y + u₁        demand function
Q = α₀ + α₁P_w + α₂P_g + u₂      supply function

Estimate this model by 2SLS, instrumental variable, and indirect least squares methods using the data in Table 9.2 (transform all variables to logs). Would you get different results using the three methods?

How would you choose the appropriate normalization (with respect to Q or P_w)? Does normalization matter in this model?

10. Estimate the supply equation in the model of the Australian wine industry in Section 9.5 if it is normalized with respect to P_w.
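Exercise 9 asks whether 2SLS, instrumental variables, and indirect least squares give different answers. A minimal simulated sketch (this is not the Table 9.2 wine data; the system, coefficient values, and variable names here are hypothetical) illustrates the standard result that the three estimators coincide for a just-identified equation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical just-identified demand/supply system:
#   demand: q = 10 - 1.0*p + 0.5*y + u1   (income y is exogenous)
#   supply: q =  2 + 0.8*p        + u2    (y excluded -> just identified)
y = rng.normal(10, 2, n)
u1 = rng.normal(0, 1, n)
u2 = rng.normal(0, 1, n)

# Solve the two structural equations for the equilibrium p and q:
# 10 - p + 0.5*y + u1 = 2 + 0.8*p + u2  =>  p = (8 + 0.5*y + u1 - u2)/1.8
p = (8 + 0.5 * y + u1 - u2) / 1.8
q = 2 + 0.8 * p + u2

def ols(X, z):
    return np.linalg.lstsq(X, z, rcond=None)[0]

one = np.ones(n)

# Indirect least squares: estimate both reduced forms, take the ratio.
pi_p = ols(np.column_stack([one, y]), p)   # reduced form for p
pi_q = ols(np.column_stack([one, y]), q)   # reduced form for q
ils = pi_q[1] / pi_p[1]

# 2SLS: replace p by its reduced-form fitted value, then run OLS.
p_hat = np.column_stack([one, y]) @ pi_p
tsls = ols(np.column_stack([one, p_hat]), q)[1]

# Instrumental variables with y as the instrument for p.
iv = np.cov(y, q)[0, 1] / np.cov(y, p)[0, 1]

print(ils, tsls, iv)  # the three estimates of the supply slope coincide
```

Because the supply equation excludes exactly one exogenous variable, all three methods reduce algebraically to the same ratio of reduced-form coefficients; with an over-identified equation they would generally differ.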



Appendix to Chapter 9

Necessary and Sufficient Conditions for Identification (Section 9.4)

Consider a simultaneous equations model with g endogenous variables and k exogenous variables. In matrix notation we can write the model as

    By_t + Γx_t = u_t        t = 1, 2, . . . , T        (9A.1)

where y_t = a g × 1 vector of observations on the endogenous variables
      x_t = a k × 1 vector of observations on the exogenous variables
      u_t = a g × 1 vector of errors
      B   = a g × g matrix of coefficients of the endogenous variables
      Γ   = a g × k matrix of coefficients of the exogenous variables

In the example given in Section 9.4, g = 7 and k = 3. B is the 7 × 7 matrix consisting of the first seven columns, and Γ is the 7 × 3 matrix consisting of the last three columns.

We assume that the matrix B is nonsingular. Hence we can solve (9A.1) for y_t to get

    y_t = −B⁻¹Γx_t + B⁻¹u_t        (9A.2)
        = Πx_t + v_t

This equation is called the reduced form. Equation (9A.1) is called the structural form. From (9A.2) we have

    Π = −B⁻¹Γ   or   BΠ + Γ = 0   and   v_t = B⁻¹u_t        (9A.3)

We assume that the errors u_t have zero mean, are independent, and have a common covariance matrix E(u_t u_t′) = Σ.

To discuss identification, without any loss of generality, consider the first equation in (9A.1). Let β be the first row of B and γ the first row of Γ. Partition these vectors each into two components corresponding to the included and excluded variables in this equation. We have

    β = [β₁  β₂]   and   γ = [γ₁  γ₂]

β₁ corresponds to the g₁ included and β₂ to the g₂ excluded endogenous variables (g₁ + g₂ = g). Similarly, γ₁ corresponds to the k₁ included and γ₂ to the k₂ excluded exogenous variables (k₁ + k₂ = k). In the first equation in the example in Section 9.4, we have g₁ = 3, g₂ = 4, k₁ = 1, and k₂ = 2.

Now partition the matrices B and Γ also conformably to the partitioning of β and γ. We have

    B = | β₁  0  |        and        Γ = | γ₁  0  |
        | B₁  B₂ |                       | Γ₁  Γ₂ |

Consider the matrix

    D = [B₂  Γ₂]

D is the matrix corresponding to the missing endogenous and exogenous variables. The necessary and sufficient condition for identification, also known as the rank condition, is

    Rank(D) = g − 1

The proof of this proposition follows from noting that if rank [B₂ Γ₂] < g − 1, there will exist a nonnull vector a such that a′[B₂ Γ₂] = [0 0]. In this case we can find a linear combination of the (g − 1) remaining equations, with coefficients given by the elements of a, which, when added to the first equation, results in an equation that "looks like" it. Thus it is not possible to identify the parameters of the first equation.
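The rank condition can be checked mechanically by forming D and computing its rank. Here is a minimal sketch for a hypothetical two-equation demand/supply system (the coefficient values and the variable layout are illustrative assumptions, not taken from the text):

```python
import numpy as np

# Hypothetical system (g = 2 endogenous: q, p; k = 2 exogenous: y, r):
#   demand: q - b1*p - b2*y = u1   (excludes r)
#   supply: q - a1*p - a2*r = u2   (excludes y)
b1, b2, a1, a2 = -1.0, 0.5, 0.8, 0.3

B = np.array([[1.0, -b1],
              [1.0, -a1]])     # coefficients of the endogenous (q, p)
G = np.array([[-b2, 0.0],
              [0.0, -a2]])     # coefficients of the exogenous (y, r)

def rank_condition(B, G, eq, excluded_endog, excluded_exog):
    """Rank of D = [B2 G2]: the columns of the variables excluded from
    equation `eq`, with that equation's own row deleted."""
    rows = [i for i in range(B.shape[0]) if i != eq]
    D = np.hstack([B[np.ix_(rows, excluded_endog)],
                   G[np.ix_(rows, excluded_exog)]])
    return np.linalg.matrix_rank(D)

g = B.shape[0]
# The demand equation (index 0) excludes only r (exogenous column 1).
print(rank_condition(B, G, 0, [], [1]) == g - 1)   # identified iff a2 != 0
```

Setting a2 = 0 would make D a zero matrix of rank 0 < g − 1, reproducing the failure of identification described in the proof above.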

Methods of Estimation (Sections 9.4 and 9.5)

We shall consider the estimation of a single equation by least squares methods. Let the particular equation, consisting of g₁ endogenous variables and k₁ exogenous variables, be written as

    y = Y₁β + X₁γ + u = Z₁δ + u        (9A.4)

where y  = T × 1 vector of observations on the endogenous variable chosen for normalization (i.e., to have coefficient 1)
      Y₁ = T × (g₁ − 1) matrix of observations on the included endogenous variables
      X₁ = T × k₁ matrix of observations on the included exogenous variables
      δ  = [β  γ]′ is the vector of parameters to be estimated
      Z₁ = [Y₁  X₁]
      u  = T × 1 vector of errors

We assume that E(uu′) = σ²I.

Let X be the T × k matrix of observations on all the exogenous variables in the system. Equation (9A.4) is identified by the order condition only if the number of excluded variables is ≥ g − 1; that is, g₂ + k₂ ≥ g − 1. We shall assume that this is satisfied.
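The order-condition count is simple arithmetic; a short sketch, using the counts quoted above for the first equation of the Section 9.4 example (g = 7, k = 3, g₁ = 3, k₁ = 1):

```python
# Order condition for one equation of a simultaneous system:
# the excluded variables (g - g1) + (k - k1) must number at least g - 1.
def order_condition(g, k, g1, k1):
    g2, k2 = g - g1, k - k1
    return g2 + k2 >= g - 1

# First equation of the Section 9.4 example: g2 + k2 = 4 + 2 = 6 >= 6.
print(order_condition(7, 3, 3, 1))  # True: the order condition is met
```

Recall that the order condition is only necessary; the rank condition above is what is necessary and sufficient.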

Since Y₁ and u are correlated, the OLS estimators of the parameters in (9A.4) are not consistent. To get consistent estimates, we use instrumental variables for Y₁. Let us consider Ŷ₁, where Ŷ₁ is the predicted value of Y₁ from the reduced-form equations. Then

    Ŷ₁ = X(X′X)⁻¹X′Y₁        (9A.5)

Also let V̂₁ be the estimated residuals from the reduced form, so that Y₁ = Ŷ₁ + V̂₁. Then we have X′V̂₁ = 0 (the residuals from an OLS estimation are uncorrelated with the regressors). Hence we get
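Equation (9A.5) and the orthogonality X′V̂₁ = 0 are easy to verify numerically. A small sketch with simulated data (the dimensions and names are illustrative, not from the text):

```python
import numpy as np

rng = np.random.default_rng(1)
T, k, g1 = 40, 3, 2

# Hypothetical data: X holds all k exogenous variables; Y1 holds the
# (g1 - 1) included endogenous regressors, generated to depend on X.
X = rng.normal(size=(T, k))
Y1 = X @ rng.normal(size=(k, g1 - 1)) + rng.normal(size=(T, g1 - 1))

# First stage of 2SLS, equation (9A.5): Y1_hat = X (X'X)^{-1} X' Y1
Y1_hat = X @ np.linalg.solve(X.T @ X, X.T @ Y1)
V1 = Y1 - Y1_hat                  # reduced-form residuals

# OLS residuals are orthogonal to the regressors: X'V1 = 0.
print(np.max(np.abs(X.T @ V1)))   # numerically zero
```

This orthogonality is exactly what lets Ŷ₁ serve as an instrument: it is a function of X alone, hence uncorrelated with u in large samples, yet correlated with Y₁.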



