8. In the case of the equation

y = βx + γz + u

where we use a proxy p for z, a question that has often been asked is whether it is better to leave p out completely if we are interested in estimating β. Assuming that p falls in the category of the usual errors in variables, it has been shown that the bias in the estimator for β is smaller if we leave p in the equation. On the basis of this result it has been argued that even a poor proxy is better than none. However, this conclusion does not necessarily hold if we take into account the variances of the estimators, if the proxy is a dummy variable, or if the proxy does not fall in the category of the usual errors in variables. Nor does it hold if x itself is measured with error.

9. If we are interested in the coefficient γ, a question that often arises is whether the use of the proxy p gives us the correct sign for its estimate. A criterion for determining this is given in equation (11.20).

10. If there are many explanatory variables that depend on the same error-ridden variable, sometimes we can use the dependent variables in the other equations as instrumental variables for the estimation of the parameters in each equation. An example of this is given in Section 11.7. Furthermore, all these assumptions about bias are invalid in the presence of correlated errors (see Section 11.7).
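The proxy-variable results summarized above can be illustrated with a small Monte Carlo experiment. The sketch below (all parameter values are illustrative assumptions, not taken from the text) compares the estimator of β when the unobserved z is omitted entirely with the estimator when a poor proxy p is included:

```python
# A minimal Monte Carlo sketch of the proxy-variable claim above: even a
# poor proxy p for the unobserved z reduces the bias in the estimator of
# beta. All parameter values here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
beta, gamma = 1.0, 1.0
n, reps = 1000, 500
b_omit, b_proxy = [], []

for _ in range(reps):
    z = rng.normal(size=n)
    x = 0.5 * z + rng.normal(size=n)      # x correlated with z
    y = beta * x + gamma * z + rng.normal(size=n)
    p = z + 2.0 * rng.normal(size=n)      # poor proxy: large error in p

    # (i) omit z entirely: regress y on x alone
    b_omit.append(np.polyfit(x, y, 1)[0])

    # (ii) include the proxy: regress y on x and p
    X = np.column_stack([np.ones(n), x, p])
    b_proxy.append(np.linalg.lstsq(X, y, rcond=None)[0][1])

# Under these assumed values, plim of (i) is 1.4 and plim of (ii) is 4/3:
# the proxy estimate is still biased, but less so than with z omitted.
print(np.mean(b_omit), np.mean(b_proxy))
```

The gap between the two averages shrinks as the proxy gets better (smaller error in p), consistent with the bias-reduction result; it says nothing, however, about the variances of the two estimators, which is one of the caveats noted above.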

Exercises

1. Explain the meaning of each of the following terms.

(a) Errors-in-variables bias.

(b) Bounds for parameters.

(c) Reverse regression.

(d) Grouping methods.

(e) Proxy variables.

2. Examine whether the following statements are true, false, or uncertain. Give a short explanation. If a statement is not true in general but is true under some conditions, state the conditions.

(a) Errors in variables lead to estimates of the regression coefficients that are biased toward zero.

(b) In errors-in-variables models we can always get two estimators β̂₁ and β̂₂ such that

plim β̂₁ < β < plim β̂₂

Thus, even though we cannot get a confidence interval, we can get bounds for the parameters that will serve the same purpose.

(c) Grouping methods give consistent (although inefficient) estimators for the regression parameters in errors-in-variables models. Since they are very easy to compute, they should be used often.

(d) It is always better to use even a poor proxy than to drop an error-ridden variable from an equation.




(e) If we have an unobserved variable in an equation and we are interested in the sign of its coefficient, we should always use a proxy, since the estimated coefficient of the proxy variable will give us the correct sign.

(f) In regressions of income on schooling where schooling is measured with error, omitting variables that measure ability will overestimate the effect of schooling on income.

(g) In part (f) the bias in the estimator of the effect of schooling on income can be reduced if we include a proxy like test score for ability.

(h) In part (g), if ability is measurable, then we should always include the available measure in the earnings function.

(i) In an analysis of discrimination in salaries an investigator finds that the direct regression shows discrimination, whereas the reverse regression shows reverse discrimination. This proves that there is no evidence of discrimination in salaries.

3. Consider a regression model

y = α₀ + α₁x₁ + α₂x₂ + u

The variable x₁ is not observed, but we use a proxy p for it. Let α̂₁ be the estimator of α₁ obtained from the multiple regression equation. If you are told that the correlation between x₁ and p is at least 0.8, explain how you will determine whether α̂₁ has the correct sign (the same sign as α₁).

4. Consider the regression model

y = β₁x₁ + β₂x₂ + u

x₁ is not observed. The observed value is X₁ = x₁ + e, where e is uncorrelated with x₁, x₂, and u. Let λ = var(e)/var(X₁). Suppose that we drop x₂ from the equation, and let β̂₁ be the OLS estimator of β₁ from a regression of y on X₁ alone. Show that

plim β̂₁ = β₁ - λβ₁ + β₂b₂₁

where b₂₁ is the regression coefficient from a regression of x₂ on X₁. If β̃₁ is the estimator of β₁ from a regression of y on X₁ and x₂, show that

plim β̃₁ = β₁ - λβ₁/(1 - r²)

where r is the correlation between X₁ and x₂. [This is equation (11.4) derived in the text.]

5. In Exercise 4 compute the two probability limits if the true equation is

y = 1.0x₁ + 0.5x₂ + u

with λ = 0.1, var(x₁) = var(x₂) = 9, and r = 0.5. What do you conclude from these results?



Diagnostic Checking, Model Selection, and Specification Testing

12.1 Introduction

12.2 Diagnostic Tests Based on Least Squares Residuals

12.3 Problems with Least Squares Residuals

12.4 Some Other Types of Residuals

12.5 DFFITS and Bounded Influence Estimation

12.6 Model Selection

12.7 Selection of Regressors

12.8 Implied F-Ratios for the Various Criteria

12.9 Cross-Validation

12.10 Hausman's Specification Error Test

12.11 The Plosser-Schwert-White Differencing Test

12.12 Tests for Nonnested Hypotheses

Summary

Exercises

Appendix to Chapter 12


