




2. The illustrative examples given at the end of Section 4.3 show how we can sometimes get "wrong" signs for some of the coefficients and how this can change with the addition or deletion of variables. (We discuss this in more detail in Section 4.10.) The example of gasoline demand represents one where the results obtained were poor; further analysis of the data was left as an exercise.

3. Section 4.6: Simple, Partial, and Multiple Correlations

In multiple regression it is important to note that there is no necessary relationship between the simple correlation between two variables y and x and the partial correlation between these variables after allowing for the effect of other variables (see Table 4.2 for an illustration). There is, however, some relationship between R² and the simple and partial r's. This is given in equation (4.12). Also, partial r² = t²/(t² + d.f.) is a useful relationship. Some examples are given to illustrate these relationships.
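To make the t-ratio relationship concrete, here is a minimal numerical sketch (the simulated data, variable names, and regression setup are my own illustrative assumptions, not from the text) checking that the squared partial correlation of y and x2 given x1 equals t²/(t² + d.f.), where t is the t-ratio of x2's coefficient in the multiple regression.

```python
# Sketch: partial r^2 of y and x2 (holding x1 fixed) equals t^2 / (t^2 + d.f.)
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

# Multiple regression of y on a constant, x1 and x2 by least squares
X = np.column_stack([np.ones(n), x1, x2])
k = X.shape[1]
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta
s2 = resid @ resid / (n - k)                      # estimate of sigma^2
var_beta = s2 * np.linalg.inv(X.T @ X)            # covariance matrix of beta-hat
t_x2 = beta[2] / np.sqrt(var_beta[2, 2])          # t-ratio of x2's coefficient

# Partial r^2 from the t-ratio: t^2 / (t^2 + degrees of freedom)
partial_r2_from_t = t_x2**2 / (t_x2**2 + (n - k))

# Direct definition: correlation between the residuals of y on (1, x1)
# and the residuals of x2 on (1, x1)
Z = np.column_stack([np.ones(n), x1])
ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
rx2 = x2 - Z @ np.linalg.lstsq(Z, x2, rcond=None)[0]
partial_r2_direct = np.corrcoef(ry, rx2)[0, 1] ** 2

print(partial_r2_from_t, partial_r2_direct)      # the two agree
```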

4. Section 4.7: Prediction

In the case of the simple regression model (Section 3.7) the variance of the prediction error increased with the distance of x₀ from x̄. In the case of prediction from the multiple regression model this is not necessarily the case. An example is given to illustrate this point. Again, as in the case of the simple regression model, we can consider prediction of y₀ or prediction of E(y₀). The predicted value will be the same in both cases. However, the variance of the prediction error will be different. In the case of prediction of E(y₀) we have to subtract σ² from the corresponding expression for the prediction of y₀. Note that we did not discuss the prediction of E(y) here, as we did this in the simple regression case (Section 3.7).
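A minimal sketch of the two prediction variances (the simulated data, the new point x₀, and the variable names are illustrative assumptions of mine, not from the text): the point prediction at x₀ is the same whether we predict y₀ or E(y₀), but the prediction-error variance for y₀ is larger by σ².

```python
# Sketch: prediction of y0 vs. prediction of E(y0) in multiple regression
import numpy as np

rng = np.random.default_rng(1)
n = 40
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0, -1.0])
y = X @ beta_true + rng.normal(size=n)

beta = np.linalg.lstsq(X, y, rcond=None)[0]
k = X.shape[1]
s2 = np.sum((y - X @ beta) ** 2) / (n - k)        # estimate of sigma^2
XtX_inv = np.linalg.inv(X.T @ X)

x0 = np.array([1.0, 0.5, -0.2])                   # hypothetical new observation
y0_hat = x0 @ beta                                # same point prediction in both cases

var_pred_Ey0 = s2 * (x0 @ XtX_inv @ x0)           # predicting E(y0)
var_pred_y0 = s2 * (1 + x0 @ XtX_inv @ x0)        # predicting y0 itself: larger by s2

print(y0_hat, var_pred_Ey0, var_pred_y0)
```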

5. Section 4.8: Tests of Hypotheses

Tests of single parameters and single linear functions of parameters will be t-tests. Tests of several parameters and several linear functions of parameters are F-tests. These are both illustrated with examples. Note again that there can be conflicts between the two tests. For instance, the t-statistics for each coefficient can be nonsignificant and yet the F-statistic for a set of coefficients can be significant.
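The following sketch illustrates such a conflict on simulated data of my own construction (highly collinear regressors, not an example from the text): each individual t-statistic can be nonsignificant while the joint F-test is clearly significant.

```python
# Sketch: nonsignificant t-statistics but a significant joint F-statistic
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)               # x2 nearly collinear with x1
y = 1.0 + 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
k = X.shape[1]
beta = np.linalg.lstsq(X, y, rcond=None)[0]
rss_u = np.sum((y - X @ beta) ** 2)               # unrestricted residual sum of squares
s2 = rss_u / (n - k)
se = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))

# Individual t-tests of beta1 = 0 and beta2 = 0
t_stats = beta[1:] / se[1:]
t_pvals = 2 * stats.t.sf(np.abs(t_stats), df=n - k)

# Joint F-test of beta1 = beta2 = 0: the restricted model has only the constant
rss_r = np.sum((y - y.mean()) ** 2)
q = 2                                             # number of restrictions
F = ((rss_r - rss_u) / q) / (rss_u / (n - k))
F_pval = stats.f.sf(F, q, n - k)

print("t p-values:", t_pvals)                     # typically both above 0.05
print("F p-value:", F_pval)                       # typically far below 0.05
```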

6. Section 4.9: Omitted Variables and Irrelevant Variables

The omission of relevant variables produces biased estimates. Expressions are given in equations (4.16)-(4.18) for the omitted-variable bias. The variance of the estimated coefficient will be smaller, although the estimated variance (or standard error) need not be. These points are illustrated with examples. The case with the inclusion of irrelevant variables is different. There is no bias. However, the variance of the estimated coefficients increases. Thus we get unbiased but inefficient estimators. These are all only statistical guidelines regarding the omission of relevant and the inclusion of irrelevant variables.
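A small simulation sketch of these two cases (the data-generating process, coefficients, and variable names are assumptions of mine, not the book's example): omitting a relevant, correlated regressor biases the retained coefficient, while including an irrelevant regressor leaves it unbiased but inflates its sampling variance.

```python
# Sketch: omitted-variable bias vs. inclusion of an irrelevant variable
import numpy as np

rng = np.random.default_rng(3)
n, reps = 50, 2000

def coef_on_x1(y, cols):
    """Least squares coefficient on x1, with a constant and the given columns."""
    X = np.column_stack([np.ones(len(y))] + cols)
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

b1_short, b1_full, b1_extra = [], [], []
for _ in range(reps):
    x1 = rng.normal(size=n)
    x2 = 0.7 * x1 + rng.normal(size=n)            # relevant, correlated with x1
    x3 = 0.8 * x1 + rng.normal(size=n)            # irrelevant (true coefficient 0)
    y = 1.0 + 2.0 * x1 + 1.5 * x2 + rng.normal(size=n)

    b1_short.append(coef_on_x1(y, [x1]))          # x2 omitted: biased
    b1_full.append(coef_on_x1(y, [x1, x2]))       # correct model: unbiased
    b1_extra.append(coef_on_x1(y, [x1, x2, x3]))  # x3 included: unbiased, noisier

print("x2 omitted:     mean b1 =", np.mean(b1_short))                       # biased away from 2.0
print("correct model:  mean b1 =", np.mean(b1_full), " var =", np.var(b1_full))
print("x3 included:    mean b1 =", np.mean(b1_extra), " var =", np.var(b1_extra))
```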

7. Section 4.10: Degrees of Freedom and R̄²

The addition of explanatory variables always increases R². This does not mean that the regression equation is improving. The appropriate thing to look at is the estimate of the error variance. An equivalent measure is R̄², the value of R² adjusted for the loss in degrees of freedom due to the addition of more explanatory variables. It is given by equation (4.20). A procedure usually followed is to keep adding variables until R̄² stops increasing. Apart from the lack of any economic rationale, there are some pitfalls in this procedure. R̄² might increase with the addition (or deletion) of two or more variables even though it would not if one variable were added (or dropped) at a time. Some rules are given for the prediction of sign changes in the estimates of the coefficients of the retained variables when a variable is deleted. Although maximization of R̄² and mechanical deletion of nonsignificant variables have serious pitfalls, these rules provide some useful predictions.
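A minimal sketch, assuming the usual form of equation (4.20), R̄² = 1 − (1 − R²)(n − 1)/(n − k), on simulated data of my own: adding a pure-noise regressor raises R² but typically lowers R̄² (and raises the estimated error variance).

```python
# Sketch: R^2 never falls when a regressor is added, but adjusted R^2 can fall
import numpy as np

def r2_and_adjusted(y, X):
    n, k = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    rss = np.sum((y - X @ beta) ** 2)
    tss = np.sum((y - y.mean()) ** 2)
    r2 = 1 - rss / tss
    r2_adj = 1 - (1 - r2) * (n - 1) / (n - k)     # adjustment for degrees of freedom
    return r2, r2_adj

rng = np.random.default_rng(4)
n = 30
x1 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

X_small = np.column_stack([np.ones(n), x1])
X_big = np.column_stack([X_small, rng.normal(size=n)])   # add an irrelevant regressor

print(r2_and_adjusted(y, X_small))   # (R^2, adjusted R^2) for the smaller model
print(r2_and_adjusted(y, X_big))     # R^2 is never lower; adjusted R^2 typically falls
```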

8. Section 4.11: Tests for Stability

In multiple regression analysis we are often concerned with the stability of the estimated relationships across two samples of sizes n₁ and n₂. We discuss and illustrate two tests: the analysis-of-variance test (AV test) and the predictive test (Chow test). In practice it is desirable to use both tests. If either n₁ or n₂ is not greater than the number of regression parameters estimated, the AV test cannot be used but the Chow test can be. However, in this case the Chow test is not a test for stability. It is merely a test for unbiasedness of predictions.
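A sketch of both tests on simulated data (the samples, sizes, and coefficients are illustrative assumptions of mine; the statistics are the standard AV and predictive Chow formulas referred to in the section).

```python
# Sketch: analysis-of-variance (AV) test and predictive (Chow) test for stability
import numpy as np
from scipy import stats

def rss(y, X):
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    return np.sum((y - X @ beta) ** 2)

rng = np.random.default_rng(5)
n1, n2, k = 30, 20, 2
x1, x2 = rng.normal(size=n1), rng.normal(size=n2)
y1 = 1.0 + 2.0 * x1 + rng.normal(size=n1)
y2 = 1.5 + 1.2 * x2 + rng.normal(size=n2)          # relationship shifts in sample 2

X1 = np.column_stack([np.ones(n1), x1])
X2 = np.column_stack([np.ones(n2), x2])
Xp = np.vstack([X1, X2])
yp = np.concatenate([y1, y2])

rss1, rss2, rssp = rss(y1, X1), rss(y2, X2), rss(yp, Xp)

# AV test: requires both n1 > k and n2 > k
F_av = ((rssp - rss1 - rss2) / k) / ((rss1 + rss2) / (n1 + n2 - 2 * k))
p_av = stats.f.sf(F_av, k, n1 + n2 - 2 * k)

# Predictive (Chow) test: usable even when n2 <= k
F_pred = ((rssp - rss1) / n2) / (rss1 / (n1 - k))
p_pred = stats.f.sf(F_pred, n2, n1 - k)

print("AV test:         F =", F_av, " p =", p_av)
print("Predictive test: F =", F_pred, " p =", p_pred)
```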



Exercises

More difficult exercises are marked with an *.

1. Define the following terms.

(a) Standard error of the regression.

(b) R² and R̄².

(c) Partial correlations.

(d) Tests for stability.

(e) Degrees of freedom.

(f) Linear functions of parameters.

(g) Nested and nonnested hypotheses.

(h) Analysis of variance.

2. In a multiple regression equation, show how you can obtain the partial r's given the t-ratios for the different coefficients.

3. In the multiple regression equation

y = α + β₁x₁ + β₂x₂ + β₃x₃ + u

explain how you will test the joint hypothesis β₁ = β₂ and β₃ = 1.

4. The following regression equation is estimated as a production function.

log Q = 1.37 + 0.632 log K + 0.452 log L        R² = 0.98
               (0.257)       (0.219)

cov(b_K, b_L) = -0.044, where b_K and b_L are the estimated coefficients of log K and log L. The sample size is 40. Test the following hypotheses at the 5% level of significance.

(a) b_K = b_L.

(b) There are constant returns to scale.

5. Indicate whether each of the following statements is true (T), false (F), or uncertain (U), and give a brief explanation or proof.

(a) Suppose that the coefficient of a variable in a regression equation is significantly different from zero at the 20% level. If we drop this variable from the regression, both R² and R̄² will necessarily decrease.

(b) Compared with the unconstrained regression, estimation of a least squares regression under a constraint (say, a given value for β₂) will result in a higher R² if the constraint is true and a lower R² if it is false.

(c) In a least squares regression of y on x, observations for which x is far from its mean will have more effect on the estimated slope than observations for which x is close to its mean value.

6. The following estimated equation was obtained by ordinary least squares regression using quarterly data for 1960 to 1979 inclusive (n = 80).

yₜ = 2.20 + 0.104x₁ₜ + 3.48x₂ₜ + 0.34x₃ₜ
    (3.4)   (0.005)    (2.2)    (0.15)

Standard errors are in parentheses, the explained sum of squares was 112.5, and the residual sum of squares was 19.5.

(a) Which of the slope coefficients are significantly different from zero at the 5% significance level?

(b) Calculate the value of R² for this regression.

(c) Calculate the value of R̄² ("adjusted R²").

7. Suppose that you are given two sets of samples with the following information:

          Sample 1    Sample 2
            = 20        = 25
            = 20        = 23
            = 25        = 28
            = 80        = 100
            = 120       = 150
            = 200       = 250


