
Predictive Test for Stability

If n₂ < k, the regression equation cannot be estimated with the n₂ observations. In this case the predictive test for stability is to use

    F = [(RRSS − RSS₁)/n₂] / [RSS₁/(n₁ − k)]

which has an F-distribution with d.f. n₂ and (n₁ − k). We have already derived the expressions needed to prove this. We have shown that (N − N₁)N₁ = 0, and that N₁ is idempotent and of rank (n₁ − k). Hence

    (N − N₁)(N − N₁) = NN − NN₁ − N₁N + N₁N₁ = N − N₁ − N₁ + N₁ = N − N₁

Thus N − N₁ is idempotent, with Rank(N − N₁) = Tr(N − N₁) = (n − k) − (n₁ − k) = n₂. Since

    RRSS − RSS₁ = u′(N − N₁)u  and  RSS₁ = u′N₁u

the required result follows.

Omitted Variables and Irrelevant Variables (Section 4.9)

Suppose that the true model is

    y = Xβ + u  (X is an n × k matrix)

Instead, we estimate

    y = Zδ + v  (Z is an n × r matrix)

r can be less than, equal to, or greater than k, and the variables in Z may include some of the variables in X. We then have

    δ̂ = (Z′Z)⁻¹Z′y = (Z′Z)⁻¹Z′(Xβ + u) = Pβ + (Z′Z)⁻¹Z′u

Since E(u) = 0, we have E(δ̂) = Pβ, where P = (Z′Z)⁻¹Z′X is the matrix of regression coefficients of the variables X in the true model on the variables Z in the misspecified model. As an example, suppose that the true equation is

    y = β₁x₁ + β₂x₂ + u

Instead, we estimate

    y = δ₁x₁ + δ₂x₃ + v

Then P is obtained by regressing each of x₁ and x₂ on x₁ and x₃. The regression of x₁ on x₁ and x₃ gives coefficients 1 and 0. The regression of x₂ on x₁ and x₃ gives coefficients (say) b₂₁ and b₂₃. These regressions are known as the auxiliary regressions. Hence we get

    P = | 1  b₂₁ |
        | 0  b₂₃ |

or E(δ̂₁) = β₁ + b₂₁β₂ and E(δ̂₂) = b₂₃β₂.

Now suppose that Z includes irrelevant variables, so that the true equation is

    y = X₁β₁ + u

and the misspecified equation is

    y = X₁β₁ + X₂β₂ + v

In this case the matrix P consists of the regression coefficients of X₁ on X₁ and X₂. These auxiliary regressions give the coefficients I and 0, so E(β̂₁) = β₁ and E(β̂₂) = 0. Thus even if some "irrelevant" variables are included, we get unbiased estimates for the coefficients of the "relevant" variables.

Prior Adjustment (Section 4.4)

Consider the multiple regression model

    y = X₁β₁ + X₂β₂ + u

Let β̂₁ be the estimator of β₁ from this equation. Suppose that instead of this, we adjust both y and X₁ by removing the effect of X₂ on these variables. Let the residuals from a regression of y on X₂ be denoted by y* and the residuals from a regression of X₁ on X₂ be denoted by X₁*. Now regress the adjusted y* on the adjusted X₁*, and let this regression coefficient be b. We shall show that

    b = β̂₁

That is, if we want to remove the effect of X₂ on y and X₁ before running a regression on the adjusted variables, we can get the same result by including X₂ as an additional explanatory variable in the regression of y on X₁. Usually, X₂ is a trend variable or a set of seasonal variables.

Proof: Let N = I − X₂(X₂′X₂)⁻¹X₂′. Then, as we showed earlier, y* = Ny and X₁* = NX₁. Hence (using N′ = N and NN = N)

    b = (X₁*′X₁*)⁻¹(X₁*′y*) = (X₁′NX₁)⁻¹X₁′Ny

We have to show that we get the same expression for β̂₁. The normal equations (X′X)β̂ = X′y can be written as

    X₁′X₁β̂₁ + X₁′X₂β̂₂ = X₁′y
    X₂′X₁β̂₁ + X₂′X₂β̂₂ = X₂′y

The second equation gives

    β̂₂ = (X₂′X₂)⁻¹[X₂′y − X₂′X₁β̂₁]

Substituting this into the first, we get

    X₁′X₁β̂₁ + X₁′X₂(X₂′X₂)⁻¹[X₂′y − X₂′X₁β̂₁] = X₁′y

or (X₁′NX₁)β̂₁ = X₁′Ny. Thus β̂₁ = b.

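The prior-adjustment result b = β̂₁ (the Frisch–Waugh result) is easy to verify numerically. In this sketch X₂ is a constant plus a linear trend; the sample size, coefficient values, and seed are illustrative assumptions, not from the text:

```python
import numpy as np
from numpy.linalg import lstsq

rng = np.random.default_rng(0)
n = 120
X1 = rng.normal(size=(n, 2))                 # regressors of interest
trend = np.arange(n, dtype=float)
X2 = np.column_stack([np.ones(n), trend])    # "prior adjustment" variables: constant and trend
y = X1 @ np.array([1.5, -0.7]) + 0.03 * trend + rng.normal(size=n)

# Full regression of y on (X1, X2); keep the coefficients on X1
beta1_hat = lstsq(np.column_stack([X1, X2]), y, rcond=None)[0][:2]

# Prior adjustment: N = I - X2 (X2'X2)^{-1} X2'; regress the residuals Ny on NX1
N = np.eye(n) - X2 @ np.linalg.solve(X2.T @ X2, X2.T)
b = lstsq(N @ X1, N @ y, rcond=None)[0]

print(beta1_hat)
print(b)   # the same vector, up to rounding error
```

The two coefficient vectors agree to machine precision, which is exactly the algebraic identity (X₁′NX₁)β̂₁ = X₁′Ny proved above.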

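The result E(δ̂) = Pβ can also be checked by simulation. The design below (a true model with regressors x₁ and x₂, a misspecified regression on x₁ and x₃, and the coefficient values and seed) is an illustrative assumption:

```python
import numpy as np
from numpy.linalg import lstsq

rng = np.random.default_rng(0)
n = 200
# True model: y = 2*x1 + 3*x2 + u.  The misspecified model regresses y on x1 and x3.
x1 = rng.normal(size=n)
x3 = rng.normal(size=n)
x2 = 0.5 * x1 + 0.8 * x3 + rng.normal(size=n)   # omitted variable, correlated with x1 and x3
X = np.column_stack([x1, x2])                    # true regressors
Z = np.column_stack([x1, x3])                    # regressors actually used
beta = np.array([2.0, 3.0])

# P = (Z'Z)^{-1} Z'X: the auxiliary regressions of each true regressor on Z.
# Its first column (regressing x1 on x1 and x3) is exactly (1, 0), as in the text.
P = lstsq(Z, X, rcond=None)[0]

# Average the misspecified estimates delta_hat over repeated draws of u
reps = 2000
draws = np.empty((reps, 2))
for r in range(reps):
    y = X @ beta + rng.normal(size=n)
    draws[r] = lstsq(Z, y, rcond=None)[0]

print(draws.mean(axis=0))
print(P @ beta)   # the simulated mean is close to P @ beta
```

The simulated mean of δ̂ matches Pβ, not β: the bias in each coefficient is governed by the auxiliary-regression coefficients, as derived above.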

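Finally, the predictive test statistic can be computed directly from the two residual sums of squares. The data here are simulated under the null of stability; the sample sizes, design matrix, and seed are illustrative assumptions:

```python
import numpy as np
from numpy.linalg import lstsq

rng = np.random.default_rng(0)
n1, n2, k = 40, 3, 5   # n2 < k: the second subsample alone cannot support a regression
X = rng.normal(size=(n1 + n2, k))
y = X @ np.ones(k) + rng.normal(size=n1 + n2)   # same coefficients throughout, so H0 is true

def rss(Xm, ym):
    """Residual sum of squares from an OLS regression of ym on Xm."""
    b = lstsq(Xm, ym, rcond=None)[0]
    e = ym - Xm @ b
    return e @ e

RRSS = rss(X, y)              # restricted: one coefficient vector for all n1 + n2 observations
RSS1 = rss(X[:n1], y[:n1])    # first n1 observations only

F = ((RRSS - RSS1) / n2) / (RSS1 / (n1 - k))
print(F)   # under H0, a draw from the F(n2, n1 - k) distribution
```

Comparing F with the upper percentile of the F(n₂, n₁ − k) distribution completes the test; RRSS ≥ RSS₁ always holds because RRSS − RSS₁ = u′(N − N₁)u with N − N₁ idempotent.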
Data Sets

Table 4.7 Sale Prices of Rural Land

Price
 5,556    1.00    12.1     36.0
 5,236    1.00    12.1     38.2
 5,952    1.00    12.0     21.0
 7,000    0.00    16.0     40.0
 3,750    0.00    15.5     40.0
 7,000    0.00    13.7     20.0
 5,952    0.00    14.5     21.0
 2,009    0.00    16.1    656.0
 2,583    1.00    15.2     60.0
 2,449    0.00    15.5    156.0
 2,500    0.50    15.2     40.0
 3,000    0.00    15.5     13.0
 3,704    0.00    13.5     27.0
 3,500    0.00    15.5     10.0
 3,500    0.00    17.5     20.0
 4,537    1.00    18.0     38.0
 3,700    0.00    17.2
 2,020    1.00    34.2     22.0
 5,000    0.00    11.1
 4,764    0.00    14.2    237.6
          1.00    14.2    237.6
 3,500    1.00    11.1     20.0
15,200    1.00    14.7
 4,767    0.00    12.1     30.0
16,316    1.00    14.8
 9,873    1.00    14.8
 5,175    0.25    14.2     40.0
 3,977    0.00    11.4
 5,500    0.20    18.5     10.0
 7,500    0.00    16.5
 4,545    1.00    16.8     97.0
 3,765    0.72    18.7    178.0
 5,000    1.00    18.4     10.3
 3,300    0.00    16.2    525.7
 5,500    0.00    18.0
 5,172    0.00    15.0     29.0
 3,571    0.00    15.1     21.0
 4,000    0.00    18.2     10.0
 4,000    0.00    18.4     15.0


