
Tests of Significance

We shall now add the assumption that $u \sim N_n(0, \sigma^2 I)$. We have, from (4A.3),

$$X(\hat\beta - \beta) = X(X'X)^{-1}X'u = Mu$$

and the estimated residual $\hat u$ is given by

$$\hat u = y - X\hat\beta = X\beta + u - X\hat\beta = [I - X(X'X)^{-1}X']u = Nu$$

where $M = X(X'X)^{-1}X'$ and $N = I - M$. We shall use the properties of idempotent matrices and the $\chi^2$-distribution stated in the Appendix to Chapter 2.

1. It can easily be verified that $M^2 = M$ and $N^2 = N$. Thus $M$ and $N$ are idempotent matrices. Also, $MN = 0$.

2. Since $M$ is idempotent, $\mathrm{Rank}(M) = \mathrm{Tr}(M)$. Using the result $\mathrm{Tr}(AB) = \mathrm{Tr}(BA)$, we get

$$\mathrm{Tr}(M) = \mathrm{Tr}\,X(X'X)^{-1}X' = \mathrm{Tr}\,(X'X)^{-1}(X'X) = \mathrm{Tr}(I_k) = k$$

Thus $\mathrm{Rank}(M) = k$. Similarly, $\mathrm{Rank}(N) = n - k$.
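These properties of $M$ and $N$ can be checked numerically. The following sketch uses an arbitrary simulated $X$ (the dimensions $n$ and $k$ are illustrative choices, not from the text) to verify idempotency, $MN = 0$, and the trace results:

```python
import numpy as np

# Numerical check of the properties of M = X(X'X)^{-1}X' and N = I - M,
# using an arbitrary simulated X (n and k are illustrative choices).
rng = np.random.default_rng(0)
n, k = 20, 3
X = rng.normal(size=(n, k))

M = X @ np.linalg.inv(X.T @ X) @ X.T   # projection onto the column space of X
N = np.eye(n) - M                      # projection onto its orthogonal complement

assert np.allclose(M @ M, M)           # M is idempotent
assert np.allclose(N @ N, N)           # N is idempotent
assert np.allclose(M @ N, np.zeros((n, n)))  # MN = 0
assert np.isclose(np.trace(M), k)      # Tr(M) = k, hence Rank(M) = k
assert np.isclose(np.trace(N), n - k)  # Tr(N) = n - k
```

The trace identities hold for any full-column-rank $X$; changing the seed or the dimensions does not affect the assertions.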

3. Hence $(1/\sigma^2)u'Mu$ and $(1/\sigma^2)u'Nu$ have independent $\chi^2$-distributions with d.f. $k$ and $(n - k)$, respectively.

4. Now the residual sum of squares

$$\hat u'\hat u = (y - X\hat\beta)'(y - X\hat\beta) = u'N'Nu = u'Nu$$

is independent of the regression sum of squares

$$(\hat\beta - \beta)'X'X(\hat\beta - \beta) = u'M'Mu = u'Mu$$

Hence

$$\frac{(\text{regression S.S.})/k}{(\text{residual S.S.})/(n - k)}$$

has an F-distribution with degrees of freedom $k$ and $(n - k)$.
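Because the derivation works with the true $\beta$, the F-ratio above can be simulated directly. This sketch (simulated data; the variable names and dimensions are assumptions, not from the text) forms both quadratic forms and the ratio:

```python
import numpy as np

# Simulate y = X beta + u with a known beta, then form the F-ratio
# (regression S.S./k) / (residual S.S./(n - k)) from the quadratic forms.
rng = np.random.default_rng(1)
n, k = 200, 3
X = rng.normal(size=(n, k))
beta = np.array([1.0, -0.5, 2.0])      # the true beta, known by construction
u = rng.normal(size=n)
y = X @ beta + u

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)           # OLS estimate
d = beta_hat - beta
regression_ss = d @ (X.T @ X) @ d                      # u'Mu
residual_ss = (y - X @ beta_hat) @ (y - X @ beta_hat)  # u'Nu
F = (regression_ss / k) / (residual_ss / (n - k))
# Under normal errors, F follows an F-distribution with (k, n - k) d.f.
```

Repeating the simulation many times and comparing the empirical distribution of `F` with the F(k, n − k) density is a standard way to check the result.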

$$V(\hat\beta) = E(\hat\beta - \beta)(\hat\beta - \beta)' = (X'X)^{-1}X'E(uu')X(X'X)^{-1} = (X'X)^{-1}\sigma^2 \quad \text{since } E(uu') = I\sigma^2$$

Thus $\hat\beta$ is unbiased and has the covariance matrix $(X'X)^{-1}\sigma^2$.

Now how do we show that this is minimum variance? Any other linear estimator must be of the form $\beta^* = \hat\beta + Cy$. Then $\beta^* = \beta + CX\beta + [(X'X)^{-1}X' + C]u$. Hence $E(\beta^*) = \beta + CX\beta$. But if $\beta^*$ is to be an unbiased estimator for all values of $\beta$, we should have $CX = 0$.

$$V(\beta^*) = E(\beta^* - \beta)(\beta^* - \beta)' = [(X'X)^{-1}X' + C]E(uu')[(X'X)^{-1}X' + C]'$$

Since $E(uu') = I\sigma^2$ and $CX = 0$, this gives $V(\beta^*) = (X'X)^{-1}\sigma^2 + (CC')\sigma^2$. Hence $V(\beta^*) \geq V(\hat\beta)$. Thus $\hat\beta$ is BLUE. Note that to prove that $\hat\beta$ is BLUE, we did not assume that the errors $u_i$ are normal. But to derive the sampling distribution of $\hat\beta$ we have to assume normality.



$$F = \frac{R^2/k}{(1 - R^2)/(n - k)}$$

Note that in equation (4.14) we have $(n - k - 1)$ instead of $(n - k)$ because there is also a constant term in addition to the $\beta$'s. We can then consider $\beta$ to be a $(k + 1)$-vector, and the matrix $X$ to be $n \times (k + 1)$, with the first column of $X$ consisting of all elements $= 1$.
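As a numerical sketch of the $R^2$ form of the F-test with an intercept (the data and variable names below are illustrative assumptions), one can verify that the F-ratio computed from sums of squares equals the one computed from $R^2$ with $k$ and $n - k - 1$ degrees of freedom:

```python
import numpy as np

# Fit an OLS regression with an intercept and compare the two forms of F.
rng = np.random.default_rng(4)
n, k = 50, 3                            # n observations, k slope coefficients
Z = rng.normal(size=(n, k))
y = 1.0 + Z @ np.array([0.5, -1.0, 2.0]) + rng.normal(size=n)

X = np.column_stack([np.ones(n), Z])    # first column of X all 1's
b, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ b
rss = e @ e                             # residual sum of squares = Syy(1 - R^2)
syy = np.sum((y - y.mean()) ** 2)       # total sum of squares Syy
r2 = 1.0 - rss / syy

F_ss = ((syy - rss) / k) / (rss / (n - k - 1))        # from sums of squares
F_r2 = (r2 / k) / ((1.0 - r2) / (n - k - 1))          # from R^2
assert np.isclose(F_ss, F_r2)           # the two forms agree algebraically
```

The agreement is exact algebra, not an approximation: regression S.S. $= S_{yy}R^2$ and residual S.S. $= S_{yy}(1 - R^2)$, so $S_{yy}$ cancels in the ratio.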

Tests for Stability

We shall derive the analysis of variance and predictive tests for stability discussed in Section 4.11. Let us write

$$y_1 = X_1\beta_1 + u_1 \quad \text{for the first } n_1 \text{ observations}$$
$$y_2 = X_2\beta_2 + u_2 \quad \text{for the second } n_2 \text{ observations}$$

Write

$$y = \begin{bmatrix} y_1 \\ y_2 \end{bmatrix} \qquad X = \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} \qquad u = \begin{bmatrix} u_1 \\ u_2 \end{bmatrix}$$

We assume the errors to be $IN(0, \sigma^2)$ in both the equations. If $\beta_1 = \beta_2 = \beta$, we estimate the pooled regression equation

$$y = X\beta + u \quad \text{for the } n = (n_1 + n_2) \text{ observations}$$

Let $\mathrm{RSS}_1$ and $\mathrm{RSS}_2$ be the residual sums of squares from the two separate regressions and RRSS be the residual sum of squares from the pooled regression. (It is called "restricted" because of the restriction $\beta_1 = \beta_2$.) We shall denote $(\mathrm{RSS}_1 + \mathrm{RSS}_2)$ by URSS (unrestricted residual sum of squares). We have to show that

$$\frac{(\mathrm{RRSS} - \mathrm{URSS})/k}{\mathrm{URSS}/(n - 2k)}$$

has an F-distribution with d.f. $k$ and $n - 2k$. Define

$$N_1 = I_1 - X_1(X_1'X_1)^{-1}X_1'$$
$$N_2 = I_2 - X_2(X_2'X_2)^{-1}X_2'$$
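A sketch of this stability (Chow) test with simulated data follows; the names and dimensions are illustrative assumptions, and the two samples are generated with the same $\beta$, so the restriction $\beta_1 = \beta_2$ holds by construction:

```python
import numpy as np

def rss(y, X):
    """Residual sum of squares from an OLS regression of y on X."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b
    return e @ e

rng = np.random.default_rng(2)
n1, n2, k = 40, 35, 3
X1 = rng.normal(size=(n1, k))
X2 = rng.normal(size=(n2, k))
beta = np.array([1.0, 2.0, -1.0])       # same beta in both samples
y1 = X1 @ beta + rng.normal(size=n1)
y2 = X2 @ beta + rng.normal(size=n2)

RSS1, RSS2 = rss(y1, X1), rss(y2, X2)
URSS = RSS1 + RSS2                      # unrestricted residual sum of squares
RRSS = rss(np.concatenate([y1, y2]),    # pooled (restricted) regression
           np.vstack([X1, X2]))
n = n1 + n2
F = ((RRSS - URSS) / k) / (URSS / (n - 2 * k))   # F with (k, n - 2k) d.f.
```

RRSS can never be smaller than URSS, since the pooled fit is the unrestricted fit with the constraint $\beta_1 = \beta_2$ imposed; a large F signals that the constraint is costly, i.e. instability.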

This result can be used to construct confidence regions for $\beta$ and also to apply any tests of significance. To test the hypothesis $\beta = \beta_0$, we substitute the value $\beta_0$ for $\beta$ in the test statistic above and use it as an F-variate. Whether or not the hypothesis is true, the distribution of the denominator does not depend on $\beta$ and is always a $\chi^2$-distribution. The numerator has a $\chi^2$-distribution only when the null hypothesis is true. When it is false, it has a noncentral $\chi^2$-distribution, and this is used to find the power of the test.

Since the regression sum of squares $= S_{yy}R^2$ and the residual sum of squares $= S_{yy}(1 - R^2)$, we can also write the F-test as

$$F = \frac{R^2/k}{(1 - R^2)/(n - k)} = \frac{R^2(n - k)}{(1 - R^2)k}$$



where $I_1$ and $I_2$ are identity matrices of orders $n_1$ and $n_2$, respectively. Then $\mathrm{RSS}_1 = u_1'N_1u_1$ and $\mathrm{RSS}_2 = u_2'N_2u_2$. If we define

$$N_1^* = \begin{bmatrix} N_1 & 0 \\ 0 & 0 \end{bmatrix} \quad \text{and} \quad N_2^* = \begin{bmatrix} 0 & 0 \\ 0 & N_2 \end{bmatrix}$$

as two $n \times n$ matrices, we can write $\mathrm{RSS}_1 = u'N_1^*u$ and $\mathrm{RSS}_2 = u'N_2^*u$. Note that $N_1^*N_2^* = 0$. Also, $\mathrm{RRSS} = u'Nu$, where $N = I - X(X'X)^{-1}X'$. We can write

$$N = I - \begin{bmatrix} X_1 \\ X_2 \end{bmatrix}(X'X)^{-1}\begin{bmatrix} X_1' & X_2' \end{bmatrix} = \begin{bmatrix} N_{11} & N_{12} \\ N_{21} & N_{22} \end{bmatrix}$$

where

$$N_{11} = I_1 - X_1(X'X)^{-1}X_1' \qquad N_{12} = -X_1(X'X)^{-1}X_2'$$
$$N_{21} = -X_2(X'X)^{-1}X_1' \qquad N_{22} = I_2 - X_2(X'X)^{-1}X_2'$$

Define $N^* = N_1^* + N_2^*$ so that we have $\mathrm{URSS} = u'N^*u$ and $\mathrm{RRSS} = u'Nu$. We shall show that:

1. $(N - N^*)$ and $N^*$ are both idempotent.

2. $(N - N^*) \cdot N^* = 0$.

3. $\mathrm{Tr}(N^*) = n - 2k$ and $\mathrm{Tr}(N - N^*) = k$.

4. Hence $(\mathrm{RRSS} - \mathrm{URSS})/\sigma^2$ and $\mathrm{URSS}/\sigma^2$ have independent $\chi^2$-distributions with d.f. $k$ and $n - 2k$, respectively. From this the required F-ratio follows.

Proof: Since $N_1^*$ and $N_2^*$ are both idempotent, $N^*$ is easily seen to be idempotent. If we prove (2), it is easy to show that $N - N^*$ is idempotent. Hence we shall prove result (2). We have

$$(N - N_1^*)N_1^* = \begin{bmatrix} N_{11} - N_1 & N_{12} \\ N_{21} & N_{22} \end{bmatrix}\begin{bmatrix} N_1 & 0 \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} (N_{11} - N_1)N_1 & 0 \\ N_{21}N_1 & 0 \end{bmatrix}$$

Since $X_1'N_1 = 0$, we have $N_{21}N_1 = 0$ and $N_{11}N_1 = N_1$. Since $N_1$ is idempotent, it follows that $(N_{11} - N_1)N_1 = 0$. Thus $(N - N_1^*)N_1^* = 0$, or $NN_1^* = N_1^*$. Similarly, $(N - N_2^*)N_2^* = 0$, or $NN_2^* = N_2^*$. Hence it follows that $(N - N_1^* - N_2^*)(N_1^* + N_2^*) = 0$, or $(N - N^*)N^* = 0$.

Also,

$$\mathrm{Tr}(N) = n - k$$
$$\mathrm{Tr}(N^*) = \mathrm{Tr}(N_1^*) + \mathrm{Tr}(N_2^*) = (n_1 - k) + (n_2 - k) = n - 2k$$

Hence $\mathrm{Tr}(N - N^*) = \mathrm{Tr}(N) - \mathrm{Tr}(N^*) = k$. The rest follows from the relationship between idempotent matrices and the $\chi^2$-distribution.
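The idempotency and trace results used in this proof can also be verified numerically. This sketch builds $N$ and $N^*$ from simulated $X_1$ and $X_2$ (illustrative dimensions) and checks results (1)-(3):

```python
import numpy as np

def resid_maker(Z):
    """The matrix I - Z(Z'Z)^{-1}Z' for a full-column-rank Z."""
    m = Z.shape[0]
    return np.eye(m) - Z @ np.linalg.inv(Z.T @ Z) @ Z.T

rng = np.random.default_rng(3)
n1, n2, k = 10, 12, 3
n = n1 + n2
X1 = rng.normal(size=(n1, k))
X2 = rng.normal(size=(n2, k))
X = np.vstack([X1, X2])

N = resid_maker(X)                     # N = I - X(X'X)^{-1}X'
N_star = np.zeros((n, n))              # N* = N1* + N2* (block diagonal)
N_star[:n1, :n1] = resid_maker(X1)
N_star[n1:, n1:] = resid_maker(X2)

assert np.allclose(N_star @ N_star, N_star)                 # N* idempotent
assert np.allclose((N - N_star) @ (N - N_star), N - N_star) # N - N* idempotent
assert np.allclose((N - N_star) @ N_star, 0)                # (N - N*)N* = 0
assert np.isclose(np.trace(N_star), n - 2 * k)              # Tr(N*) = n - 2k
assert np.isclose(np.trace(N - N_star), k)                  # Tr(N - N*) = k
```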


