
2. Note that $S_{11}(1 - r_{12}^2)$ is the residual sum of squares from a regression of $x_1$ on $x_2$. Similarly, $S_{22}(1 - r_{12}^2)$ is the residual sum of squares from a regression of $x_2$ on $x_1$. We can now see the analogy with the expression for the variance in the case of simple regression, $\operatorname{var}(\hat\beta_1) = \sigma^2/\mathrm{RSS}_1$, where $\mathrm{RSS}_1$ is the residual sum of squares after regressing $x_1$ on the other variable, that is, after removing the effect of the other variable. This result generalizes to the case of several explanatory variables: in that case $\mathrm{RSS}_i$ is the residual sum of squares from a regression of $x_i$ on all the other $x$'s.
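As a numerical check on this note, here is a minimal Python sketch. The moment values $S_{11} = S_{22} = 12$, $S_{12} = 8$ and the estimate $\hat\sigma^2 = 0.07$ come from the illustrative example below; the variable names are my own.

```python
# Check of the note above: var(beta1_hat) = sigma^2 / RSS_1, where RSS_1 is the
# residual sum of squares from regressing x1 on x2.
S11, S12, S22 = 12.0, 8.0, 12.0

r12_sq = S12**2 / (S11 * S22)   # squared correlation between x1 and x2
RSS_1 = S11 * (1 - r12_sq)      # auxiliary RSS = S11 - S12**2/S22 = 20/3

sigma2_hat = 0.07               # estimate obtained in part (a) below
print(sigma2_hat / RSS_1)       # estimated var(beta1_hat) = 3*0.07/20 = 0.0105
```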

Analogous to the other results in the case of simple regression, we have the following results:

3. If RSS is the residual sum of squares, then $\mathrm{RSS}/\sigma^2$ has a $\chi^2$-distribution with degrees of freedom $(n - 3)$. This result can be used to make confidence-interval statements about $\sigma^2$.

4. If $\hat\sigma^2 = \mathrm{RSS}/(n - 3)$, then $E(\hat\sigma^2) = \sigma^2$; that is, $\hat\sigma^2$ is an unbiased estimator for $\sigma^2$.

5. If we substitute $\hat\sigma^2$ for $\sigma^2$ in the expressions in result 2, we get the estimated variances and covariances. The square roots of the estimated variances are called the standard errors (to be denoted by SE). Then

$$\frac{\hat\alpha - \alpha}{\mathrm{SE}(\hat\alpha)} \qquad \frac{\hat\beta_1 - \beta_1}{\mathrm{SE}(\hat\beta_1)} \qquad \frac{\hat\beta_2 - \beta_2}{\mathrm{SE}(\hat\beta_2)}$$

each has a $t$-distribution with degrees of freedom $(n - 3)$.

In addition to results 3 to 5, which have counterparts in the case of simple regression, we have one extra item in the case of multiple regression: confidence regions and joint tests for the parameters. We have the following result.

6. $$F = \frac{S_{11}(\hat\beta_1 - \beta_1)^2 + 2S_{12}(\hat\beta_1 - \beta_1)(\hat\beta_2 - \beta_2) + S_{22}(\hat\beta_2 - \beta_2)^2}{2\hat\sigma^2}$$

has an $F$-distribution with degrees of freedom 2 and $(n - 3)$. This result can be used to construct a confidence region for $\beta_1$ and $\beta_2$ together, and to test hypotheses about $\beta_1$ and $\beta_2$ jointly.
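To make result 6 concrete, here is a small sketch of how the statistic might be evaluated. The function name joint_F and the scipy call for the critical value are my own additions; the numbers plugged in use the moments and estimates of the example below and anticipate the joint test in its part (d).

```python
from scipy.stats import f

def joint_F(b1_hat, b2_hat, b1, b2, S11, S12, S22, sigma2_hat):
    # F statistic of result 6; distributed F(2, n - 3) under
    # H0: beta1 = b1 and beta2 = b2.
    d1, d2 = b1_hat - b1, b2_hat - b2
    return (S11 * d1**2 + 2 * S12 * d1 * d2 + S22 * d2**2) / (2 * sigma2_hat)

# Joint test of beta1 = 1, beta2 = 0 with the estimates from the example below:
F_obs = joint_F(0.7, 0.2, 1.0, 0.0, 12.0, 8.0, 12.0, 0.07)
print(F_obs)                  # about 4.29
print(f.ppf(0.95, 2, 20))     # 5% critical value of F(2, 20), about 3.49
```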

We shall state later results 1 to 6 for the general case of k explanatory variables. But first we will consider an illustrative example.



Illustrative Example

A production function is specified as

$$y_i = \alpha + \beta_1 x_{1i} + \beta_2 x_{2i} + u_i \qquad u_i \sim \mathrm{IN}(0, \sigma^2)$$

where

$y$ = log output
$x_1$ = log labor input
$x_2$ = log capital input

The variables $x_i$ are nonstochastic. The following data are obtained from a sample of size $n = 23$ (23 individual firms):

$\bar{y} = 12$    $S_{11} = 12$    $S_{1y} = 10$
$\bar{x}_1 = 10$    $S_{12} = 8$    $S_{2y} = 8$
$\bar{x}_2 = 5$    $S_{22} = 12$    $S_{yy} = 10$

(a) Compute $\hat\alpha$, $\hat\beta_1$, and $\hat\beta_2$ and their standard errors. Present the regression equation.

(b) Find the 95% confidence intervals for $\alpha$, $\beta_1$, $\beta_2$, and $\sigma^2$, and test the hypotheses $\beta_1 = 1$ and $\beta_2 = 0$ separately at the 5% significance level.

(c) Find a 95% confidence region for $\beta_1$ and $\beta_2$ and show it in a figure.

(d) Test the joint hypothesis $\beta_1 = 1$, $\beta_2 = 0$ at the 5% significance level.

Solution

(a) The normal equations are

$$12\hat\beta_1 + 8\hat\beta_2 = 10$$
$$8\hat\beta_1 + 12\hat\beta_2 = 8$$

These give $\hat\beta_1 = 0.7$ and $\hat\beta_2 = 0.2$. Hence

$$\hat\alpha = \bar{y} - \hat\beta_1\bar{x}_1 - \hat\beta_2\bar{x}_2 = 12 - 0.7(10) - 0.2(5) = 4$$

$$R^2 = \frac{\hat\beta_1 S_{1y} + \hat\beta_2 S_{2y}}{S_{yy}} = \frac{0.7(10) + 0.2(8)}{10} = 0.86$$

The residual sum of squares is $\mathrm{RSS} = S_{yy}(1 - R^2) = 10(1 - 0.86) = 1.4$. Hence

$$\hat\sigma^2 = \frac{\mathrm{RSS}}{n - 3} = \frac{1.4}{20} = 0.07$$

Also,

$$r_{12}^2 = \frac{S_{12}^2}{S_{11}S_{22}} = \frac{64}{144}$$

Hence we have

5i,(l - rij) = 12

Hence

/80\ 80 20

144/ 12 3

$$\operatorname{cov}(\hat\beta_1, \hat\beta_2) = \frac{-r_{12}^2\,\sigma^2}{S_{12}(1 - r_{12}^2)} = \frac{-\sigma^2(64/144)}{8(80/144)} = -\frac{\sigma^2}{10}$$

Also, since $\bar{x}_1 = 10$ and $\bar{x}_2 = 5$, we have

$$V(\hat\alpha) = \sigma^2\left[\frac{1}{23} + (10)^2\left(\frac{3}{20}\right) - 2(10)(5)\left(\frac{1}{10}\right) + (5)^2\left(\frac{3}{20}\right)\right] = 8.7935\sigma^2$$

Substituting the estimate of $\sigma^2$, which is 0.07, in these expressions and taking the square roots, we get

$$\mathrm{SE}(\hat\beta_1) = \mathrm{SE}(\hat\beta_2) = \sqrt{0.07\left(\frac{3}{20}\right)} = 0.102$$

$$\mathrm{SE}(\hat\alpha) = \sqrt{8.7935(0.07)} = 0.78$$

Thus the regression equation is

$$\hat{y} = \underset{(0.78)}{4.0} + \underset{(0.102)}{0.7}\,x_1 + \underset{(0.102)}{0.2}\,x_2 \qquad R^2 = 0.86$$

Figures in parentheses are standard errors.
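The whole of part (a) can be reproduced from the moment matrices alone. A minimal numpy sketch (variable names are my own):

```python
import numpy as np

S = np.array([[12.0, 8.0], [8.0, 12.0]])   # [[S11, S12], [S12, S22]]
Sy = np.array([10.0, 8.0])                 # [S1y, S2y]
Syy, n = 10.0, 23
ybar, xbar = 12.0, np.array([10.0, 5.0])   # sample means of y, x1, x2

beta = np.linalg.solve(S, Sy)              # normal equations -> [0.7, 0.2]
alpha = ybar - beta @ xbar                 # 4.0
R2 = beta @ Sy / Syy                       # 0.86
sigma2 = Syy * (1 - R2) / (n - 3)          # RSS/(n - 3) = 0.07

cov_beta = sigma2 * np.linalg.inv(S)       # estimated var-cov of (b1, b2)
se_beta = np.sqrt(np.diag(cov_beta))       # both 0.102
se_alpha = np.sqrt(sigma2 / n + xbar @ cov_beta @ xbar)   # 0.78
print(alpha, beta, R2, se_beta, se_alpha)
```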

(b) Using the $t$-distribution with 20 d.f., we get the 95% confidence intervals for $\alpha$, $\beta_1$, and $\beta_2$ as

$$\hat\alpha \pm 2.086\,\mathrm{SE}(\hat\alpha) = 4.0 \pm 1.63 = (2.37,\, 5.63)$$
$$\hat\beta_1 \pm 2.086\,\mathrm{SE}(\hat\beta_1) = 0.7 \pm 0.21 = (0.49,\, 0.91)$$
$$\hat\beta_2 \pm 2.086\,\mathrm{SE}(\hat\beta_2) = 0.2 \pm 0.21 = (-0.01,\, 0.41)$$

The hypothesis $\beta_1 = 1.0$ will be rejected at the 5% significance level, since $\beta_1 = 1.0$ lies outside the 95% confidence interval for $\beta_1$. The hypothesis $\beta_2 = 0$ will not be rejected, because $\beta_2 = 0$ is a point inside the 95% confidence interval for $\beta_2$. Using the $\chi^2$ distribution for 20 d.f., we have

$$\operatorname{Prob}\left(9.59 < \frac{\mathrm{RSS}}{\sigma^2} < 34.2\right) = 0.95$$
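A sketch of the interval arithmetic in part (b), assuming scipy is available. The final interval for $\sigma^2$ follows from inverting the probability statement above; that evaluation is my own, not a figure quoted from the text.

```python
from scipy.stats import t, chi2

n, RSS = 23, 1.4
tc = t.ppf(0.975, n - 3)                   # 2.086 for 20 d.f.
for est, se in [(4.0, 0.78), (0.7, 0.102), (0.2, 0.102)]:
    print(est - tc * se, est + tc * se)    # intervals for alpha, beta1, beta2

lo, hi = chi2.ppf([0.025, 0.975], n - 3)   # 9.59 and 34.17
print(RSS / hi, RSS / lo)                  # interval for sigma^2: ~(0.041, 0.146)
```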



