which has a t-distribution with (n - 3) degrees of freedom.

An alternative procedure is to derive URSS and RRSS and use the F-test. URSS is the residual sum of squares we obtain when we estimate α, β₁, and β₂. To get RRSS, we have to use the restriction β₁ + β₂ - 1 = 0. The way to do this is to eliminate β₂, since β₂ = 1 - β₁. Thus we get the regression equation as

log X = α + β₁ log L + (1 - β₁) log K + u

or

(log X - log K) = α + β₁(log L - log K) + u

Thus to get RRSS we regress (log X - log K) on (log L - log K). The residual sum of squares from this equation gives us the required RRSS.
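The URSS/RRSS procedure can be sketched numerically. Everything below (sample size, parameter values, noise level, and the helper `rss`) is an assumption for illustration, not part of the text's example:

```python
import numpy as np

# Sketch of the URSS/RRSS test for constant returns to scale
# (beta1 + beta2 = 1) in log X = alpha + beta1 log L + beta2 log K + u.
rng = np.random.default_rng(0)
n = 200
logL = rng.normal(2.0, 0.5, n)
logK = rng.normal(3.0, 0.7, n)
u = rng.normal(0.0, 0.1, n)
logX = 1.0 + 0.6 * logL + 0.4 * logK + u   # true betas sum to 1

def rss(y, regressors):
    """Residual sum of squares from an OLS fit of y on the regressors plus a constant."""
    Z = np.column_stack([np.ones(len(y))] + list(regressors))
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    e = y - Z @ beta
    return e @ e

urss = rss(logX, [logL, logK])            # unrestricted: estimate alpha, beta1, beta2
rrss = rss(logX - logK, [logL - logK])    # restricted: beta2 = 1 - beta1 imposed
F = ((rrss - urss) / 1) / (urss / (n - 3))  # F with (1, n - 3) d.f.
```

Since the restricted model is a special case of the unrestricted one, RRSS can never fall below URSS, and under the null hypothesis F should be small.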

The same procedure applies if we have two linear restrictions. Consider, for instance, the case of the multiple regression equation

y = α + β₁x₁ + β₂x₂ + β₃x₃ + u

and we need to test the restrictions

β₁ + β₂ + β₃ = 1 and β₂ - 2β₃ = 0

(Hypotheses that cannot be expressed as restrictions of this kind on a single maintained model pose a problem of testing nonnested hypotheses. This problem is treated briefly in Chapter 12.)

Tests for Linear Functions of Parameters

We have until now discussed tests for parameters. Very often we need tests for functions of parameters. The functions we need to consider can be linear functions or nonlinear functions. We discuss linear functions first. Suppose that we have estimated a production function in a log-linear form:

log X = α + β₁ log L + β₂ log K + u

where X is the output, L the labor input, and K the capital input. Then a test for constant returns to scale is a test of the hypothesis β₁ + β₂ = 1. We can use a t-test to test this hypothesis as follows. We get the least squares estimates β̂₁ and β̂₂ and define

σ²Cᵢⱼ = cov(β̂ᵢ, β̂ⱼ)    i, j = 1, 2

Then, under the null hypothesis β₁ + β₂ - 1 = 0, we have the result that β̂₁ + β̂₂ - 1 is normally distributed with mean 0 and variance σ²(C₁₁ + 2C₁₂ + C₂₂).

Since RSS/σ² has an independent χ²-distribution with (n - 3) degrees of freedom, the t-statistic to test the hypothesis is given by

t = (β̂₁ + β̂₂ - 1) / [σ̂ √(C₁₁ + 2C₁₂ + C₂₂)]

where σ̂² = RSS/(n - 3).
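This t-statistic can be computed numerically from C = (Z′Z)⁻¹. The simulated design below (sample size, coefficient values, noise level) is an assumption for illustration:

```python
import numpy as np

# Numerical sketch of the t-test for beta1 + beta2 = 1, using
# cov(estimates) = sigma^2 * C with C = (Z'Z)^{-1}.
rng = np.random.default_rng(1)
n = 150
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)     # correlated regressors
y = 1.0 + 0.3 * x1 + 0.7 * x2 + rng.normal(scale=0.2, size=n)  # betas sum to 1

Z = np.column_stack([np.ones(n), x1, x2])
C = np.linalg.inv(Z.T @ Z)             # covariance factor
b = C @ Z.T @ y                        # OLS estimates (alpha, b1, b2)
rss_val = (y - Z @ b) @ (y - Z @ b)
s2 = rss_val / (n - 3)                 # sigma-hat^2 = RSS/(n - 3)
# Var(b1 + b2 - 1) = sigma^2 (C11 + 2 C12 + C22); indices 1, 2 are the slopes
var_lin = s2 * (C[1, 1] + 2 * C[1, 2] + C[2, 2])
t = (b[1] + b[2] - 1.0) / np.sqrt(var_lin)
```

Under the null hypothesis (which holds in this simulated design) t should be an unremarkable draw from a t-distribution with n - 3 degrees of freedom.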

We estimate the restricted equation

y = α + β₁x + e    where x = x₁ + 0.6x₂

We now have

Sxx = S11 + (0.6)²S22 + 2(0.6)S12 = 12 + (0.6)²(12) + 2(0.6)(8) = 25.92

Sxy = S1y + 0.6S2y = 10 + 0.6(8) = 14.8

Hence

β̂₁ = Sxy/Sxx = 14.8/25.92 ≈ 0.571

RRSS = Syy - β̂₁Sxy = 10 - 8.45 = 1.55

Thus RRSS - URSS = 1.55 - 1.40 = 0.15 and the F-statistic is

F = (0.15/1) / (1.4/20) = 0.15/0.07 ≈ 2.14

which, with d.f. 1 and 20, is not significant at the 5% level.
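The arithmetic above can be checked directly from the cross-product sums quoted in the example; a minimal sketch:

```python
# Cross-product sums given in the worked example (Section 4.3 data):
S11, S22, S12 = 12.0, 12.0, 8.0
S1y, S2y, Syy = 10.0, 8.0, 10.0

# Restricted model: substitute beta2 = 0.6*beta1, i.e. x = x1 + 0.6*x2
Sxx = S11 + 0.6**2 * S22 + 2 * 0.6 * S12    # = 25.92
Sxy = S1y + 0.6 * S2y                       # = 14.8
beta_hat = Sxy / Sxx
RRSS = Syy - beta_hat * Sxy                 # approx 1.55

URSS, df = 1.4, 20
F = (RRSS - URSS) / (URSS / df)             # approx 0.15/0.07 = 2.14
```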

By the alternative method, we consider β̂₂ - 0.6β̂₁. It is normally distributed with mean zero (under H₀) and, using the expression for the covariance matrix of the least squares estimators, we note that its variance is σ²[C₂₂ + (0.6)²C₁₁ - 2(0.6)C₁₂].

Note that these restrictions can be written as

β₂ = 2β₃ and β₁ = 1 - β₂ - β₃ = 1 - 3β₃. Substituting these in the original equation, we get

y = α + (1 - 3β₃)x₁ + 2β₃x₂ + β₃x₃ + e

or

(y - x₁) = α + β₃(-3x₁ + 2x₂ + x₃) + e

Thus we get the restricted residual sum of squares RRSS by running a regression of (y - x₁) on (-3x₁ + 2x₂ + x₃) with a constant term.
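The substitution device for the two restrictions can be sketched as follows. The data are simulated, and the true coefficient values are assumptions chosen to satisfy both restrictions:

```python
import numpy as np

# Impose beta1 + beta2 + beta3 = 1 and beta2 = 2*beta3 by substitution:
# regress (y - x1) on (-3*x1 + 2*x2 + x3) with a constant term.
rng = np.random.default_rng(2)
n = 120
x1, x2, x3 = rng.normal(size=(3, n))
# beta3 = 0.2  ->  beta2 = 2(0.2) = 0.4,  beta1 = 1 - 3(0.2) = 0.4
y = 0.5 + 0.4 * x1 + 0.4 * x2 + 0.2 * x3 + rng.normal(scale=0.1, size=n)

z = -3 * x1 + 2 * x2 + x3
Z = np.column_stack([np.ones(n), z])
coef, *_ = np.linalg.lstsq(Z, y - x1, rcond=None)
resid = (y - x1) - Z @ coef
RRSS = resid @ resid
beta3_hat = coef[1]    # estimate of beta3 from the restricted fit
```

Because the restrictions hold in the simulated design, the single slope of this restricted regression recovers β₃, and RRSS is the restricted residual sum of squares needed for the F-test.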

Illustrative Example

Consider again the illustrative example in Section 4.3. Suppose that the problem is to test the hypothesis:

H₀: β₂ = 0.6β₁ against H₁: β₂ ≠ 0.6β₁

at the 5% significance level. To compute the restricted estimates, we substitute β₂ = 0.6β₁ and estimate the equation

y = α + β₁x₁ + 0.6β₁x₂ + e = α + β₁x + e    where x = x₁ + 0.6x₂


Since

σ²[C₂₂ + (0.6)²C₁₁ - 2(0.6)C₁₂] = σ²(1/80)[12 + (0.6)²(12) - 2(0.6)(-8)] = 0.324σ²

and

β̂₂ - 0.6β̂₁ = 0.2 - 0.6(0.7) = -0.22

we have the F-statistic

F = [(-0.22)²/0.324] / (1.4/20) = 0.15/0.07 ≈ 2.14

which is the same as the one obtained earlier.
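This variance route can be verified from the cross-product matrix of the Section 4.3 data (S11 = S22 = 12, S12 = 8, quoted in the example); a minimal sketch:

```python
import numpy as np

# C = S^{-1} for the example's cross-product matrix; then
# Var(b2 - 0.6 b1) = sigma^2 [C22 + (0.6)^2 C11 - 2(0.6) C12].
S = np.array([[12.0, 8.0],
              [8.0, 12.0]])
C = np.linalg.inv(S)
var_factor = C[1, 1] + 0.6**2 * C[0, 0] - 2 * 0.6 * C[0, 1]   # = 0.324

b1, b2 = 0.7, 0.2          # unrestricted estimates from Section 4.3
diff = b2 - 0.6 * b1       # = -0.22
F = (diff**2 / var_factor) / (1.4 / 20)   # same F as the RRSS route
```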

4.9 Omission of Relevant Variables and Inclusion of Irrelevant Variables

Until now we have assumed that the multiple regression equation we are estimating includes all the relevant explanatory variables. In practice, this is rarely the case. Sometimes some relevant variables are not included due to oversight or lack of measurements. At other times some irrelevant variables are included. What we would like to know is how our inferences change when these problems are present.

Omission of Relevant Variables

Let us first consider the omission of relevant variables. Suppose that the true equation is

y = β₁x₁ + β₂x₂ + u    (4.15)

Instead, we omit x₂ and estimate the equation

y = β₁x₁ + v

This will be referred to as the "misspecified model." The estimate of β₁ we get is

β̂₁ = Σx₁y / Σx₁²

Substituting the expression for y from (4.15) in this, we get

β̂₁ = Σx₁(β₁x₁ + β₂x₂ + u) / Σx₁² = β₁ + β₂(Σx₁x₂ / Σx₁²) + Σx₁u / Σx₁²

Since E(Σx₁u) = 0, we get

E(β̂₁) = β₁ + b₂₁β₂    (4.16)

where b₂₁ = Σx₁x₂ / Σx₁² is the regression coefficient from a regression of x₂ on x₁.
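Equation (4.16) can be illustrated by simulation. The design below (coefficient values, correlation between x₁ and x₂, sample sizes) is an assumption for illustration:

```python
import numpy as np

# Monte Carlo check of (4.16): omitting x2 shifts the mean of the
# misspecified estimate of beta1 by b21*beta2, b21 = Sum(x1*x2)/Sum(x1^2).
rng = np.random.default_rng(3)
n, reps = 400, 500
beta1, beta2 = 1.0, 2.0
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)     # x2 correlated with x1
b21 = (x1 @ x2) / (x1 @ x1)            # regression coefficient of x2 on x1

est = np.empty(reps)
for r in range(reps):
    u = rng.normal(size=n)
    y = beta1 * x1 + beta2 * x2 + u    # true model (4.15)
    est[r] = (x1 @ y) / (x1 @ x1)      # misspecified: regress y on x1 alone

bias = est.mean() - beta1              # should approximate b21 * beta2
```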


