
4.2 A Model with Two Explanatory Variables

Perfect collinearity arises when the correlation coefficient between $x_1$ and $x_2$ is $+1$ or $-1$. In our analysis in this chapter we rule out perfect collinearity, but not the case where the correlation between the variables is high but not perfect. When it comes to the analysis of the effect of several variables $x_1, x_2, \ldots$ on $y$, we have to distinguish between joint effects and partial effects. For instance, suppose that we are estimating the effect of price and income on quantity demanded; then we have to consider the joint effect of income and price as well as the partial effects:

1. Effect of price on quantity demanded holding income constant.

2. Effect of income on quantity demanded holding price constant.

These are the problems that we will be dealing with in this chapter. If price and income are highly correlated with each other, it is intuitively clear that it would be difficult to disentangle the separate effects of the two variables.

We start our analysis with the case of two explanatory variables and then present the formulas for the case of $k$ explanatory variables.

Consider the model

$y_i = \alpha + \beta_1 x_{1i} + \beta_2 x_{2i} + u_i \qquad i = 1, 2, \ldots, n$   (4.1)

The assumptions we have made about the error term imply that

$E(u) = 0 \qquad \mathrm{cov}(x_1, u) = 0 \qquad \mathrm{cov}(x_2, u) = 0$

As in the case of the simple regression model discussed in Section 3.3, we can replace these assumptions by their sample counterparts.

Let $\hat{\alpha}$, $\hat{\beta}_1$, and $\hat{\beta}_2$ be the estimators of $\alpha$, $\beta_1$, and $\beta_2$, respectively. The sample counterpart of $u_i$ is the residual

$\hat{u}_i = y_i - \hat{\alpha} - \hat{\beta}_1 x_{1i} - \hat{\beta}_2 x_{2i}$

The three equations to determine $\hat{\alpha}$, $\hat{\beta}_1$, and $\hat{\beta}_2$ are obtained by replacing the population assumptions by their sample counterparts:

Population Assumption            Sample Counterpart
$E(u) = 0$                       $(1/n) \sum \hat{u}_i = 0$  or  $\sum \hat{u}_i = 0$
$\mathrm{cov}(u, x_1) = 0$       $(1/n) \sum x_{1i} \hat{u}_i = 0$  or  $\sum x_{1i} \hat{u}_i = 0$
$\mathrm{cov}(u, x_2) = 0$       $(1/n) \sum x_{2i} \hat{u}_i = 0$  or  $\sum x_{2i} \hat{u}_i = 0$

These equations can also be obtained by the method of least squares described below, and are referred to as the "normal equations."
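As an illustration, the following short numpy sketch (with artificial data; the variable names are ours, not from the text) fits the two-variable model by least squares and checks that the residuals satisfy the three sample-counterpart conditions:

```python
import numpy as np

# Artificial data, purely for illustration
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 + 1.5 * x1 - 0.8 * x2 + rng.normal(scale=0.5, size=n)

# Least squares fit of y on a constant, x1, and x2
X = np.column_stack([np.ones(n), x1, x2])
alpha_hat, b1_hat, b2_hat = np.linalg.lstsq(X, y, rcond=None)[0]

# Residuals u_hat_i = y_i - alpha_hat - b1_hat * x1_i - b2_hat * x2_i
u_hat = y - alpha_hat - b1_hat * x1 - b2_hat * x2

# Each of these sums should be numerically zero
print(u_hat.sum(), (x1 * u_hat).sum(), (x2 * u_hat).sum())
```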



The Least Squares Method

The least squares method says that we should choose the estimators $\hat{\alpha}$, $\hat{\beta}_1$, $\hat{\beta}_2$ of $\alpha$, $\beta_1$, $\beta_2$ so as to minimize

$Q = \sum (y_i - \hat{\alpha} - \hat{\beta}_1 x_{1i} - \hat{\beta}_2 x_{2i})^2$

Differentiate $Q$ with respect to $\hat{\alpha}$, $\hat{\beta}_1$, and $\hat{\beta}_2$ and equate the derivatives to zero. We get

$\partial Q / \partial \hat{\alpha} = 0$:  $\sum 2(y_i - \hat{\alpha} - \hat{\beta}_1 x_{1i} - \hat{\beta}_2 x_{2i})(-1) = 0$   (4.2)

$\partial Q / \partial \hat{\beta}_1 = 0$:  $\sum 2(y_i - \hat{\alpha} - \hat{\beta}_1 x_{1i} - \hat{\beta}_2 x_{2i})(-x_{1i}) = 0$   (4.3)

$\partial Q / \partial \hat{\beta}_2 = 0$:  $\sum 2(y_i - \hat{\alpha} - \hat{\beta}_1 x_{1i} - \hat{\beta}_2 x_{2i})(-x_{2i}) = 0$   (4.4)
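Dividing each of (4.2)–(4.4) by $-2$ and writing the term in parentheses as the residual $\hat{u}_i$ shows that these are exactly the sample-counterpart conditions listed above:

$\sum \hat{u}_i = 0, \qquad \sum x_{1i} \hat{u}_i = 0, \qquad \sum x_{2i} \hat{u}_i = 0$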

These three equations, as mentioned earlier, are called the "normal equations." They can be simplified as follows. Equation (4.2) can be written as

$\sum y_i = n\hat{\alpha} + \hat{\beta}_1 \sum x_{1i} + \hat{\beta}_2 \sum x_{2i}$

or, dividing through by $n$,

$\bar{y} = \hat{\alpha} + \hat{\beta}_1 \bar{x}_1 + \hat{\beta}_2 \bar{x}_2$   (4.5)

where $\bar{y} = \sum y_i / n$, $\bar{x}_1 = \sum x_{1i} / n$, and $\bar{x}_2 = \sum x_{2i} / n$.

Equation (4.3) can be written as

$\sum x_{1i} y_i = \hat{\alpha} \sum x_{1i} + \hat{\beta}_1 \sum x_{1i}^2 + \hat{\beta}_2 \sum x_{1i} x_{2i}$

Substituting the value of $\hat{\alpha}$ from (4.5) into this equation, we get

$\sum x_{1i} y_i = n\bar{x}_1 (\bar{y} - \hat{\beta}_1 \bar{x}_1 - \hat{\beta}_2 \bar{x}_2) + \hat{\beta}_1 \sum x_{1i}^2 + \hat{\beta}_2 \sum x_{1i} x_{2i}$   (4.6)

We can simplify this equation by the use of the following notation. Let us define

$S_{11} = \sum x_{1i}^2 - n\bar{x}_1^2 \qquad\quad S_{1y} = \sum x_{1i} y_i - n\bar{x}_1 \bar{y}$

$S_{12} = \sum x_{1i} x_{2i} - n\bar{x}_1 \bar{x}_2 \qquad S_{2y} = \sum x_{2i} y_i - n\bar{x}_2 \bar{y}$

$S_{22} = \sum x_{2i}^2 - n\bar{x}_2^2 \qquad\quad S_{yy} = \sum y_i^2 - n\bar{y}^2$

Equation (4.6) can be written as

$S_{1y} = \hat{\beta}_1 S_{11} + \hat{\beta}_2 S_{12}$   (4.7)

By a similar simplification, equation (4.4) can be written as

$S_{2y} = \hat{\beta}_1 S_{12} + \hat{\beta}_2 S_{22}$   (4.8)

Now we can solve these two equations to get $\hat{\beta}_1$ and $\hat{\beta}_2$. We get

$\hat{\beta}_1 = \dfrac{S_{1y} S_{22} - S_{2y} S_{12}}{\Delta}$


h = "2, - S,2S,, 4 5

where = 5„522 - S12. Once we obtain p, and p2 we can get a from equation (4.5). We have

$\hat{\alpha} = \bar{y} - \hat{\beta}_1 \bar{x}_1 - \hat{\beta}_2 \bar{x}_2$

Thus the computational procedure is as follows:

1. Obtain all the means: $\bar{y}$, $\bar{x}_1$, $\bar{x}_2$.

2. Obtain all the sums of squares and sums of products: $\sum x_{1i}^2$, $\sum x_{2i}^2$, $\sum x_{1i} x_{2i}$, and so on.

3. Obtain $S_{11}$, $S_{12}$, $S_{22}$, $S_{1y}$, $S_{2y}$, and $S_{yy}$.

4. Solve equations (4.7) and (4.8) to get $\hat{\beta}_1$ and $\hat{\beta}_2$.

5. Substitute these in (4.5) to get $\hat{\alpha}$.
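This five-step procedure translates directly into code. A minimal numpy sketch (the function name is ours; y, x1, x2 are equal-length one-dimensional arrays):

```python
import numpy as np

def two_variable_ols(y, x1, x2):
    """Five-step least squares procedure for the model y = a + b1*x1 + b2*x2 + u."""
    n = len(y)
    # Step 1: means
    ybar, x1bar, x2bar = y.mean(), x1.mean(), x2.mean()
    # Steps 2-3: sums of squares and cross products in deviation form
    S11 = (x1 ** 2).sum() - n * x1bar ** 2
    S22 = (x2 ** 2).sum() - n * x2bar ** 2
    S12 = (x1 * x2).sum() - n * x1bar * x2bar
    S1y = (x1 * y).sum() - n * x1bar * ybar
    S2y = (x2 * y).sum() - n * x2bar * ybar
    # Step 4: solve the normal equations (4.7)-(4.8)
    delta = S11 * S22 - S12 ** 2
    b1 = (S1y * S22 - S2y * S12) / delta
    b2 = (S2y * S11 - S1y * S12) / delta
    # Step 5: intercept from (4.5)
    a = ybar - b1 * x1bar - b2 * x2bar
    return a, b1, b2

# Example: a_hat, b1_hat, b2_hat = two_variable_ols(y, x1, x2)
```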

In the case of simple regression we also defined the following:

residual sum of squares $= S_{yy} - \hat{\beta} S_{xy}$

regression sum of squares $= \hat{\beta} S_{xy}$

The analogous expressions in multiple regression are

$\mathrm{RSS} = S_{yy} - \hat{\beta}_1 S_{1y} - \hat{\beta}_2 S_{2y}$

regression sum of squares $= \hat{\beta}_1 S_{1y} + \hat{\beta}_2 S_{2y}$

The ratio of the regression sum of squares to $S_{yy}$, written $R^2_{y \cdot 12}$, is called the coefficient of multiple determination, and its positive square root is called the multiple correlation coefficient. The first subscript is the explained variable. The subscripts after the dot are the explanatory variables. To avoid cumbersome notation we have written $R^2_{y \cdot 12}$ instead of $R^2_{y \cdot x_1 x_2}$. Since it is only the $x$'s that have subscripts, there is no confusion in this notation.

The procedure in the case of three explanatory variables is analogous. The normal equations give

$\hat{\alpha} = \bar{y} - \hat{\beta}_1 \bar{x}_1 - \hat{\beta}_2 \bar{x}_2 - \hat{\beta}_3 \bar{x}_3$

$S_{1y} = \hat{\beta}_1 S_{11} + \hat{\beta}_2 S_{12} + \hat{\beta}_3 S_{13}$

$S_{2y} = \hat{\beta}_1 S_{12} + \hat{\beta}_2 S_{22} + \hat{\beta}_3 S_{23}$

$S_{3y} = \hat{\beta}_1 S_{13} + \hat{\beta}_2 S_{23} + \hat{\beta}_3 S_{33}$
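With three or more explanatory variables it is easier to solve the normal equations in deviation form as a small linear system. A sketch along the same lines as the function above (the names are illustrative, not from the text), which also returns the coefficient of multiple determination:

```python
import numpy as np

def deviation_form_ols(y, X):
    """Solve the normal equations in S-notation (deviation) form.

    y is an (n,) array; X is an (n, k) array whose columns are x_1, ..., x_k.
    Returns the intercept, the slope vector, and R^2.
    """
    n = len(y)
    xbar = X.mean(axis=0)
    ybar = y.mean()
    # S[j, m] = sum x_j x_m - n * xbar_j * xbar_m ;  Sy[j] = sum x_j y - n * xbar_j * ybar
    S = X.T @ X - n * np.outer(xbar, xbar)
    Sy = X.T @ y - n * xbar * ybar
    Syy = y @ y - n * ybar ** 2
    beta = np.linalg.solve(S, Sy)     # normal equations: S @ beta = Sy
    alpha = ybar - xbar @ beta        # intercept, as in (4.5)
    r_squared = (beta @ Sy) / Syy     # regression sum of squares over S_yy
    return alpha, beta, r_squared

# Example: alpha_hat, beta_hat, r2 = deviation_form_ols(y, np.column_stack([x1, x2, x3]))
```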


