




Table 4.4  Estimates of Demand for Food Equationsᵃ

                               Equation 1                           Equation 2ᵇ
                    1927-41    1948-62    All Obs.      1927-41    1948-62    All Obs.
Constant             4.555      5.052      4.050         4.058     16.632      8.029
                   (22.67)     (5.61)    (29.55)        (0.54)     (0.61)     (4.47)
β₁ (price)          -0.235     -0.237     -0.120        -0.123     -2.745     -0.996
                   (-4.41)    (-1.54)    (-2.95)       (-0.07)    (-0.47)    (-2.51)
β₂ (income)          0.243      0.141      0.242         0.368     -2.416     -0.718
                   (10.63)     (3.03)    (17.85)        (0.20)    (-0.40)    (-1.66)
β₃ (interaction)                                        -0.028      0.554      0.211
                                                       (-0.07)     (0.43)     (2.22)
R²                   0.9066     0.8741     0.9731        0.9066     0.8762     0.9775
Fᶜ                  58.2       41.7                     35.6       26.0      374.5
10 × RSS             0.1151     0.0544     0.2866        0.1151     0.0535     0.2412
d.f.                12         12         27            11         11         26
10³ × σ̂²ᵈ           0.9594     0.4534     1.0613        1.0462     0.4866     0.9278

ᵃFigures in parentheses are t-ratios, not standard errors.
ᵇFrom Equation 2, the partial r²'s are all very low for 1927-41 and 1948-62, although the R²'s are high.
ᶜF is the F-statistic for testing the overall fit, that is, for R². This is discussed in Section 4.8.
ᵈσ̂² = RSS/d.f. The importance of this is discussed in Section 4.9.

The interaction term has a very low t-ratio in the equations for 1927-41 and 1948-62 separately. Hence we will consider only the equation for the combined data. For this equation the income coefficient has a low t-ratio and its sign also appears to be wrong at first sight. However, the equation does give positive income elasticities. The income elasticity is given by

$$\frac{\partial \log q}{\partial \log y} = -0.718 + 0.211 \log p$$

In the data, $\log p$ ranges from 4.33 to 4.66. Thus the income elasticity ranges from 0.195 to 0.265. The price elasticity is given by

$$\frac{\partial \log q}{\partial \log p} = -0.996 + 0.211 \log y$$

Thus, as income increases, the demand for food becomes more price inelastic. For the data in Table 4.3, $\log y$ ranges from 3.78 to 4.66. Thus the price elasticity ranges from -0.198 to -0.013.



4.7 Prediction in the Multiple Regression Model

The formulas for prediction in the multiple regression model are similar to those for simple regression, except that to compute the standard error of the predicted value we need the variances and covariances of all the regression coefficients. Again, we will present the expression for the standard error in the case of two explanatory variables and then the expression for the general case of k explanatory variables. We do not, however, need to compute this general expression, because there is an easier way of generating the standard error, which we describe in Chapter 8 (see Section 8.5). Let the estimated regression equation be

$$\hat{y} = \hat{\alpha} + \hat{\beta}_1 x_1 + \hat{\beta}_2 x_2$$

Now consider the prediction of the value $y_0$ of $y$ for given values $x_{10}$ and $x_{20}$ of $x_1$ and $x_2$, respectively. These could be values at some future date. Then we have

$$y_0 = \alpha + \beta_1 x_{10} + \beta_2 x_{20} + u_0$$

Consider the predictor

$$\hat{y}_0 = \hat{\alpha} + \hat{\beta}_1 x_{10} + \hat{\beta}_2 x_{20}$$

The prediction error is

$$\hat{y}_0 - y_0 = (\hat{\alpha} - \alpha) + (\hat{\beta}_1 - \beta_1) x_{10} + (\hat{\beta}_2 - \beta_2) x_{20} - u_0$$

Since $E(\hat{\alpha} - \alpha)$, $E(\hat{\beta}_1 - \beta_1)$, $E(\hat{\beta}_2 - \beta_2)$, and $E(u_0)$ are all equal to zero, we have $E(\hat{y}_0 - y_0) = 0$. Thus the predictor $\hat{y}_0$ is unbiased. Note that what we are saying is $E(\hat{y}_0) = E(y_0)$ (since both $\hat{y}_0$ and $y_0$ are random variables). The variance of the prediction error is

$$\sigma^2\left(1 + \frac{1}{n}\right) + (x_{10} - \bar{x}_1)^2 \,\text{var}(\hat{\beta}_1) + 2(x_{10} - \bar{x}_1)(x_{20} - \bar{x}_2)\,\text{cov}(\hat{\beta}_1, \hat{\beta}_2) + (x_{20} - \bar{x}_2)^2 \,\text{var}(\hat{\beta}_2)$$

In the case of $k$ explanatory variables, this is

$$\sigma^2\left(1 + \frac{1}{n}\right) + \sum_i \sum_j (x_{i0} - \bar{x}_i)(x_{j0} - \bar{x}_j)\,\text{cov}(\hat{\beta}_i, \hat{\beta}_j)$$

where $\text{cov}(\hat{\beta}_i, \hat{\beta}_i) = \text{var}(\hat{\beta}_i)$. We estimate $\sigma^2$ by $\text{RSS}/(n - 3)$ in the case of two explanatory variables and by $\text{RSS}/(n - k - 1)$ in the general case.
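In practice this variance is a quadratic form in the deviations of the new point from the sample means. A minimal Python sketch (the function and argument names are illustrative, not from the text):

```python
import numpy as np

def pred_error_variance(sigma2: float, n: int,
                        dev: np.ndarray, cov_beta: np.ndarray) -> float:
    """Estimated variance of the prediction error with k explanatory variables.

    sigma2   -- estimate of sigma^2, e.g. RSS / (n - k - 1)
    n        -- number of observations used in estimation
    dev      -- length-k vector of deviations (x_i0 - xbar_i)
    cov_beta -- k x k covariance matrix of the slope estimates
                (its diagonal holds the variances)
    """
    # sigma^2 (1 + 1/n) + sum_i sum_j (x_i0 - xbar_i)(x_j0 - xbar_j) cov(b_i, b_j)
    return sigma2 * (1 + 1 / n) + float(dev @ cov_beta @ dev)
```

With two explanatory variables the quadratic form reduces to the three-term expression above.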



Illustrative Example

Again consider the illustrative example in Section 4.3. The estimated regression equation is

$$\hat{y} = 4.0 + 0.7x_1 + 0.2x_2$$

Consider the prediction of $y_0$ for $x_{10} = 12$ and $x_{20} = 7$. We have

$$\hat{y}_0 = 4.0 + 0.7(12) + 0.2(7) = 13.8$$

Note that

$$x_{10} - \bar{x}_1 = 12 - 10 = 2 \qquad x_{20} - \bar{x}_2 = 7 - 5 = 2$$

Using the expressions for $\text{var}(\hat{\beta}_1)$, $\text{var}(\hat{\beta}_2)$, $\text{cov}(\hat{\beta}_1, \hat{\beta}_2)$, and $\hat{\sigma}^2$ derived in Section 4.3 (with $n = 23$, $\hat{\sigma}^2 = 0.07$, $\text{var}(\hat{\beta}_1) = \text{var}(\hat{\beta}_2) = \frac{3}{20}\hat{\sigma}^2$, and $\text{cov}(\hat{\beta}_1, \hat{\beta}_2) = -\frac{1}{10}\hat{\sigma}^2$), we get the estimated variance of the prediction error as

$$\left(1 + \frac{1}{23} + 4 \cdot \frac{3}{20} - 8 \cdot \frac{1}{10} + 4 \cdot \frac{3}{20}\right)(0.07) = 0.101$$

The standard error of the prediction is $\sqrt{0.101} = 0.318$. Thus the 95% confidence interval for the prediction is

$$13.8 \pm 2.086(0.318) \quad \text{or} \quad 13.8 \pm 0.66 \quad \text{or} \quad (13.14, 14.46)$$

Comment

In the case of simple regression we said (in Section 3.7) that the variance of the prediction error increases as we increase the distance of the point $x_0$ from $\bar{x}$. In the case of multiple regression we cannot say that the variance of the prediction error increases with the Euclidean distance $[(x_{10} - \bar{x}_1)^2 + (x_{20} - \bar{x}_2)^2]^{1/2}$. This is because there is the covariance term as well. For instance, in our example let us change $x_{20}$ to 3. Now $x_{20} - \bar{x}_2 = -2$. The Euclidean distance is the same as before, $\sqrt{2^2 + (-2)^2}$. However, the variance of the prediction error is now

$$\left(1 + \frac{1}{23} + 4 \cdot \frac{3}{20} + 8 \cdot \frac{1}{10} + 4 \cdot \frac{3}{20}\right)(0.07) = 0.213$$

which is larger, because the covariance term now enters with a positive sign.
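The arithmetic of this example and of the comment above can be checked with a short Python sketch (illustrative; the variance and covariance values are the Section 4.3 quantities used in the calculation, restated as multiples of $\hat{\sigma}^2$):

```python
import numpy as np
from scipy import stats

sigma2, n = 0.07, 23                 # error variance estimate and sample size
var_b = 3 * sigma2 / 20              # var(beta1_hat) = var(beta2_hat)
cov_b = -sigma2 / 10                 # cov(beta1_hat, beta2_hat)

def pred_var(d1, d2):
    """Prediction error variance at deviations d1 = x10 - 10, d2 = x20 - 5."""
    return sigma2 * (1 + 1 / n) + d1**2 * var_b + 2 * d1 * d2 * cov_b + d2**2 * var_b

y0 = 4.0 + 0.7 * 12 + 0.2 * 7        # point prediction: 13.8
se = np.sqrt(pred_var(2, 2))         # sqrt(0.101) = 0.318
t = stats.t.ppf(0.975, df=n - 3)     # 2.086 with 20 d.f.
print(f"({y0 - t * se:.2f}, {y0 + t * se:.2f})")   # (13.14, 14.46)

# Same Euclidean distance, opposite sign on the covariance term:
print(round(pred_var(2, -2), 3))     # 0.213
```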


