




1980. Thus the solution to this problem is not a deletion of observations, as in the previous example of the consumption function, but a reformulation of the problem taking account of labor force participation rates, time lags, and so on. Such an analysis is too complicated for our purpose here. The example serves to illustrate the point that not all "outliers" should be deleted.

3.9 Alternative Functional Forms for Regression Equations

We saw earlier, with reference to Figure 3.6(ii), that the relationship between y and x can sometimes be nonlinear rather than linear. In this case we have to assume an appropriate functional form for the relationship. There are several functional forms that, after some transformations of the variables, can be brought into the usual linear regression framework that we have been discussing.

For instance, for the data points depicted in Figure 3.7(a), where y is increasing more slowly than x, a possible functional form is y = α + β log x. This is called a semilog form, since it involves the logarithm of only one of the two variables x and y. In this case, if we redefine a variable X = log x, the equation becomes y = α + βX. Thus we have a linear regression model with explained variable y and explanatory variable X = log x.
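As a minimal sketch (with made-up data), the semilog form can be estimated by ordinary least squares once x has been replaced by log x:

```python
# Hypothetical illustration: fitting the semilog form y = alpha + beta*log(x)
# by ordinary least squares after transforming x. The data are invented.
import numpy as np

x = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
y = np.array([2.1, 3.0, 3.8, 4.4, 5.1, 5.7])   # y grows more slowly than x

X = np.log(x)                      # redefine the explanatory variable
beta, alpha = np.polyfit(X, y, 1)  # slope and intercept of the linear fit
print(f"alpha = {alpha:.3f}, beta = {beta:.3f}")
```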

For the data points depicted in Figure 3.7(b), where y is increasing faster than x, a possible functional form is y = Ae^(βx). In this case we take logs of both sides and get another kind of semilog specification.

log y = log A + βx

If we define Y = log y and α = log A, we have

Y = α + βx
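A short sketch, again on invented data, of how this exponential form is fitted by regressing log y on x and then recovering A = e^α:

```python
# Minimal sketch: fitting y = A*exp(beta*x) by regressing log(y) on x,
# then recovering A = exp(alpha). The data are invented.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.7, 7.5, 20.0, 56.0, 150.0, 410.0])  # y grows faster than x

Y = np.log(y)                      # transformed dependent variable
beta, alpha = np.polyfit(x, Y, 1)  # linear fit of log(y) on x
A = np.exp(alpha)                  # alpha = log(A), so A = exp(alpha)
print(f"A = {A:.3f}, beta = {beta:.3f}")
```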

Figure 3.7. Data sets for which linear regression is inappropriate: (a) y increasing more slowly than x; (b) y increasing faster than x.



Two other useful functional forms are y = α + β(1/x) and y = α + β(1/√x). In the first case we define X = 1/x and in the second case we define X = 1/√x. In both cases the equation is linear in the variables after the transformation. It is

y = α + βX

which is in the form of a linear regression equation. An alternative model one can use is

y = Ax^β

In this case, taking logs of both sides, we get

log y = log A + β log x

In this case β can be interpreted as an elasticity. Hence this form is popular in econometric work. It is called a double-log specification since it involves logarithms of both x and y. Now define Y = log y, X = log x, and α = log A. We have

Y = α + βX

which is in the form of a linear regression equation. An illustrative example is given at the end of this section.
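The following sketch (on invented data) fits the double-log form by regressing log y on log x; the slope estimate is the elasticity:

```python
# Sketch with invented data: the double-log form y = A*x**beta is fitted by
# regressing log(y) on log(x); the slope beta is the (constant) elasticity.
import numpy as np

x = np.array([10.0, 20.0, 40.0, 80.0, 160.0])
y = np.array([5.0, 8.2, 13.5, 22.0, 36.0])

Y, X = np.log(y), np.log(x)
beta, alpha = np.polyfit(X, Y, 1)
print(f"elasticity beta = {beta:.3f}, A = {np.exp(alpha):.3f}")
# A 1% increase in x is associated with roughly a beta% increase in y.
```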

There is, of course, a difference between those functional forms in which we transform the variable x and those in which we transform the variable y. This becomes clear when we introduce the error term u. For instance, when we write the transformed equation with an additive error, which we do before we use the least squares method, that is, when we write

Y = α + βX + u

we are assuming that the original equation in terms of the untransformed variables is

y = Ax^β e^u

that is, the error term enters exponentially and in a multiplicative fashion. If we make the assumption of an additive error term in the original equation, that is,

y = Ax^β + u

then there is no way of transforming the variables that would enable us to use the simple methods of estimation described here. Estimation of this model requires nonlinear least squares.
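As an illustration of what nonlinear least squares involves, the sketch below fits y = Ax^β + u directly by minimizing the residual sum of squares, using scipy's curve_fit on simulated data:

```python
# Sketch of the nonlinear-least-squares case: with an additive error,
# y = A*x**beta + u cannot be linearized by taking logs, so A and beta
# are estimated by minimizing the sum of squared residuals directly.
import numpy as np
from scipy.optimize import curve_fit

def power_model(x, A, beta):
    return A * x**beta

rng = np.random.default_rng(0)
x = np.linspace(1.0, 20.0, 50)
y = 2.0 * x**0.7 + rng.normal(scale=0.5, size=x.size)  # additive error

(A_hat, beta_hat), _ = curve_fit(power_model, x, y, p0=(1.0, 1.0))
print(f"A = {A_hat:.3f}, beta = {beta_hat:.3f}")
```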

Some other functional forms that are useful when the data points are as shown in Figure 3.8 are

y = α + β(1/x)

and

log y = α + β(1/x)

If β > 0, the relationship is as shown in Figure 3.8(a). If β < 0, the relationship is as shown in Figure 3.8(b).

Some other nonlinearities can be handled by what are known as "search procedures." For instance, suppose that we have the regression equation

yᵢ = α + β/(xᵢ + γ) + uᵢ

The estimates of α, β, and γ are obtained by minimizing

Σ [yᵢ − α − β/(xᵢ + γ)]²

We can reduce this problem to one of simple least squares as follows: for each value of γ, we define the variable Zᵢ = 1/(xᵢ + γ) and estimate α and β by minimizing

Σ (yᵢ − α − βZᵢ)²

We look at the residual sum of squares in each case and then choose that value of γ for which the residual sum of squares is a minimum. The corresponding estimates of α and β are the least squares estimates of these parameters. Here we are "searching" over different values of γ. This search is convenient only if we have some prior notion of the range of this parameter. In any case, convenient nonlinear regression programs are available nowadays. Our purpose here is to show how some problems that do not at first sight appear to fall within the framework of simple regression can be transformed into that framework by a suitable redefinition of the variables.
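A minimal sketch of this search procedure, on simulated data and with an assumed grid of trial values for γ:

```python
# Sketch of the search procedure: for each trial gamma, define Z_i = 1/(x_i + gamma),
# run a simple OLS of y on Z, and keep the gamma with the smallest residual sum
# of squares. The data and the gamma grid are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(1.0, 10.0, 60)
y = 3.0 + 5.0 / (x + 2.0) + rng.normal(scale=0.1, size=x.size)

best = None
for gamma in np.arange(0.5, 4.01, 0.05):       # assumed prior range for gamma
    Z = 1.0 / (x + gamma)
    beta, alpha = np.polyfit(Z, y, 1)           # OLS of y on Z
    rss = np.sum((y - alpha - beta * Z) ** 2)   # residual sum of squares
    if best is None or rss < best[0]:
        best = (rss, gamma, alpha, beta)

rss, gamma, alpha, beta = best
print(f"gamma = {gamma:.2f}, alpha = {alpha:.3f}, beta = {beta:.3f}")
```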

Figure 3.8. (a) β > 0: y decreases with increasing x; (b) β < 0, α > 0: y increases with increasing x.


