
The special thing about the system of equations (8.9) is the symmetry in the $\beta$ coefficients. We have

$$\frac{\partial p_1}{\partial x_2} = \frac{\partial p_2}{\partial x_1} = \beta_{12} \qquad \frac{\partial p_1}{\partial x_3} = \frac{\partial p_3}{\partial x_1} = \beta_{13} \qquad \frac{\partial p_2}{\partial x_3} = \frac{\partial p_3}{\partial x_2} = \beta_{23}$$

Thus there are cross-equation restrictions on the coefficients. If we assume that $V(u_1) = V(u_2) = V(u_3) = \sigma^2$, we can minimize $\left(\sum u_1^2 + \sum u_2^2 + \sum u_3^2\right)$ to obtain the

normal equations and estimate the regression coefficients. This is the method used by Waugh. It involves working out the necessary algebraic expressions and programming things afresh. Instead, we can use the standard regression programs by using the dummy variable method: we can write equations (8.9) as a single stacked equation with one constructed regressor for each of the 12 distinct parameters $\alpha_1, \alpha_2, \alpha_3, \beta_{11}, \beta_{22}, \beta_{33}, \beta_{12}, \beta_{13}, \beta_{23}, \gamma_1, \gamma_2, \gamma_3$.

We ran this equation with the 45 observations and the 12 dummies (no constant term). The values of the dummy variables are easily generated; for example, the set of observations for the regressor corresponding to $\beta_{12}$ consists of the 15 observations of $x_2$, followed by the 15 observations of $x_1$, and then 15 zeros. The results (with $t$-ratios in parentheses) are

$$\begin{aligned}
\hat{p}_1 &= \underset{(12.00)}{118.98} - \underset{(14.55)}{1.534}\,x_1 - \underset{(4.31)}{0.474}\,x_2 - \underset{(3.01)}{0.445}\,x_3 + \underset{(12.61)}{0.0650}\,y \\[4pt]
\hat{p}_2 &= \underset{(9.18)}{149.79} - \underset{(4.31)}{0.474}\,x_1 - \underset{(6.20)}{1.189}\,x_2 - \underset{(1.54)}{0.319}\,x_3 + \underset{(2.83)}{0.0162}\,y \\[4pt]
\hat{p}_3 &= \underset{(7.38)}{131.06} - \underset{(3.01)}{0.445}\,x_1 - \underset{(1.54)}{0.319}\,x_2 - \underset{(4.32)}{2.389}\,x_3 + \underset{(1.66)}{0.0199}\,y
\end{aligned}$$

One can raise questions about the appropriateness of the specification of the system of demand functions (8.9). Our purpose here has been merely to illustrate the use of dummy variable methods to estimate equations where some parameters in different equations are the same.

The results are almost the same as those obtained by Waugh. Part of the difference could be due to our program's use of double-precision arithmetic.
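To make the dummy-variable construction concrete, here is a minimal sketch in Python with NumPy. The data below are randomly generated stand-ins (Waugh's original series are not reproduced here), and the names x1, x2, x3, inc, p1, p2, p3 are our own; only the column layout, one constructed regressor per distinct parameter, follows the scheme described above (e.g., the $\beta_{12}$ regressor is $x_2$ for the first 15 observations, $x_1$ for the next 15, and zero for the last 15).

```python
import numpy as np

n = 15                                   # observations per equation
rng = np.random.default_rng(0)

# Hypothetical stand-ins for the quantity, income, and price series.
x1, x2, x3, inc = rng.normal(size=(4, n))
p1, p2, p3 = rng.normal(size=(3, n))

z, one = np.zeros(n), np.ones(n)

def stack(a, b, c):
    # One 45-vector regressor: its piece in each of the three equations.
    return np.concatenate([a, b, c])

y = stack(p1, p2, p3)                    # stacked dependent variable

# 12 constructed regressors, one per distinct parameter; no constant term.
X = np.column_stack([
    stack(one, z, z), stack(z, one, z), stack(z, z, one),  # alpha_1..alpha_3
    stack(x1, z, z), stack(z, x2, z), stack(z, z, x3),     # beta_11, beta_22, beta_33
    stack(x2, x1, z),                                      # beta_12 = beta_21
    stack(x3, z, x1),                                      # beta_13 = beta_31
    stack(z, x3, x2),                                      # beta_23 = beta_32
    stack(inc, z, z), stack(z, inc, z), stack(z, z, inc),  # gamma_1..gamma_3
])

coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS on the 45 x 12 system
```

Any standard regression program applied to this stacked system imposes the symmetry restrictions automatically, because each restricted pair of coefficients is represented by a single column.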



8.5 Dummy Variables for Testing Stability of Regression Coefficients

Dummy variables can also be used to test for stability of regression coefficients, as discussed in Section 4.11. The definition of the appropriate dummy variables depends on whether we are using the analysis-of-covariance test or the predictive test for stability. We will first discuss the analysis-of-covariance test. Consider, for instance, the two equations

$$y = \alpha_1 + \beta_1 x + \gamma_1 z + u_1 \quad \text{for the first period}$$

$$y = \alpha_2 + \beta_2 x + \gamma_2 z + u_2 \quad \text{for the second period}$$

We may be interested in testing the hypotheses that none of the coefficients changed between the two time periods, that only the intercepts changed, that only the intercept and the coefficient of the $x$-variable changed, and so on. As discussed in Section 4.11, the analysis-of-covariance test depends on obtaining the unrestricted and restricted residual sums of squares. Both of these residual sums of squares can be obtained from the same dummy variable regression if we define enough dummy variables. For instance, we can write the equations for the two periods as the single equation

$$y = \alpha_1 + (\alpha_2 - \alpha_1)D_1 + \beta_1 x + (\beta_2 - \beta_1)D_2 + \gamma_1 z + (\gamma_2 - \gamma_1)D_3 + u \tag{8.10}$$

Note that we write the equation in terms of differences in the parameters and define the dummy variables accordingly:

$$D_1 = \begin{cases} 1 & \text{for observations in period 2} \\ 0 & \text{for observations in period 1} \end{cases}$$

$$D_2 = \begin{cases} x & \text{(the corresponding value of $x$) for observations in period 2} \\ 0 & \text{for all observations in period 1} \end{cases}$$

$$D_3 = \begin{cases} z & \text{(the corresponding value of $z$) for observations in period 2} \\ 0 & \text{for all observations in period 1} \end{cases}$$

The unrestricted residual sum of squares is the one from estimating (8.10). The restricted residual sum of squares for each hypothesis is obtained by deleting the dummy variables corresponding to that hypothesis:

- All coefficients same ($\alpha_1 = \alpha_2$, $\beta_1 = \beta_2$, $\gamma_1 = \gamma_2$): delete $D_1$, $D_2$, $D_3$.
- Only intercepts change ($\beta_1 = \beta_2$, $\gamma_1 = \gamma_2$): delete $D_2$, $D_3$.
- Only intercepts and the coefficient of $z$ change ($\beta_1 = \beta_2$): delete $D_2$.
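As a concrete sketch of this procedure, the following Python fragment builds $D_1$, $D_2$, $D_3$ as defined above and computes the $F$-test for the first hypothesis in the list. The data, sample sizes, and variable names are made-up stand-ins for illustration, not from the text.

```python
import numpy as np

rng = np.random.default_rng(1)
n1, n2 = 20, 20                         # observations in periods 1 and 2
x, z = rng.normal(size=(2, n1 + n2))
y = rng.normal(size=n1 + n2)            # stand-in dependent variable

period2 = np.arange(n1 + n2) >= n1      # True for period-2 observations
D1 = period2.astype(float)              # intercept shift
D2 = np.where(period2, x, 0.0)          # slope shift for x
D3 = np.where(period2, z, 0.0)          # slope shift for z
const = np.ones(n1 + n2)

def rss(cols):
    """Residual sum of squares from an OLS fit of y on the given columns."""
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta
    return e @ e

rss_u = rss([const, D1, x, D2, z, D3])  # unrestricted: equation (8.10)
rss_r = rss([const, x, z])              # restricted: all coefficients the same
# For the other hypotheses, drop only D2 and D3, or only D2, instead.

q, df = 3, (n1 + n2) - 6                # 3 restrictions; 6 unrestricted params
F = ((rss_r - rss_u) / q) / (rss_u / df)
```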



There are some who argue in favor of estimating equations like (8.10) and checking which of the dummy variables are significant, in preference to the Chow test discussed in Chapter 4.* However, we should be cautious in making inferences about stability and instability of the coefficients by looking at the $t$-ratios of the dummy variables alone. As we pointed out in Section 4.10, it is possible that the $t$-ratios for each of a set of coefficients are all insignificant and yet the $F$-ratio for the entire set of coefficients is significant. What one should do in any particular example is to use the $F$-tests first and then use the $t$-tests on individual dummy variables only if they correspond to economically meaningful hypotheses.

As discussed in Section 4.11, the analysis-of-covariance test cannot be used if $n_2 < k$. In this case the predictive test suggested by Chow is to use

$$F = \frac{(\text{RSS} - \text{RSS}_1)/n_2}{\text{RSS}_1/(n_1 - k - 1)}$$

as an $F$-variate with degrees of freedom $n_2$ and $n_1 - k - 1$. Here RSS is the residual sum of squares from the regression with all $(n_1 + n_2)$ observations and $\text{RSS}_1$ is the residual sum of squares from the regression with the first $n_1$ observations.

The $F$-test, however, does not tell us which of the $n_2$ observations contribute to the instability of the coefficients or are outliers. To do this, we can define a set of $n_2$ dummy variables

$$D_i = \begin{cases} 1 & \text{for observation } n_1 + i \\ 0 & \text{for other observations} \end{cases} \qquad i = 1, 2, \ldots, n_2$$

and test whether the coefficients of the dummy variables are zero. Since one can get the standard error of each of these dummy variable coefficients from the standard regression packages, one can easily check for outliers and see which of the observations lie significantly outside the regression line estimated from the first $n_1$ observations.

The common regression parameters will be estimated from the first $n_1$ observations; the coefficient of the $i$th dummy variable ($i = 1, 2, \ldots, n_2$) will measure the prediction error for observation $n_1 + i$, based on the coefficients estimated from the first $n_1$ observations; and the standard error of this coefficient will measure the standard error of this prediction error.

Consider, for instance,

$$y = \alpha + \beta_1 x_1 + \beta_2 x_2 + u \quad \text{for the first } n_1 \text{ observations}$$

$$y = \alpha + \beta_1 x_1 + \beta_2 x_2 + \gamma_1 + u \quad \text{for the } (n_1 + 1)\text{th observation}$$

$$y = \alpha + \beta_1 x_1 + \beta_2 x_2 + \gamma_2 + u \quad \text{for the } (n_1 + 2)\text{th observation}$$

*D. Gujarati, "Use of Dummy Variables in Testing for Equality Between Sets of Coefficients in Two Linear Regressions: A Note," The American Statistician, February 1970. Gujarati argues that the Chow test might reject the hypothesis of stability but not tell us which particular coefficients are unstable, whereas the dummy variable method gives this information. See also D. S. Salkever, "The Use of Dummy Variables to Compute Predictions, Prediction Errors and Confidence Intervals," Journal of Econometrics, Vol. 4, 1976, pp. 393-397.
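A minimal sketch of this dummy-variable device follows, again with made-up data; the sample sizes n1 = 30 and n2 = 5 and the regressors x1, x2 are illustrative assumptions. Each of the $n_2$ dummies picks out one post-sample observation, so its estimated coefficient is the prediction error for that observation and its standard error is the standard error of that prediction error.

```python
import numpy as np

rng = np.random.default_rng(2)
n1, n2 = 30, 5
x1, x2 = rng.normal(size=(2, n1 + n2))
y = 1.0 + 0.5 * x1 - 0.3 * x2 + rng.normal(scale=0.1, size=n1 + n2)

# One dummy per post-sample observation: D_i = 1 for observation n1 + i.
D = np.zeros((n1 + n2, n2))
for i in range(n2):
    D[n1 + i, i] = 1.0

X = np.column_stack([np.ones(n1 + n2), x1, x2, D])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# With these dummies, the common parameters (const, x1, x2) are fit on the
# first n1 observations only, and beta[3 + i] equals the prediction error
# for observation n1 + i.
e = y - X @ beta
s2 = (e @ e) / (n1 - 3)                 # sigma^2 from the first n1 points
cov = s2 * np.linalg.inv(X.T @ X)
se_gamma = np.sqrt(np.diag(cov)[3:])    # std. errors of the prediction errors
t_ratios = beta[3:] / se_gamma          # test whether each gamma_i is zero
```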


