





that is, the variance of the current error term $u_t$ is higher if the past error term is higher. An unusually high disturbance term in one period results in an increase in uncertainty for the next period. If $\alpha_1 = 0$, there is no ARCH effect and the usual methods of estimation apply. If $\alpha_1 \neq 0$, we have to use more complicated maximum likelihood procedures to estimate the model. A test for the ARCH effect using the least squares residuals proceeds as follows:

1. Regress $y_t$ on $x_t$. Obtain the residuals $\hat{u}_t$.

2. Regress $\hat{u}_t^2$ on $\hat{u}_{t-1}^2$ and test whether the regression coefficient is zero.

A large number of studies, particularly those of speculative prices, have reported significant ARCH effects. However, one has to be careful in interpreting the results because $\hat{u}_{t-1}^2$ might be acting as a proxy for omitted lagged values of $y_t$ and $x_t$ from the equation. (Note that $\hat{u}_{t-1} = y_{t-1} - \hat{\beta} x_{t-1}$.) Thus the ARCH test should be performed after including a sufficient number of lagged values of $y_t$ and $x_t$ in the equation.
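To make the two-step test concrete, here is a minimal sketch in Python with NumPy. It is an illustration, not code from the text: the function name arch_test is invented for the example, and the test of the null $\alpha_1 = 0$ is carried out with Engle's $TR^2$ statistic, a standard way of testing that the coefficient on $\hat{u}_{t-1}^2$ is zero.

```python
import numpy as np

def arch_test(y, x):
    """Two-step test for an ARCH effect using least squares residuals (sketch)."""
    # Step 1: regress y_t on x_t (no constant, as in the text) and form residuals
    beta_hat = (x @ y) / (x @ x)
    u = y - beta_hat * x

    # Step 2: regress u_t^2 on a constant and u_{t-1}^2
    u2, u2_lag = u[1:] ** 2, u[:-1] ** 2
    X = np.column_stack([np.ones_like(u2_lag), u2_lag])
    coef, *_ = np.linalg.lstsq(X, u2, rcond=None)

    # Engle's LM statistic: T * R^2 is asymptotically chi-square(1) if alpha_1 = 0
    r2 = 1.0 - np.sum((u2 - X @ coef) ** 2) / np.sum((u2 - u2.mean()) ** 2)
    return coef[1], u2.size * r2
```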

12.3 Problems with Least Squares Residuals

In our discussion of heteroskedasticity in Chapter 5 and autocorrelation in Chapter 6, we considered only the least squares residuals $\hat{u}_t$ obtained from the least squares regression. The problem with these residuals, as we will demonstrate, is that they are heteroskedastic and autocorrelated even if the true errors have a common variance and are serially independent. This heteroskedasticity and autocorrelation depends on the particular values of the explanatory variables in the sample. That is why the Durbin-Watson test has an inconclusive region. Durbin and Watson obtained lower and upper bounds for their test statistic that are valid irrespective of what values the explanatory variables take. Presumably, the bounds can be improved with a knowledge of the explanatory variables. In fact, we argued that with most economic variables, the upper limit is the one to use. An alternative solution is to construct residuals that have the same properties as the true errors. That is, if the errors have mean zero and constant variance $\sigma^2$ and are serially independent, the residuals should also exhibit these same properties. One such set of residuals is the "recursive residuals," which are discussed in Section 12.4. First we demonstrate the problems with the least squares residuals.

During recent years separate books have been written on just the topic of residuals. This is because an analysis of residuals is very important for all diagnostic tests.

The point is similar to the one made in Section 6.9, where it was argued that sometimes the observed serial correlation in the residuals could be a consequence of misspecified dynamics (i.e., omission of lagged values of $y_t$ and $x_t$ from the equation).

For instance, C. Dubbelman, Disturbances in the Linear Model: Estimation and Hypothesis Testing (The Hague: Martinus Nijhoff, 1978), and R. D. Cook and S. Weisberg, Residuals and Influence in Regression (London: Chapman & Hall, 1982).



The least squares residuals are obtained from the least squares regression. For simplicity of exposition let us assume that there is only one explanatory variable and no constant term, so that the regression equation is

$$y_t = \beta x_t + u_t$$

The least squares residuals are

$$\hat{u}_t = y_t - \hat{\beta} x_t \qquad \text{where } \hat{\beta} = \frac{\sum x_t y_t}{\sum x_t^2}$$

These residuals satisfy the relationship $\sum x_t \hat{u}_t = 0$. (See equation (4.19). We do not have $\sum \hat{u}_t = 0$ because we do not have a constant term in the regression equation.) Defining $S = \sum_j x_j^2$, we can write $\hat{u}_t$ as

$$\hat{u}_t = y_t - \hat{\beta} x_t = y_t - \frac{1}{S}\,(x_1 y_1 + x_2 y_2 + \cdots + x_n y_n)\,x_t$$

Let us define $h_{ij} = x_i x_j / S$; thus $h_{ij} = h_{ji}$. (The quantities $h_{ij}$ play an important role in the analysis of residuals. The matrix of values $h_{ij}$ is called a "hat matrix" because of the relationship $\hat{y}_i = h_{i1} y_1 + h_{i2} y_2 + \cdots + h_{in} y_n$.) Then we get

$$\hat{u}_t = y_t - (h_{t1} y_1 + h_{t2} y_2 + \cdots + h_{tn} y_n) = (1 - h_{tt})\,y_t - \sum_{j \neq t} h_{tj} y_j \tag{12.1}$$

Note that

$$\sum_j h_{tj}^2 = \sum_j \frac{x_t^2 x_j^2}{S^2} = \frac{x_t^2}{S} = h_{tt} \tag{12.2}$$

From (12.1) we get (since the $y_t$ are independent)

$$\operatorname{var}(\hat{u}_t) = (1 - h_{tt})^2 \sigma^2 + \sum_{j \neq t} h_{tj}^2 \sigma^2 = (1 - 2h_{tt} + h_{tt}^2)\,\sigma^2 + \sum_{j \neq t} h_{tj}^2 \sigma^2 = (1 - 2h_{tt})\,\sigma^2 + \sum_j h_{tj}^2 \sigma^2$$

Hence, using (12.2) we have

$$\operatorname{var}(\hat{u}_t) = (1 - h_{tt})\,\sigma^2 \tag{12.3}$$

This shows that the least squares residuals are heteroskedastic. The variance of $\hat{u}_t$ depends on the particular values of the explanatory variable $x_t$. Also, from (12.1) it is easy to see that the least squares residuals are correlated (because of the occurrence of the common $y_j$ in the different $\hat{u}_t$). Thus there are two problems with the least squares residuals:

1. They are correlated.

2. They are heteroskedastic.

This is so even if the true errors $u_t$ are uncorrelated and have a common variance $\sigma^2$. The other residuals we will talk about are designed to solve these problems.
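Both points can be seen numerically. The following sketch, added here as an illustration and not part of the text, simulates the no-constant model with i.i.d. errors of variance $\sigma^2$ and compares the sample variances of the least squares residuals with $(1 - h_{tt})\sigma^2$ from (12.3); the last line checks the covariance between two residuals against $-h_{ij}\sigma^2$, which follows from (12.1) by the same argument. All data values are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # illustrative x values
S = x @ x
h = np.outer(x, x) / S                     # "hat matrix": h_ij = x_i x_j / S
beta, sigma2, reps = 2.0, 1.0, 200_000

# Simulate y_t = beta * x_t + u_t with iid N(0, sigma2) errors, reps times
U = rng.normal(0.0, np.sqrt(sigma2), size=(reps, x.size))
Y = beta * x + U
beta_hat = Y @ x / S                       # OLS slope, one per replication
resid = Y - np.outer(beta_hat, x)          # least squares residuals

print(resid.var(axis=0))                   # approx. (1 - h_tt) * sigma^2: heteroskedastic
print((1 - np.diag(h)) * sigma2)
print(np.cov(resid[:, 0], resid[:, 1])[0, 1], -h[0, 1] * sigma2)  # correlated residuals
```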

12.4 Some Other Types of Residuals

We will discuss four other types of residuals:

1. Predicted residuals.

2. Studentized residuals.

3. BLUS residuals.

4. Recursive residuals.

The predicted and studentized residuals both have the same problems as the least squares residuals. However, some statisticians have found the predicted and studentized residuals useful in choosing between different regression models and in the detection of outliers, respectively. Hence we will discuss them briefly. The BLUS and recursive residuals both have the property that they have mean zero and constant variance $\sigma^2$ and are serially independent. Thus they solve the problems of least squares residuals. However, the BLUS residuals are more difficult to compute and have been found to be less useful than recursive residuals. Hence we will discuss them only briefly.

Predicted Residuals

Suppose that we take sample data of $n$ observations and estimate the regression equation with $(n - 1)$ observations at a time by omitting one observation, and then use this estimated equation to predict the $y$-value for the omitted observation. Let us denote the prediction error by $\hat{u}_t^* = y_t - \hat{y}(t)$. The $\hat{u}_t^*$ are the predicted residuals.

By $\hat{y}(t)$ we mean a prediction of $y_t$ from a regression equation that is estimated from all observations except the $t$th observation. This is in contrast to $\hat{y}_t$, which is the predicted value of $y_t$ from a regression equation that is estimated using all the observations.
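For a concrete illustration (the code and data are additions, not from the text), the predicted residuals for the single-regressor model of Section 12.3 can be computed by dropping each observation in turn. The final check uses the identity $\hat{u}_t^* = \hat{u}_t/(1 - h_{tt})$, a standard result for least squares with the $h_{tt}$ defined in Section 12.3, so the predicted residuals need not be computed by brute force.

```python
import numpy as np

def predicted_residuals(y, x):
    """Leave-one-out prediction errors for the no-constant model y_t = beta * x_t + u_t."""
    n = y.size
    u_star = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        beta_i = (x[keep] @ y[keep]) / (x[keep] @ x[keep])  # slope without observation i
        u_star[i] = y[i] - beta_i * x[i]                    # prediction error for y_i
    return u_star

# Cross-check against the shortcut u_t / (1 - h_tt)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 5.8, 8.4, 9.7])    # illustrative data
S = x @ x
u = y - ((x @ y) / S) * x                  # ordinary least squares residuals
h_tt = x ** 2 / S
print(np.allclose(predicted_residuals(y, x), u / (1 - h_tt)))   # True
```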


