
dence band with the regression analysis looks remarkably similar to a channel on a standard bar chart.

Hochheimer's 1982 study, performed without a channel or confidence band, shows the best selection of the regression interval at 60 to 70 days (the maximum tested was 70 days). Silver had a remarkable 1,615 trades, one each day. Only two markets showed average trade duration greater than 2 days. We can draw the conclusion that successful regression-based systems seek out longer time periods.



Trend Calculations

The purpose of all trend identification methods is to see past the underlying noise in the market, those erratic moves that seem to be meaningless, and find the current direction of prices. Because there may be more than one trend at any one time, caused by short-term events and long-term policy, it is possible to search for the strongest or most dominant trend, or a minor trend that corresponds to your expected time frame. The technique that is used to uncover the right trend depends upon whether any of the trend characteristics are known. Does it have a seasonal or cyclic component, is it based on long-term monetary policy, or is it an overnight effect? The more you know about the reasons why prices trend, the better you will be able to find the most reliable calculation for separating the price direction from market noise.

Chapter 3, "Regression Analysis," produced forecasts by finding relationships between two or more price series. Once you know that there is a fundamental relationship between data, based on measuring the properties of dependence and correlation, a formula can be found that expresses one price movement in terms of the other prices and data. The predictive qualities of these methods are best when applied to data that has been seen before, that is, prices that are within the range of historic data. Forecasting reliability decreases sharply when values are based on extrapolation outside the previous occurrences. This phenomenon will also be true of other trending methods. Because of the way we test and define the final trend calculation, it is based on the movement of historical data: when prices move to new levels, the results of the model will often deteriorate.

FORECASTING AND FOLLOWING

There is a clear distinction between forecasting the trend and determining the current trend. Forecasting, predicting the future price, is much more desirable but complex. As shown in the previous chapter, it may involve combining pieces of data and result in a forecasted price with a confidence level. The further you try to forecast into the future, the lower the confidence.

The techniques most commonly used for evaluating the direction or tendency of prices, both within prior ranges and at new levels, are called autoregressive functions. Unlike forecasting models, they are only concerned with evaluating the current price direction. This analysis is normally used to conclude that prices are moving in an upward, downward, or sideways direction. From this simple building block, it will be possible to form rules of action and develop complex strategies of anticipation. From a practical viewpoint, these trending methods are more flexible than the traditional regression models, but to achieve success they introduce a lag. A great effort has been spent trying to reduce this lag.

In an autoregressive model, one or more of the previous days' prices determine the next sequential price. If P(t) represents today's price, P(t-1) yesterday's, and so on, then tomorrow's expected price will be

P(t+1) = a0 + a1*P(t) + a2*P(t-1) + ... + an*P(t-n+1) + e



where each price is given a corresponding weighting ai and combined to give the resultant price for tomorrow, plus an error factor e. The simplest example is the use of yesterday's price alone to generate tomorrow's price:

P(t+1) = a0 + a1*P(t) + e

which you may also recognize as the formula for a straight line.
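As an illustrative sketch, not part of the original text, the coefficients of such a linear autoregressive model can be found with an ordinary least-squares fit. The function name and the use of NumPy are my own choices, assuming the model form P(t+1) = a0 + a1*P(t) + ... + an*P(t-n+1) + e:

```python
import numpy as np

def fit_autoregressive(prices, n):
    """Fit P(t+1) = a0 + a1*P(t) + ... + an*P(t-n+1) + e by least
    squares; return the coefficients a0..an and the one-step forecast
    made from the last n prices in the series."""
    prices = np.asarray(prices, dtype=float)
    # Each row holds the n prices that precede one target price,
    # most recent first, matching the a1..an ordering above.
    rows = [prices[i:i + n][::-1] for i in range(len(prices) - n)]
    X = np.column_stack([np.ones(len(rows)), np.array(rows)])
    y = prices[n:]                        # the prices being predicted
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    forecast = coeffs[0] + coeffs[1:] @ prices[-n:][::-1]
    return coeffs, forecast
```

On a perfectly linear series the 1-day fit recovers the straight-line relationship exactly; on real prices the error term e absorbs the noise.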

The autoregressive model does not have to be linear; each prior day can have a nonlinear predictive quality. Thus each expected price P(t+1) could be represented by a curvilinear expression, P(t+1) = a0 + a1*P(t) + a2*P(t)^2 + e, or by an exponential or logarithmic formula, ln P(t+1) = a0 + a1*ln P(t) + a2*ln P(t-1) + e, which is commonly used in equity analysis. All of these expressions could then be combined to form an autoregressive forecasting model for P(t+1). In going from the simple to the complex, it is natural to want to know which of these choices will perform best. The answer can only be found by application to a specific problem, validation, and experience. Various methods, based on a sound premise, must be attempted and applied to actual data in real-time or extrapolated situations to determine the predictive qualities of a model. Chapters 5 and 15 will discuss which choices have been most popular, and Chapter 21 will show testing methods that are most likely to lead to robust results.

LEAST-SQUARES MODEL

The least-squares regression model is the same technique that was used in the previous chapter to find the relationship between two dependent markets, corn and soybeans, or to find how prices moved when driven by known related factors such as supply and demand. Here, the least-squares model will be used to find the relationship between time and price, rather than between two prices, where the price forecast that we are seeking is dependent upon time. The regression model will also be applied in an autoregressive way by recalculating the expected price daily and using the slope of the resulting straight line or curvilinear fit to determine the direction of the trend.
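A minimal sketch of that idea, with function names of my own choosing rather than from the text: fit a least-squares line to the most recent prices against time, recalculate it daily, and read the trend from the sign of the slope.

```python
import numpy as np

def regression_slope(prices, interval):
    """Slope of a straight line fit by least squares to the most
    recent `interval` prices plotted against time."""
    y = np.asarray(prices[-interval:], dtype=float)
    x = np.arange(interval, dtype=float)
    slope, _intercept = np.polyfit(x, y, 1)
    return slope

def trend_direction(prices, interval):
    """Recalculated each day, the sign of the slope gives the trend."""
    s = regression_slope(prices, interval)
    return "up" if s > 0 else ("down" if s < 0 else "sideways")
```

A curvilinear fit would substitute a higher polynomial degree in `np.polyfit` and use the slope of the curve at the most recent point.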

A simple error analysis can be used to evaluate the predictive qualities of this method. Assume that there is a lengthy price series for a market and that we would like to know how many prior days are optimum for predicting the next day's price. The answer is found by looking at the average error in the predictions. If the number of days in the calculation increases and the predictive error decreases, the answer is improving; if the error stops decreasing, the accuracy limit has been reached. Error analysis can improve most trend calculations, and it is also covered in the section on exponential smoothing later in this chapter. As an example, start by using only one prior day to find the price forecast

P(t+1) = a0 + a1*P(t)

and work up to a large number of days:

P(t+1) = a0 + a1*P(t) + a2*P(t-1) + ... + an*P(t-n+1)

In the last case, it takes n days of prior prices to generate each new prediction. In all cases, n + 1 equations can be written to solve for the n + 1 coefficients a0, a1, ..., an, using the matrix elimination method found in Appendix 3. The result is a predicted price for each actual price P. We will use the notation P(t, n) to mean the predicted price for day t using an n-day linear regression; therefore P(25, 3) = 58.00 means that the predicted value of P on day 25 was 58.00, using a 3-day linear regression analysis (a straight-line fit of the 3 prior days). The error occurring in each prediction is defined as the difference between the actual and predicted values for that day using the n-day linear regression. As an example of
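The error measurement just described can be sketched in code. This is my own construction, assuming a straight line fit to the n prior days, extrapolated one day ahead, and the average absolute error as the accuracy measure:

```python
import numpy as np

def avg_prediction_error(prices, n):
    """Average absolute 1-day-ahead error when each prediction
    extends a least-squares line fit to the n prior prices (n >= 2)."""
    prices = np.asarray(prices, dtype=float)
    x = np.arange(n, dtype=float)         # time axis for each window
    errors = []
    for t in range(n, len(prices)):
        slope, intercept = np.polyfit(x, prices[t - n:t], 1)
        predicted = intercept + slope * n  # extrapolate one step ahead
        errors.append(abs(prices[t] - predicted))
    return float(np.mean(errors))
```

Computing this for increasing n and stopping when the error no longer decreases locates the accuracy limit described above.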


