


Smoothed Moving Averages

Figure 4-7 relates the fully calculated exponential smoothing (within the range 0 to 1) to the standard moving average. Find the smoothing constant on the left and the number of days in a standard moving average along the bottom. Observe that, if you perform an optimization with equally spaced exponential smoothing constants, there is an unexpected distribution relative to past days.

Equating Standard Moving Averages to Exponential Smoothing

Smoothing constant (%)       .10   .20   .30   .40   .50   .60   .70   .80   .90
Standard (n-day average)      20    10     6     4     3   2.25  1.75  1.40  1.15

If the equally distributed smoothing constants shown above are used for testing, half of the tests will analyze moving averages of 3 days or less. If the testing program is to relate to the standard moving average, reverse the process, finding the smoothing constant for a known



FIGURE 4-7 Evaluation of exponential smoothing. (horizontal axis: days)

number of days. This process will be important when finding robust system parameters and will be discussed in Chapter 21 ("Testing and Optimization").
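The conversion described above can be sketched numerically. This is a minimal example, assuming the common approximation n = 2/s - 1 relating a smoothing constant s to an equivalent n-day standard moving average; the table's values are rounded readings from Figure 4-7, so they differ slightly at small constants:

```python
# Convert between an exponential smoothing constant s and the
# approximately equivalent number of days n in a standard moving
# average, using n = 2/s - 1 (an assumption; the book's table is
# read from Figure 4-7 and rounded).

def equivalent_days(s):
    """Equivalent standard moving average length for smoothing constant s."""
    return 2.0 / s - 1.0

def smoothing_constant(n):
    """Smoothing constant approximately equivalent to an n-day average."""
    return 2.0 / (n + 1.0)

# Equally spaced constants cluster at very short equivalent averages:
for s in [0.10, 0.30, 0.50, 0.70, 0.90]:
    print(f"s = {s:.2f}  ->  about {equivalent_days(s):.2f} days")
```

Running the loop makes the text's point concrete: constants from .50 to .90, half of an equally spaced test set, all map to averages of about 3 days or less.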



Equating Exponential Smoothing to Standard Moving Averages

Standard (n-day average)
Smoothing constant (%)       .65   .40   .30   .235   .165   .14   .125

The distribution of smoothing constants is very close to logarithmic and is plotted on a log scale in Figure 4-6. To test exponential smoothing within a range of days, use a logarithmic distribution of smoothing constants across that range, with closer values taken at the smaller numbers, if necessary. This may seem an unnecessary precaution, but it is
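The logarithmic spacing suggested above can be sketched as follows; the function name and endpoints are illustrative, not from the source:

```python
# Sketch: generate smoothing constants spaced evenly on a log scale
# between s_min and s_max, so that tests are distributed evenly in
# terms of equivalent moving average days rather than bunched at the
# shortest averages.

def log_spaced_constants(s_min, s_max, count):
    """Return `count` smoothing constants with a fixed ratio between steps."""
    ratio = (s_max / s_min) ** (1.0 / (count - 1))
    return [s_min * ratio ** i for i in range(count)]

constants = log_spaced_constants(0.10, 0.90, 9)
print([round(s, 3) for s in constants])
```

Because consecutive constants share a fixed ratio, the test values crowd toward the small-constant (long-average) end, matching the near-logarithmic distribution seen in the tables.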

TABLE 4-8 Comparison of Smoothing Constants (columns: Moving Average, 1st-Order, 2nd-Order)

Best Approximation of Smoothing Constants

The standard conversion from the number of days to a smoothing constant was given by Hutson* as

s = 2 / (n + 1)

where n is the equivalent number of days in the standard (linearly weighted) moving average. In addition, 2nd- or 3rd-order exponential smoothing, based on the weighting of the past 2 or 3 days' prices, may be desirable. This is the exponential equivalent to step weighting. Its general form is

E(p)_t = E(p)_t-1 + s × (E(p-1)_t - E(p)_t-1)

where E(p) is the pth-order smoothed series and E(0)_t is the price itself.

A comparison of the standard moving average days with 1st-, 2nd-, and 3rd-order exponential smoothing is shown in Table 4-8.
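A minimal sketch of the ideas above, assuming the conversion s = 2/(n + 1) and assuming that each higher order of smoothing simply re-smooths the output of the order below it:

```python
# 1st-, 2nd-, and 3rd-order exponential smoothing, where order p
# applies the basic recursion E_t = E_{t-1} + s*(input_t - E_{t-1})
# to the output of order p-1 (an assumed definition of "order").

def exponential_smoothing(prices, s, order=1):
    """Apply `order` successive passes of exponential smoothing."""
    series = prices
    for _ in range(order):
        smoothed = [series[0]]          # seed each pass with its first value
        for p in series[1:]:
            smoothed.append(smoothed[-1] + s * (p - smoothed[-1]))
        series = smoothed
    return series

prices = [100, 102, 101, 104, 107, 105, 108]
s = 2.0 / (5 + 1)                       # 5-day equivalent constant
for order in (1, 2, 3):
    print(order, [round(v, 2) for v in exponential_smoothing(prices, s, order)])
```

Printing the three orders side by side shows the progressive lag that a comparison table such as Table 4-8 summarizes: each additional order smooths more and reacts later.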

Estimating Residual Impact

The primary difference between the standard moving average and exponential smoothing is that prices impact the exponentially smoothed value indefinitely. For practical purposes, the effect of the oldest data may be limited. A general method of approximating the smoothing constant s for a given level of residual impact is given by

s = 1 - RI^(1/n)

where n is the number of days and RI is the level of residual impact; because a price n days old carries the weight (1 - s)^n, setting (1 - s)^n = RI and solving for s gives this expression.
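A short numeric check of this formula, assuming (as above) that the residual weight of a price n days old under smoothing constant s is (1 - s)^n:

```python
# Solve for the smoothing constant s that leaves a chosen residual
# impact RI after n days, using s = 1 - RI**(1/n).

def constant_for_residual(ri, n):
    """Smoothing constant so that a price n days old retains weight ri."""
    return 1.0 - ri ** (1.0 / n)

# Example: limit the influence of 20-day-old prices to 0.1%.
s = constant_for_residual(0.001, 20)
print(round(s, 4))
print(round((1 - s) ** 20, 6))   # recovers the requested residual impact
```

The second print line confirms the algebra: substituting the computed s back into (1 - s)^n returns the requested RI.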

*Jack K. Hutson, "Filter Price Data: Moving Averages vs. Exponential Moving Averages," Technical Analysis of Stocks & Commodities (May/June 1984).


