
applied simple and (occasionally) iterated EMA operators to homogeneous time series for a long time. Iterated EMA operators for inhomogeneous time series were first explored by Müller (1991) and systematically developed and discussed by Zumbach and Müller (2001). A simple recursive definition is

EMA[τ, n; z] = EMA[τ; EMA[τ, n − 1; z]]        (3.53)

with EMA[τ, 1; z] = EMA[τ; z]. This definition can be evaluated efficiently by using the iterative formula in Equation 3.51 for all its basic EMAs. There is one subtle point related to the choice of the interpolation scheme in Equation 3.52. The EMA of z necessarily has an interpolation behavior different from that of z itself: the correct form of EMA[τ; z] between two points is no longer a straight line but a nonlinear (exponential) curve. (Theoretically, it is straightforward to derive the corresponding exact interpolation formula.) When using one of the interpolation schemes of Equation 3.52 after the first iteration, we therefore make a small error. Yet if the kernel is wide compared to t_n − t_{n−1}, this error is indeed very small. As a suitable approximation, we recommend using linear interpolation in the second and all further EMA iterations, even if the first iteration was based on the next-point interpolation. The only exception occurs if z_n is not yet known; then we need a causal operator based on the previous-point interpolation. The kernel of EMA[τ, n] is

ema[τ, n](t) = (1/(n − 1)!) (t/τ)^(n−1) e^(−t/τ) / τ        (3.54)

This family of functions is related to the Laguerre polynomials, which are orthogonal with respect to the measure e^(−t) (for τ = 1). Through an expansion in Laguerre polynomials, any kernel can be expressed as a sum of iterated EMA kernels. Therefore, the convolution with an arbitrary kernel can be evaluated by iterated exponential moving averages. Yet the convergence of this expansion may be slow; that is, high-order iterated EMAs may be necessary, possibly with very large coefficients. This typically happens if one tries to construct operators whose decay is other than (in particular, faster than) exponential. Therefore, in practice, we construct operators empirically from a few low-order EMAs, in a way that minimizes the build-up time. The set of operators provided in Section 3.3 covers a wide range of computations needed in finance. The range, width, and aspect ratio of the iterated EMA are

R = nτ,        σ² = n(n + 1) τ²,        AR = √((n + 1)/n)        (3.55)
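To make the recursion concrete, here is a minimal Python sketch of EMA[τ; z] on an irregularly spaced series and of EMA[τ, n; z] obtained by iterating it. It assumes the standard update formula referred to above as Equation 3.51, EMA(t_n) = μ·EMA(t_{n−1}) + (ν − μ)·z_{n−1} + (1 − ν)·z_n with μ = e^(−α) and α = (t_n − t_{n−1})/τ, and the interpolation choices of Equation 3.52 (previous point ν = 1, linear ν = (1 − μ)/α, next point ν = μ). Those equations are not reproduced on this page, so the coefficients should be checked against them; the function names are illustrative, not the book's.

```python
import math

def ema_inhomogeneous(times, values, tau, interp="linear"):
    """EMA[tau; z] for an inhomogeneous (irregularly spaced) series.

    Update (assumed form of Equation 3.51):
        EMA_n = mu*EMA_{n-1} + (nu - mu)*z_{n-1} + (1 - nu)*z_n,
    with mu = exp(-alpha), alpha = (t_n - t_{n-1})/tau, and nu set by the
    interpolation scheme (assumed form of Equation 3.52).
    """
    ema = values[0]              # initialize at the first observation
    out = [ema]
    for i in range(1, len(times)):
        alpha = (times[i] - times[i - 1]) / tau
        mu = math.exp(-alpha)
        if interp == "previous":     # causal: use the last known value
            nu = 1.0
        elif interp == "next":       # next-point interpolation
            nu = mu
        else:                        # linear interpolation
            nu = (1.0 - mu) / alpha if alpha > 0 else 1.0
        ema = mu * ema + (nu - mu) * values[i - 1] + (1.0 - nu) * values[i]
        out.append(ema)
    return out

def ema_iterated(times, values, tau, n, interp="linear"):
    """EMA[tau, n; z] via the recursion (3.53): n nested EMA[tau] passes.

    Following the text, linear interpolation is used for the second and
    all further iterations, whatever scheme the first pass used.
    """
    result = values
    for k in range(n):
        scheme = interp if k == 0 else "linear"
        result = ema_inhomogeneous(times, result, tau, scheme)
    return result
```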

The iterated EMA[τ, n] operators with large n have a shorter, more compact kernel and require a shorter build-up time interval than a simple EMA of the same range nτ.



FIGURE 3.5  ma[τ, n](t) for n = 1, 2, 4, 8, and 16, for τ = 1.

This is indicated by the aspect ratio AR, which decreases toward 1 for large n. Each basic EMA operator that is part of the iterated EMA has a range τ, which is much shorter than the range nτ of the full kernel. Even if the tail of the kernel is still exponential, it decays more quickly due to the small basic EMA range τ.
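As a quick numerical check of Equation 3.55, the sketch below integrates the kernel of Equation 3.54 and compares the first moment with nτ and the ratio √(∫ t² ema[τ, n](t) dt)/R with √((n + 1)/n); reading the aspect ratio as this ratio of moments is an assumption, chosen because it reproduces the values quoted in Equations 3.55 and 3.58. The helper names are illustrative.

```python
import math

def ema_kernel(t, tau, n):
    """Kernel of the iterated EMA, Equation 3.54 (a gamma-shaped density)."""
    return (t / tau) ** (n - 1) * math.exp(-t / tau) / (math.factorial(n - 1) * tau)

def kernel_moments(tau, n, t_max=60.0, dt=1e-3):
    """Riemann-sum approximation of the 0th, 1st, and 2nd kernel moments."""
    m0 = m1 = m2 = 0.0
    t = 0.5 * dt
    while t < t_max:
        w = ema_kernel(t, tau, n) * dt
        m0 += w
        m1 += t * w
        m2 += t * t * w
        t += dt
    return m0, m1, m2

tau, n = 1.0, 4
m0, m1, m2 = kernel_moments(tau, n)
print("normalization      :", round(m0, 4))                        # ~ 1
print("range R            :", round(m1, 4), "vs n*tau =", n * tau)
print("AR = sqrt(<t^2>)/R :", round(math.sqrt(m2) / m1, 4),
      "vs", round(math.sqrt((n + 1) / n), 4))
```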

To further extend our computational toolbox, we build another type of compact kernel by combining iterated EMAs, as shown in the next section. Like the iterated EMAs, these combined iterated EMAs have a shorter build-up time interval than a simple EMA of the same range.

3.3.7 Moving Average (MA)

A very convenient moving average is provided by

MA[τ, n; z] = (1/n) Σ_{k=1}^{n} EMA[τ′, k; z],        with τ′ = 2τ/(n + 1)        (3.56)

The parameter τ′ is chosen such that the range of MA[τ, n] is R = τ, independently of n. This provides a family of more rectangular-shaped kernels, with the relative weight of the distant past controlled by n. Kernels for different values of n and τ = 1 are shown in Figure 3.5. Their analytical form is given by

ma[τ, n](t) = (e^(−t/τ′)/(n τ′)) Σ_{k=0}^{n−1} (t/τ′)^k / k!        (3.57)
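A direct transcription of Equation 3.56, reusing the ema_inhomogeneous sketch given after Equation 3.55 (both helper names are this commentary's, not the book's): the MA is the average of the first n iterated EMAs with the shortened range τ′ = 2τ/(n + 1). Because EMA[τ′, k; z] is one more EMA pass applied to EMA[τ′, k − 1; z], the n iterates can be accumulated in a single sweep.

```python
def ma_inhomogeneous(times, values, tau, n, interp="linear"):
    """MA[tau, n; z] as in Equation 3.56: average of n iterated EMAs.

    MA[tau, n; z] = (1/n) * sum_{k=1..n} EMA[tau', k; z],
    with tau' = 2*tau/(n + 1), so that the range is tau for every n.
    """
    tau_prime = 2.0 * tau / (n + 1)
    acc = [0.0] * len(values)
    current = values
    for k in range(1, n + 1):
        # each pass turns EMA[tau', k-1; z] into EMA[tau', k; z];
        # linear interpolation is used after the first pass, as recommended.
        scheme = interp if k == 1 else "linear"
        current = ema_inhomogeneous(times, current, tau_prime, scheme)
        acc = [a + c for a, c in zip(acc, current)]
    return [a / n for a in acc]
```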



FIGURE 3.6  ma[τ, n](t) for n = 1, 2, 4, 8, and 16, for τ = 1, on a logarithmic scale.

For n → ∞, the sum corresponds to the Taylor expansion of exp(t/τ′), which cancels the term exp(−t/τ′), making the kernel constant. For finite n, when t/τ′ is small enough, the finite sum is a good approximation of exp(t/τ′). "Small enough" means that the largest term in the sum is of order one: (t/τ′)^n / n! ≈ 1. For large n, this condition corresponds to t ≈ 2τ (using Stirling's approximation n! ≈ n^n). Therefore, for t ≪ 2τ, the series approximates well the Taylor expansion of the exponential, and

ma[τ, n](t) ≈ 1/(n τ′) = (n + 1)/(2nτ) ≈ 1/(2τ)

This explains the constant behavior of the kernel for t < 2τ. For t > 2τ, the exponential always dominates and the kernel decays to zero. Therefore, for large n, this operator tends to a rectangular moving average, for which AR = 2/√3. For values of n ≈ 5 and higher, the kernel is more rectangular-like than EMA-like; this can be seen in Figure 3.5. These rectangular-like kernels are preferred to the rectangular kernel itself because they fade smoothly rather than abruptly. Abrupt "forgetting" of past events leads to superfluous noise in the operator results.
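To see the plateau and the cutoff numerically, one can evaluate the kernel of Equation 3.57 directly; the short sketch below (relying on that formula as reconstructed above, with illustrative function names) prints ma[τ, n](t) against the plateau level 1/(nτ′) = (n + 1)/(2nτ) for a few values of t around 2τ.

```python
import math

def ma_kernel(t, tau, n):
    """Kernel ma[tau, n](t) of Equation 3.57 (as reconstructed above)."""
    tau_p = 2.0 * tau / (n + 1)
    s = sum((t / tau_p) ** k / math.factorial(k) for k in range(n))
    return math.exp(-t / tau_p) * s / (n * tau_p)

tau, n = 1.0, 8
plateau = (n + 1) / (2.0 * n * tau)      # constant level for t << 2*tau
for t in (0.1, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0):
    print(f"t = {t:3.1f}   ma = {ma_kernel(t, tau, n):.4f}   plateau = {plateau:.4f}")
```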

The decay of MA kernels is also shown in Figure 3.6. The aspect ratio of the MA operator is

AR = √(4(n + 2) / (3(n + 1)))        (3.58)

Clearly, the larger n, the shorter the build-up.


