FIGURE 3.13 The standardized return, moving skewness, and moving kurtosis (two panels; date axis from 26.10 to 2.11). The returns are computed as r = \Delta[\tau = 15\,\text{min}; x] and standardized with \tau_1 = \tau_2 = 24 hr.

Two definitions of a moving correlation of the time series y and z are

    MCorrelation_1[\tau; y, z] = \frac{MA[(y - MA[y])(z - MA[z])]}{MSD[y]\, MSD[z]}    (3.71)

    MCorrelation_2[\tau; y, z] = MA\!\left[ \frac{(y - MA[y])(z - MA[z])}{MSD[y]\, MSD[z]} \right]    (3.72)

where all of the MA and MSD⁹ operators on the right-hand sides are taken with the same decay constant \tau. These definitions are not equivalent, because the MSD operators in the denominator are time series that do not commute with the MA operators. Yet both definitions have their respective advantages. The first definition obeys the inequality -1 \le MCorrelation_1 \le 1. This can be proven by noting that MA[z^2](t) for a given t provides a norm on the space of (finite) time series up to t; this holds because the MA operator has a strictly positive kernel that acts as a metric on the space of time series.
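The difference between the two definitions can be illustrated numerically. The sketch below is ours, not the book's: it assumes a simple discrete EMA implementation of the MA operator, takes MSD[\tau; z] as the square root of the MA of the squared fluctuations, and puts a small floor on the denominators to avoid division by zero at the start of the series.

```python
import numpy as np

def ema(x, tau):
    """Discrete exponential moving average with decay constant tau (in samples)."""
    mu = np.exp(-1.0 / tau)
    out = np.empty(len(x))
    out[0] = x[0]
    for t in range(1, len(x)):
        out[t] = mu * out[t - 1] + (1.0 - mu) * x[t]
    return out

def msd(x, tau):
    """Moving standard deviation: sqrt of the MA of the squared fluctuations."""
    return np.sqrt(ema((x - ema(x, tau)) ** 2, tau))

def mcorrelation1(y, z, tau):
    """Eq. 3.71 style: the MSD normalization is applied *after* the moving average."""
    num = ema((y - ema(y, tau)) * (z - ema(z, tau)), tau)
    return num / np.maximum(msd(y, tau) * msd(z, tau), 1e-12)

def mcorrelation2(y, z, tau):
    """Eq. 3.72 style: the fluctuations are standardized *before* the moving average."""
    u = (y - ema(y, tau)) / np.maximum(msd(y, tau), 1e-12)
    v = (z - ema(z, tau)) / np.maximum(msd(z, tau), 1e-12)
    return ema(u * v, tau)

rng = np.random.default_rng(0)
y = rng.standard_normal(5000)
z = 0.6 * y + 0.8 * rng.standard_normal(5000)  # theoretical correlation 0.6
c1 = mcorrelation1(y, z, tau=200.0)
c2 = mcorrelation2(y, z, tau=200.0)
```

By the Cauchy-Schwarz argument in the text, c1 stays within [-1, 1] pointwise (the numerator and denominator are built from the same positive-kernel average), while c2 carries no such bound; the two series track each other around 0.6 but are not identical.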
In this space, the triangle inequality holds,

    \sqrt{MA[(y + z)^2]} \le \sqrt{MA[y^2]} + \sqrt{MA[z^2]},

and, by a standard argument, the inequality on the correlation follows. With the second definition (Equation 3.72), the correlation matrix is bilinear for the standardized time series. Therefore, the rotation that diagonalizes the correlation matrix acts linearly in the space of standardized time series. This property is necessary for multivariate analysis, when a principal component decomposition is used. In risk management, the correlation of two time series of returns, y and z, is usually computed without subtracting the sample means of y and z. This implies a variation of Equations 3.71 and 3.72:

    MCorrelation_1[\tau; y, z] = \frac{MA[y z]}{MNorm[y]\, MNorm[z]}    (3.73)

    MCorrelation_2[\tau; y, z] = MA\!\left[ \frac{y z}{MNorm[y]\, MNorm[z]} \right]    (3.74)

where again the same \tau is chosen for all MA operators. In general, any reasonable definition of a moving correlation must obey

    \lim_{\tau \to \infty} MCorrelation[\tau; y, z] = \rho[y, z]    (3.75)

where \rho[y, z] is the theoretical correlation of the two stationary processes y and z. Generalizing the definition (Equation 3.72), the requirements for the correlation kernel are to construct a causal, time translation invariant operator that is linear in y and z. This leads to the most general representation

    MCorrelation[y, z](t) = \int_0^\infty \int_0^\infty dt' \, dt'' \; c(t', t'') \, y(t - t') \, z(t - t'')    (3.76)

⁹ See Equation 3.60.

We also require symmetry between the arguments, MCorrelation[z, y] = MCorrelation[y, z]. Moreover, the correlation must be a generalized average, namely MCorrelation[Const, Const] = Const \cdot Const, or, formulated for the kernel, \int_0^\infty \int_0^\infty dt' \, dt'' \; c(t', t'') = 1. There is a large choice of possible kernels that obey these requirements. For example, Equation 3.72 is equivalent to the kernel c(t', t'') = \delta(t' - t'') \, ma(t').

3.3.14 Windowed Fourier Transform

In order to study a time series and its volatility at different time scales, we want to have a tool similar to a wavelet transform,¹⁰ which adapts to causal signals.
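Two of the claims above can be checked numerically with a toy discrete EMA standing in for the MA operator (the helper names and parameter choices here are ours): the bilinearity of the Equation 3.72-style definition for already-standardized series, and the large-\tau limit of Equation 3.75, using MNorm[\tau; z] = \sqrt{MA[\tau; z^2]} as in the risk-management variant.

```python
import numpy as np

def ema(x, tau):
    """Discrete exponential moving average (decay constant tau, in samples)."""
    mu = np.exp(-1.0 / tau)
    out = np.empty(len(x))
    out[0] = x[0]
    for t in range(1, len(x)):
        out[t] = mu * out[t - 1] + (1.0 - mu) * x[t]
    return out

def mnorm(x, tau):
    """MNorm[tau; z] = sqrt(MA[tau; z^2]): no subtraction of the sample mean."""
    return np.sqrt(np.maximum(ema(x * x, tau), 1e-24))

def mcorrelation2_rm(y, z, tau):
    """Eq. 3.74: moving average of the normalized product."""
    return ema(y * z / (mnorm(y, tau) * mnorm(z, tau)), tau)

rng = np.random.default_rng(1)
n = 4000
u, v, w = rng.standard_normal((3, n))

# Bilinearity: for already-standardized inputs the Eq. 3.72-style correlation
# reduces to MA[u * w], and the MA operator is linear in its argument.
a, b = 0.3, -1.7
lhs = ema((a * u + b * v) * w, 100.0)
rhs = a * ema(u * w, 100.0) + b * ema(v * w, 100.0)

# Eq. 3.75: with tau large, the moving correlation approaches the
# theoretical correlation (0.5 by construction below).
y = rng.standard_normal(n)
z = 0.5 * y + np.sqrt(0.75) * rng.standard_normal(n)
c = mcorrelation2_rm(y, z, tau=500.0)
```

The exact equality of lhs and rhs (up to float rounding) is what makes a principal component decomposition of the Equation 3.72-style correlation matrix act linearly on standardized series.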
The motivation is to reveal structures of price movements related to certain frequencies. Similar to wavelet transforms, we want a double representation in time and frequency, but we do not require an invertible transformation, because our aim is to analyze rather than further process the signal. This gives us more flexibility in the choice of the transformations. A simple causal kernel with such properties is ma[\tau](t) \sin(k t / \tau), where ma[\tau](t) is still the MA kernel of Equation 3.57. Essentially, the sine part is (locally) analyzing the signal at a frequency k/\tau and the MA part is taking a causal window of range \tau. As we want a couple of oscillations in the window 2\tau, we choose k between \sim\pi and \sim 5\pi. Larger values of k increase the frequency resolution at the cost of the time resolution. The basic idea is to compute an EMA with a complex \tau; this is equivalent to including a sine and a cosine part in the kernel. The nice computational iterative property of the moving average is preserved. The first step is to study complex iterated EMAs. The kernel of the complex EMA is defined as

    ema[\zeta](t) = \frac{e^{\zeta t / \tau}}{\tau}, \qquad \text{where } \zeta = -(1 + i k)    (3.77)

where \zeta is complex but \tau is again a real number. The choice of the normalization factor 1/\tau is somewhat arbitrary (a factor -\zeta/\tau would produce the same normalization for the real case k = 0) but leads to a convenient definition of the windowed Fourier kernel that follows. By using the convolution formula, one can prove iteratively that the kernel of the complex EMA[\zeta, n] is given by

    ema[\zeta, n](t) = \frac{1}{(n-1)!} \left( \frac{t}{\tau} \right)^{n-1} \frac{e^{\zeta t / \tau}}{\tau}    (3.78)

which is analogous to Equation 3.54. The normalization is such that, for a constant function c(t) = c,

    EMA[\zeta, n; c] = \frac{c}{(1 + i k)^n}    (3.79)

¹⁰ Wavelet methods are studied extensively in Gencay et al.
(2001b).
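The iterative property referred to above carries over to the complex case essentially unchanged: one recursion per EMA, with a complex decay factor e^{\zeta/\tau} per time step. The sketch below is our own discretization (one sample per unit time; the helper names are ours): it checks the constant-function normalization of Equation 3.79 and illustrates the frequency selectivity of the kernel around k/\tau.

```python
import numpy as np

def complex_ema(x, tau, k):
    """One complex EMA pass: discrete convolution with e^(zeta t/tau)/tau,
    zeta = -(1 + i k), via the recursion acc <- e^(zeta/tau) * acc + x_t / tau."""
    zeta = -(1.0 + 1j * k)
    mu = np.exp(zeta / tau)
    out = np.empty(len(x), dtype=complex)
    acc = 0j
    for t in range(len(x)):
        acc = mu * acc + x[t] / tau
        out[t] = acc
    return out

def complex_ema_n(x, tau, k, n):
    """Iterated complex EMA[zeta, n]: apply the basic operator n times."""
    out = np.asarray(x, dtype=complex)
    for _ in range(n):
        out = complex_ema(out, tau, k)
    return out

tau, k = 200.0, np.pi

# Normalization (Eq. 3.79): a constant input c converges to c / (1 + i k)^n,
# up to a small discretization error of order |zeta| / (2 tau) per pass.
const = complex_ema_n(np.ones(3000), tau, k, n=2)
expected = 1.0 / (1.0 + 1j * k) ** 2

# Frequency selectivity: the kernel oscillates at frequency k/tau, so a
# matched sinusoid produces a much larger response than a mismatched one.
t = np.arange(4000, dtype=float)
matched = complex_ema(np.cos(k * t / tau), tau, k)
mismatched = complex_ema(np.cos(3.0 * k * t / tau), tau, k)
```

The single complex recursion replaces a separate sine and cosine accumulator, which is exactly the computational convenience the text points to.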