The results show a strong relationship between the prices of soybeans and corn. The value of r² may also be considered as having accounted for that share of the price relationship between the two products.

Computer Programs to Find Correlations

Just as with a linear regression, both spreadsheets and trading strategy software have functions to return the correlation coefficient with no calculations on your part. Spreadsheets include the correlation as part of the statistics given when the regression tool is used. If only the correlation is needed, for example, to create a correlation matrix, then you can use the function =CORREL(c1,c2), which calculates only the correlation coefficient for the two series in columns 1 and 2. The same facility exists in trading strategy software, in which you can use a function that specifies the independent series, dependent series, and the time period over which the regression will be calculated. For example, you might specify the correlation, R2, of two series, data1 and data2, over the past 45 days as:

R2 = @correlation(close of data1, close of data2, 45)

Correlation Adjustments When Using a Time Series

Because most price analyses involve the use of two time series, precautions should be taken to avoid a dominant trend that can distort the results. A long-term upward or downward trend will overshadow the smaller movements around the trend and exaggerate the correlation. The following methods may be used to correct the problem:

1. The deviations from the trend (y_t − ŷ_t) may be correlated.
2. The first differences (y_t − y_{t−1}) may be correlated.
3. The two series may be adjusted for trend.

The simplest method is number 2, which is also very effective. You should expect any commercial software to detrend the data before calculating the correlation coefficient; however, it is always a good habit to check it.
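The effect of method 2 can be verified directly. The following is a minimal Python sketch (the two price series are hypothetical, for illustration only, and the function names are my own) that computes the Pearson correlation on the raw prices and again on their first differences; the shared upward trend inflates the first figure but not the second.

```python
def first_differences(series):
    """Return y[t] - y[t-1] for each t (method 2 above)."""
    return [b - a for a, b in zip(series, series[1:])]

def correlation(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical soybean and corn prices, both with a long-term uptrend.
soybeans = [5.0, 5.2, 5.1, 5.6, 5.9, 6.1, 6.0, 6.4]
corn     = [2.0, 2.1, 2.2, 2.3, 2.5, 2.6, 2.7, 2.8]

r_raw = correlation(soybeans, corn)                    # dominated by the trend
r_diff = correlation(first_differences(soybeans),
                     first_differences(corn))          # detrended
```

With trending data of this kind, `r_raw` is close to 1 while `r_diff` is much smaller, which is exactly the distortion the adjustments above are meant to remove.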
Forecasting Using Regression

A distinct advantage of regression analysis is that it allows the analyst to forecast price movement. In the case of the linear regression, the forecast will simply be an extension of the line. Later in this book, there will be other nonlinear solutions that are used to forecast more complex patterns. The regression forecast is the basis for the probabilistic model. Instead of the soybean-corn relationship, regress the price of soybeans against time using the linear least-squares method on the data in Table 3-2. The result is

ŷ = 0.987 + 0.221x

where ŷ is the price of soybeans for the year x. Because x = 1 for 1956 and 27 for 1982, the same equation can be used to forecast the price as 7.62 in 1985 (x = 30) and 8.72 in 1990 (x = 35).

Regression analysis includes its own measure of accuracy, called confidence bands. It is based on a probability distribution of the errors in the fitted equation and the size of the data sample. Looking at Figure 3-3, the straight line cannot touch all of the points; its goodness of fit may be measured by using the standard deviation of the errors to determine the variance over the total number of data points N. If the actual data points are y_i and their corresponding values on the fitted line are ŷ_i, then the standard deviation of the errors is

σ = sqrt( Σ (y_i − ŷ_i)² / N )
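The fitted line, its error standard deviation, and a forecast by extension of the line can all be computed from simple sums. Below is a minimal Python sketch in plain arithmetic (the function names are my own, not from the text):

```python
def linear_fit(x, y):
    """Least-squares intercept a and slope b for y = a + b*x."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

def residual_std(x, y, a, b):
    """Standard deviation of the errors around the fitted line."""
    n = len(x)
    return (sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y)) / n) ** 0.5

# Forecasting is just an extension of the line: the fitted soybean
# equation y = 0.987 + 0.221*x evaluated at x = 30 (the year 1985).
forecast = 0.987 + 0.221 * 30
```

Evaluating the same expression at x = 35 extends the forecast to 1990; the further the extension, the less reliable it becomes, as the confidence-band discussion below explains.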
Referring to the table of normal distribution (Appendix 1), the 95% level is equivalent to 1.96 standard deviations. Then, a confidence band of 95%, placed around the forecast line, is written

95% upper band = ŷ_t + 1.96σ
95% lower band = ŷ_t − 1.96σ

Figure 3-5a shows the soybean forecast with a 95% confidence band. The points that are outside the band are of particular interest and can be interpreted in either of two ways:

1. They are not representative of normal price behavior and are expected to correct back to levels within the bands.
2. The model was not performed on representative or adequate data and should be reestimated.

Figure 3-5b also indicates that the forecast loses accuracy as it is projected further; the forecast is based on the size of the sample used to find the regression coefficients. The more data included in the original solution, the longer the forecast will maintain its accuracy.

NONLINEAR APPROXIMATIONS FOR TWO VARIABLES

Data points that cannot be related linearly may be approximated using a curve. The general polynomial form that approximates any curve is

y = a₀ + a₁x + a₂x² + · · · + aₙxⁿ

The first two terms on the right side of the equal sign form a straight line; adding the next term bends the resulting line into a parabolic curve, one with a single, smooth change of direction. The third-order term adds an inflection to the pattern. For most price forecasting, the second-order equation, also called curvilinear, is sufficient (see Figure 3-6). The corn and soybean prices from Table 3-1 will be used to give examples of this and other nonlinear approximations. The curvilinear form

y = a + bx + cx²

must be solved for the coefficients a, b, and c using the simultaneous equations:

Σy = aN + bΣx + cΣx²
Σxy = aΣx + bΣx² + cΣx³
Σx²y = aΣx² + bΣx³ + cΣx⁴

FIGURE 3-5 Confidence bands. (a) Soybeans with 95% confidence band. (b) Out-of-sample forecasts lose confidence.
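Placing the band around a forecast is a one-line calculation once the error standard deviation σ is known. A small sketch in the same plain-Python style (the helper name and example values are mine, for illustration):

```python
def confidence_band(a, b, sigma, x, z=1.96):
    """Return the (lower, upper) band around the forecast a + b*x.

    z = 1.96 standard deviations corresponds to the 95% level of the
    normal distribution; sigma is the standard deviation of the
    fitting errors.
    """
    y_hat = a + b * x
    return y_hat - z * sigma, y_hat + z * sigma
```

For example, `confidence_band(0.987, 0.221, sigma, 30)` would bracket the 1985 soybean forecast; a point falling outside the band is the signal discussed in interpretations 1 and 2 above.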
The calculations from Table 3-2, including the additional sums for x³, x⁴, and x²y, can be substituted into the preceding equations. The system of simultaneous linear equations can be solved by the process of matrix elimination, a technique that should not be performed without the help of a computer.* An alternate solution can be obtained by continuing with the already familiar least-squares method.†

SECOND-ORDER LEAST SQUARES

The concepts of least squares can be extended to the curvilinear (second-order) equation by minimizing the sum of the squared errors:

s = Σ (y_i − a − bx_i − cx_i²)²

* See Appendix 3 for examples of matrix solutions.
† F.R. Ruckdeschel, BASIC Scientific Subroutines, Vol. I (Byte/McGraw-Hill, Peterborough, NH, 1981).

FIGURE 3-6 Curvilinear (second order): parabolas.

First, it is necessary to separate the various intermediate sums before expressing the solution for a, b, and c. The constant values can then be found by substitution into the following equations. The procedure is identical to the linear least-squares solution. Fortunately, there are simple computer programs that have already been written to solve these problems. Using the program found in Appendix 2, the result is
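As a concrete illustration of the matrix-elimination route, the sketch below builds the three second-order normal equations for y = a + bx + cx² from the data sums and solves them by Gaussian elimination. This is a generic stand-in written for this discussion, not the program from Appendix 2, which is not reproduced here.

```python
def curvilinear_fit(x, y):
    """Fit y = a + b*x + c*x^2 by solving the second-order normal equations."""
    n = len(x)
    sp = lambda p: sum(v ** p for v in x)          # sum of x^p
    sy = sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sx2y = sum(xi * xi * yi for xi, yi in zip(x, y))
    # Augmented matrix [coefficients | right-hand side] of the
    # three normal equations in the unknowns a, b, c.
    m = [[n,     sp(1), sp(2), sy],
         [sp(1), sp(2), sp(3), sxy],
         [sp(2), sp(3), sp(4), sx2y]]
    # Forward elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for row in range(col + 1, 3):
            f = m[row][col] / m[col][col]
            for k in range(col, 4):
                m[row][k] -= f * m[col][k]
    # Back substitution.
    coef = [0.0, 0.0, 0.0]
    for row in (2, 1, 0):
        coef[row] = (m[row][3] - sum(m[row][k] * coef[k]
                                     for k in range(row + 1, 3))) / m[row][row]
    return tuple(coef)   # (a, b, c)
```

Feeding it points that lie exactly on a parabola returns that parabola's coefficients; with real corn and soybean prices it returns the least-squares curvilinear fit described in the text.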
