





several autoregressive and/or moving average terms are included as possible explanatory variables, and the model is estimated.11 Then lags are dropped according to the significance of their coefficients. For example, estimating an AR(3) model on the log futures prices of natural gas that were shown in Figure 11.4 (and which we already know to be weakly stationary, from §11.1.5) gives:

y_t = 0.0438 + 0.8922 y_{t-1} + 0.0250 y_{t-2} + 0.0249 y_{t-3} + ε_t.

(2.50) (14.34) (0.30) (0.40)

The figures in parentheses are t-ratios, and they indicate that while a first-order autoregressive term is significant, there is no reason to include higher-order autoregressive terms. Estimating an ARMA(1,1) model for the same data gives:

y_t = 0.0423 + 0.9439 y_{t-1} + ε_t + 0.0581 ε_{t-1}.

(2.47) (42.43) (0.784)

As there is no good reason to include the moving average term in the model, the AR(1) model is probably the best representation of these data. The estimated AR(1) model is:

y_t = 0.0473 + 0.9375 y_{t-1} + ε_t.

(2.81) (42.92)
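This testing-down procedure is easy to sketch in code. The following is a minimal illustration of OLS estimation of an AR(p) model with t-ratios, run on synthetic data rather than the natural gas series; the function name and the simulated series are assumptions for illustration:

```python
import numpy as np

def fit_ar_ols(y, p):
    """OLS estimation of y_t = c + a_1 y_{t-1} + ... + a_p y_{t-p} + e_t.

    Returns the coefficients (c, a_1, ..., a_p) and their t-ratios."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    # Design matrix: a constant column plus p lagged columns of y.
    X = np.column_stack([np.ones(T - p)] +
                        [y[p - k:T - k] for k in range(1, p + 1)])
    target = y[p:]
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ coef
    sigma2 = resid @ resid / (len(target) - X.shape[1])
    # Standard errors from the diagonal of sigma^2 (X'X)^{-1}.
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
    return coef, coef / se
```

Fitting with p = 3 and dropping the lags whose t-ratios are insignificant reproduces the testing-down logic: on data generated by an AR(1) process, only the first lag should have a large t-ratio.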

11.3.4 Forecasting with ARMA Models

Once a suitable ARMA representation of the data has been specified, the purpose of univariate time series analysis is to use this model for predictive purposes. Forecasting with an AR(p) model could not be easier. One simply generates the one-step-ahead forecast and then uses this for a two-step-ahead forecast, and so on. That is, the optimal one-step-ahead prediction at time T is the conditional expectation of y_{T+1} given {y_T, y_{T-1}, ...}:

ŷ_{T+1} - c = a_1(y_T - c) + a_2(y_{T-1} - c) + ... + a_p(y_{T-p+1} - c),

and the two-step-ahead prediction is:

ŷ_{T+2} - c = a_1(ŷ_{T+1} - c) + a_2(y_T - c) + ... + a_p(y_{T-p+2} - c),

and so on.
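The recursion above translates directly into code. A minimal sketch (the function name and example values are illustrative assumptions, not the estimated gas model):

```python
def ar_forecast(y_hist, c, a, steps):
    """Iterate the AR(p) recursion yhat_{T+s} - c = sum_j a_j (y_{T+s-j} - c),
    substituting each forecast for the unknown observation it replaces.

    y_hist must hold at least p = len(a) observations; c is the process mean."""
    p = len(a)
    path = list(y_hist[-p:])          # the last p observations
    forecasts = []
    for _ in range(steps):
        yhat = c + sum(a[j] * (path[-1 - j] - c) for j in range(p))
        forecasts.append(yhat)
        path.append(yhat)             # feed the forecast back into the recursion
    return forecasts
```

For an AR(1) with c = 0 and a_1 = 0.5, starting from y_T = 1, the forecasts decay geometrically towards the mean: 0.5, 0.25, 0.125, and so on.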

An MA(q) model is useful for predictions only up to q steps ahead. Since ε_{T+1}, ε_{T+2}, ... are unknown, they are set to zero, and the s-step-ahead prediction for s ≤ q is

ŷ_{T+s} = c + β_s ε_T + β_{s+1} ε_{T-1} + ... + β_q ε_{T+s-q}.

For an ARMA(p, q) model the s-step-ahead predictions are

11Although pure AR models may be consistently estimated by OLS, the MA part of ARMA models requires the maximum likelihood estimation method (Appendix 6). The likelihood functions for different ARMA models are given in Hamilton (1994), but most statistical packages offer ARMA model estimation by maximum likelihood as a standard routine.




Figure 11.8 Percentage forecast error for rolling AR(1) models of log futures prices of natural gas.

ŷ_{T+s} - c = a_1(ŷ_{T+s-1} - c) + a_2(ŷ_{T+s-2} - c) + ... + a_p(ŷ_{T+s-p} - c) + β_s ε_T + ... + β_q ε_{T+s-q}

for s ≤ q. For s > q only the AR part determines the forecasts.
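The ARMA forecast recursion can be sketched in a single routine: the AR recursion plus the remaining moving average terms while s ≤ q, with unknown future errors set to zero. The function name and the tiny example values are assumptions:

```python
def arma_forecast(y_hist, resid_hist, c, a, b, steps):
    """s-step-ahead ARMA(p, q) forecasts. Unknown future errors are set to
    zero, so the MA terms drop out once s > q and only the AR part remains."""
    p, q = len(a), len(b)
    m = len(resid_hist)
    eps = list(resid_hist) + [0.0] * steps   # position m-1+k holds eps_{T+k}
    path = list(y_hist)                      # needs at least p observations
    out = []
    for s in range(1, steps + 1):
        ar = sum(a[j] * (path[-1 - j] - c) for j in range(p))
        ma = sum(b[j - 1] * eps[m - 1 + s - j]
                 for j in range(1, q + 1) if m - 1 + s - j >= 0)
        out.append(c + ar + ma)
        path.append(out[-1])                 # forecasts replace observations
    return out
```

For an ARMA(1,1) with c = 0, a_1 = 0.5, b_1 = 0.4, y_T = 1 and last residual ε_T = 0.2, the one-step forecast is 0.5 + 0.4 × 0.2 = 0.58, and from the second step onwards only the AR part contributes.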

Forecasting a stationary time series using ARMA models is quite straightforward in practice.12 To illustrate the method, the AR(1) model of natural gas futures prices has been used to provide s-step-ahead forecasts for s = 1, 2, ..., 20 for the gas futures price on every day from 1 January 1999 to 3 March 1999. This is the out-of-sample period indicated by the dotted line across Figure 11.4. The data from 1 January 1998 to 31 December 1998 are used to estimate the AR(1) model, and then a daily forecast series is made up to 20 days ahead. The data are rolled one day at a time, and each time the AR(1) model is re-estimated and used to forecast up to 20 days ahead. In this way 24 different out-of-sample forecast series are calculated. Figure 11.8 shows the forecast errors from these 24 different forecasts over time. It shows that short-term forecasts are noticeably more accurate than long-term forecasts; in fact the forecasts generally stay within the 5% error bounds for one week ahead or even more.
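The rolling exercise can be sketched as follows, using a synthetic AR(1) series as a stand-in for the gas data; the sample sizes, coefficients and noise level here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for the log futures price series (not the gas data).
y = np.empty(300)
y[0] = 0.7
for t in range(1, 300):
    y[t] = 0.05 + 0.93 * y[t - 1] + 0.01 * rng.standard_normal()

horizon, n_origins = 20, 24
pct_errors = np.empty((n_origins, horizon))
for i in range(n_origins):
    train = y[:250 + i]                        # roll the sample one day at a time
    X = np.column_stack([np.ones(len(train) - 1), train[:-1]])
    c_hat, a_hat = np.linalg.lstsq(X, train[1:], rcond=None)[0]
    f = train[-1]
    for s in range(horizon):                   # iterate the one-step forecast
        f = c_hat + a_hat * f
        pct_errors[i, s] = 100 * (f - y[250 + i + s]) / y[250 + i + s]
```

Each column of pct_errors corresponds to a forecast horizon; since the forecast error variance grows with the horizon, plotting the columns reproduces the qualitative pattern shown in Figure 11.8.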

The univariate approach to time series modelling can give some interesting results, but it is still only part of the story. A forecasting model for natural gas futures prices has been used to illustrate some concepts, but the analysis here takes no account of any relation that these futures prices may have with the prices of other futures, or with the daily spot prices of gas. Forecasting models that capture the important relationships between spot and futures prices will be explored in more detail in Chapter 12.

12ARMA model forecasting is an in-built procedure in most statistical time series packages.



11.4 Multivariate Time Series

A multivariate time series {y_t} is a vector {(y_{1,t}, y_{2,t}, ..., y_{n,t})} of n different time series processes that are measured concurrently. In addition to synchronicity of data on y_1, ..., y_n, it is normal to use time series processes that have the same basic characteristics. That is, the different processes in {y_t} should all be stationary, or have the same order of integration. The case that {y_t} contains integrated variables is discussed in Chapter 12. This section describes the basic models of stationary multivariate time series that are useful for modelling dynamic co-dependencies between the n different time series.

11.4.1 Vector Autoregressions

The general ARMA models for stationary series were introduced in §11.2. There it was shown that a moving average process of any order is invertible into an infinite AR model, so any stationary univariate time series may be modelled by an autoregressive model provided it contains sufficient lags to capture all of the dynamics.

The same is true for a multivariate time series. The generic model is a vector autoregression (VAR), which simply extends the AR model to a multivariate time series. For example, a vector autoregression of order 1 on a bivariate system is

y_{1,t} = a_{10} + a_{11} y_{1,t-1} + a_{12} y_{2,t-1} + ε_{1,t},

y_{2,t} = a_{20} + a_{21} y_{1,t-1} + a_{22} y_{2,t-1} + ε_{2,t},

or, in matrix notation,

y_t = a_0 + A y_{t-1} + ε_t,  (11.28)

where y_t = (y_{1,t}, y_{2,t})′, a_0 = (a_{10}, a_{20})′, ε_t = (ε_{1,t}, ε_{2,t})′ and

A = | a_{11}  a_{12} |
    | a_{21}  a_{22} |

More generally, a VAR(1) for an n-variate system is (11.28) where y_t, a_0 and ε_t are n × 1 vectors and A is an n × n matrix of coefficients. The general VAR(p) model is

y_t = a_0 + A_1 y_{t-1} + ... + A_p y_{t-p} + ε_t.  (11.29)

Estimating the parameters in a VAR model is straightforward, assuming the errors are i.i.d. processes. Each of the n equations in (11.29) may be estimated separately by OLS to give consistent and asymptotically efficient estimators (§A.1.3).
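Equation-by-equation OLS for a VAR(1) is only a few lines; a minimal sketch with numpy (the function name is an assumption):

```python
import numpy as np

def fit_var1(Y):
    """Estimate y_t = a0 + A y_{t-1} + eps_t by running one OLS regression
    per equation, as described in the text.

    Y is a (T, n) array; returns a0 as an (n,) vector and A as (n, n)."""
    X = np.column_stack([np.ones(len(Y) - 1), Y[:-1]])  # constant + lagged vector
    coefs = []
    for i in range(Y.shape[1]):                         # one equation at a time
        beta, *_ = np.linalg.lstsq(X, Y[1:, i], rcond=None)
        coefs.append(beta)
    B = np.array(coefs)
    return B[:, 0], B[:, 1:]
```

On a long simulated bivariate VAR(1) with stable coefficients, the per-equation OLS estimates recover a_0 and A closely, illustrating the consistency claim in the text.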

The specification procedure for a VAR model is quite simple: first determine the variables to use in the system {y,} and then test down the AR lag specification in each equation until a reasonable fit is obtained (§11.3.3). Once specified, the model may be used to gain insights into the dynamic relationships


