
purpose of the analysis is to see how the variance of $X_t$ is distributed among oscillations of various frequencies. This technique of analysis is called spectral analysis.

Time-domain methods are based on direct modeling of the lagged relationships between a series and its past. The methods are similar to the type of analysis discussed in earlier chapters and involve fitting linear autoregressions and cross-regressions. In any case, the spectral methods are more difficult for beginning students to understand. Hence this introductory chapter on time series deals with time-domain methods only.

13.3 Stationary and Nonstationary Time Series

From a theoretical point of view a time series is a collection of random variables $\{X_t\}$. Such a collection of random variables ordered in time is called a stochastic process. The word stochastic has a Greek origin and means "pertaining to chance." If $t$ is a continuous variable, it is customary to denote the random variables by $X(t)$, and if $t$ is a discrete variable, it is customary to denote them by $X_t$. An example of the continuous variable $X(t)$ is the recording of an electrocardiogram. Examples of discrete random variables $X_t$ are the data on unemployment, money supply, closing stock prices, and so on, that we mentioned earlier. We will not go into the theory of stochastic processes here in great detail. We will just outline those elements that are necessary for time-series analysis. Furthermore, we will be considering discrete processes only, and so we shall use the notations $X_t$ and $X(t)$ interchangeably.

The random variables $\{X_t\}$ are, in general, not independent. Furthermore, we have just a sample of size 1 on each of the random variables (e.g., if we say that the unemployment rate at the end of this week is a random variable, we have just one observation on this particular random variable). There is no way of getting another observation, so we have what is called a single realization. These two features, dependence and lack of replication, compel us to specify some highly restrictive models for the statistical structure of the stochastic process.

Strict Stationarity

One way of describing a stochastic process is to specify the joint distribution of the variables $X_t$. This is quite complicated and not usually attempted in practice. Instead, what is usually done is to define the first and second moments of the variables $X_t$. These are:

1. The mean: $\mu(t) = E(X_t)$.

2. The variance: $\sigma^2(t) = \operatorname{var}(X_t)$.

3. The autocovariances: $\gamma(t_1, t_2) = \operatorname{cov}(X_{t_1}, X_{t_2})$.

When $t_1 = t_2 = t$, the autocovariance is just $\sigma^2(t)$. One important class of stochastic processes is that of stationary stochastic processes. Corresponding to these we have the concept of stationary time series. A time series is said to be strictly stationary if the joint distribution of any set of $n$ observations $X(t_1), X(t_2), \ldots, X(t_n)$ is the same as the joint distribution of $X(t_1 + k), X(t_2 + k), \ldots, X(t_n + k)$ for all $n$ and $k$.

The definition above of strict stationarity holds for all values of $n$. Substituting $n = 1$, we get $\mu(t) = \mu$, a constant, and $\sigma^2(t) = \sigma^2$, a constant, for all $t$. Furthermore, if we substitute $n = 2$, we get the result that the joint distribution of $X(t_1)$ and $X(t_2)$ is the same as that of $X(t_1 + k)$ and $X(t_2 + k)$. Writing $t_1 + k = t_2$, we see that this is the same as the distribution of $X(t_2)$ and $X(t_2 + k)$. Thus the joint distribution depends only on the difference $t_2 - t_1$, which is called the lag. Hence we can write the autocovariance function $\gamma(t_1, t_2)$ as $\gamma(k)$, where $k = t_2 - t_1$ is the lag. Thus $\gamma(k) = \operatorname{cov}[X(t), X(t + k)]$ is the autocovariance coefficient at lag $k$. $\gamma(k)$ is called the autocovariance function and will be abbreviated as acvf. $\gamma(0)$ is, of course, the variance $\sigma^2$.

Since the autocovariance coefficients depend on the units of measurement of $X(t)$, it is convenient to consider the autocorrelations, which are free of the units of measurement. Since $\operatorname{var} X(t) = \operatorname{var} X(t + k) = \sigma^2 = \gamma(0)$, we have the autocorrelation coefficient $\rho(k)$ at lag $k$ as

$$\rho(k) = \frac{\gamma(k)}{\gamma(0)}$$

$\rho(k)$ is called the autocorrelation function and will be abbreviated as acf. A plot of $\rho(k)$ against $k$ is called a correlogram.
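As a minimal sketch of how these quantities are estimated from data (an illustration added here, not part of the text's development), the sample acvf and acf of an observed realization x can be computed as follows; the helper names and the divisor-n convention are choices of this sketch:

```python
import numpy as np

def acvf(x, k):
    """Sample autocovariance gamma-hat(k), with divisor n (a common convention)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    k = abs(k)                # the acvf is an even function of the lag
    xd = x - x.mean()         # deviations from the sample mean
    return np.sum(xd[:n - k] * xd[k:]) / n

def acf(x, k):
    """Sample autocorrelation: rho-hat(k) = gamma-hat(k) / gamma-hat(0)."""
    return acvf(x, k) / acvf(x, 0)
```

A correlogram is then just a plot of acf(x, k) against k = 0, 1, 2, ....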



Weak Stationarity

For a strictly stationary time series the distribution of $X(t)$ is independent of $t$. Thus it is not just the mean and variance that are constant: all higher-order moments are independent of $t$, and so are all the moments of the joint distribution of any combination of the variables $X(t_1), X(t_2), X(t_3), \ldots$. In practice this is a very strong assumption, and it is useful to define stationarity in a less restrictive way, in terms of the first and second moments only.

A time series is said to be weakly stationary if its mean is constant and its acvf depends only on the lag, that is,

$$E[X(t)] = \mu \quad \text{and} \quad \operatorname{cov}[X(t), X(t + k)] = \gamma(k)$$

No assumptions are made about higher-order moments. Alternative terms for a weakly stationary time series are wide-sense stationary, covariance stationary, and second-order stationary.

If $X(t_1), X(t_2), \ldots, X(t_n)$ follow a multivariate normal distribution, then, since the multivariate normal distribution is completely characterized by its first and second moments, the two concepts of strict stationarity and weak stationarity are equivalent. For other distributions this is not so.
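To make the definition concrete, here is a small self-contained check (again an illustration, not from the text; the seed and sample sizes are arbitrary choices). White noise with mean 5 is weakly stationary, so its sample mean and lag-1 sample autocovariance should look the same over any two stretches of the realization:

```python
import numpy as np

rng = np.random.default_rng(0)

# White noise with mean 5: constant mean, and gamma(k) = 0 for every k != 0.
x = rng.normal(loc=5.0, scale=1.0, size=4000)

def lag1_cov(z):
    """Sample autocovariance at lag 1, via the sample covariance of (z_t, z_{t+1})."""
    return np.cov(z[:-1], z[1:])[0, 1]

first, second = x[:2000], x[2000:]
print(first.mean(), second.mean())        # both close to 5 (constant mean)
print(lag1_cov(first), lag1_cov(second))  # both close to 0 (acvf depends only on the lag)
```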





[Figure 13.1. A stationary series. $X_t = 0.7X_{t-1} + e_t$; $e_t \sim \mathrm{IN}(5, 1)$; mean $= 16.7$. The figure plots $X_t$ against year $t$.]

In Figure 13.1 we show the graph of a stationary time series: the graph of $X_t$, where $X_t = 0.7X_{t-1} + e_t$ and $e_t \sim \mathrm{IN}(5, 1)$. Because of the assumption of normality, this time series is also strictly stationary.
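The series of Figure 13.1 is easy to reproduce. A sketch (the seed and series length are arbitrary choices of this illustration): for this process the stationary mean is $E(X_t) = 5/(1 - 0.7) \approx 16.7$, which matches the level reported in the caption.

```python
import numpy as np

rng = np.random.default_rng(42)

n = 200
x = np.empty(n)
x[0] = 5.0 / (1.0 - 0.7)                   # start at the stationary mean, about 16.7
for t in range(1, n):
    e_t = rng.normal(loc=5.0, scale=1.0)   # e_t ~ IN(5, 1)
    x[t] = 0.7 * x[t - 1] + e_t            # X_t = 0.7 X_{t-1} + e_t

print(x.mean())   # close to 16.7; the series fluctuates around a fixed level
```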

Properties of Autocorrelation Function

There are two important points to note about the autocorrelation function (acf). These are the following:

1. The acf is an even function of the lag, that is, $\rho(k) = \rho(-k)$ (see the numerical sketch after this list). This follows from the result

$$\gamma(k) = \operatorname{cov}(X_t, X_{t+k}) = \operatorname{cov}(X_{t-k}, X_t) = \gamma(-k)$$

where the middle equality holds because of stationarity.

2. For a given acf, there will be only one normal process. But it is possible to find several nonnormal processes that have the same acf. Jenkins and Watts give an example of two different stochastic processes that have the same acf.

G. M. Jenkins and D. G. Watts, Spectral Analysis and Its Applications (San Francisco: Holden-Day, 1968), p. 170.
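As a numerical sketch of property 1 and of what a correlogram looks like (an illustration, not from the text; the seed and length are arbitrary): for the AR(1) process of Figure 13.1 the theoretical acf is $\rho(k) = 0.7^{|k|}$, an even function of the lag, and the sample acf of a long simulated realization stays close to it.

```python
import numpy as np

rng = np.random.default_rng(7)

# A long realization of X_t = 0.7 X_{t-1} + e_t, e_t ~ IN(5, 1).
n = 20000
x = np.empty(n)
x[0] = 16.7
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + rng.normal(5.0, 1.0)

xd = x - x.mean()
gamma0 = np.dot(xd, xd) / n                           # gamma-hat(0), the sample variance
for k in range(-3, 4):
    gk = np.dot(xd[:n - abs(k)], xd[abs(k):]) / n     # gamma-hat(k) = gamma-hat(-k)
    print(k, round(gk / gamma0, 3), round(0.7 ** abs(k), 3))   # sample vs. theoretical acf
```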


