Introduction to Time Series Analysis. Lecture 2.
Peter Bartlett

Last lecture:
1. Objectives of time series analysis.
2. Time series models.
3. Time series modelling: chasing stationarity.
This lecture:
1. Stationarity
2. Autocovariance, autocorrelation
3. MA, AR, linear processes
4. Sample autocorrelation function
Stationarity
$\{X_t\}$ is strictly stationary if, for all $k$, $t_1, \ldots, t_k$, $x_1, \ldots, x_k$, and $h$,
$$P(X_{t_1} \le x_1, \ldots, X_{t_k} \le x_k) = P(X_{t_1+h} \le x_1, \ldots, X_{t_k+h} \le x_k),$$
i.e., shifting the time axis does not affect the joint distribution. We shall consider second-order properties only.
Mean and Autocovariance
Suppose that $\{X_t\}$ is a time series with $E[X_t^2] < \infty$. Its mean function is
$$\mu_t = E[X_t].$$
Its autocovariance function is
$$\gamma_X(s, t) = \operatorname{Cov}(X_s, X_t) = E[(X_s - \mu_s)(X_t - \mu_t)].$$
Weak Stationarity
We say that $\{X_t\}$ is (weakly) stationary if
1. $\mu_t$ is independent of $t$, and
2. for each $h$, $\gamma_X(t+h, t)$ is independent of $t$.
In that case, we write $\gamma_X(h) = \gamma_X(h, 0)$.
Stationarity
The autocorrelation function (ACF) of $\{X_t\}$ is defined as
$$\rho_X(h) = \frac{\gamma_X(h)}{\gamma_X(0)} = \frac{\operatorname{Cov}(X_{t+h}, X_t)}{\operatorname{Cov}(X_t, X_t)} = \operatorname{Corr}(X_{t+h}, X_t).$$
Stationarity
Example: i.i.d. noise, $E[X_t] = 0$, $E[X_t^2] = \sigma^2$. We have
$$\gamma_X(t+h, t) = \begin{cases} \sigma^2 & \text{if } h = 0, \\ 0 & \text{otherwise.} \end{cases}$$
Thus,
1. $\mu_t = 0$ is independent of $t$, and
2. $\gamma_X(t+h, t) = \gamma_X(h, 0)$ for all $t$.
So $\{X_t\}$ is stationary. The same argument applies to any white noise (uncorrelated, zero mean), $X_t \sim WN(0, \sigma^2)$.
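As a quick sanity check (a minimal NumPy sketch; the seed, $\sigma = 2$, and sample size are arbitrary choices), we can simulate i.i.d. noise and confirm that the sample autocovariance is near $\sigma^2$ at lag 0 and near 0 at other lags:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0
x = rng.normal(0.0, sigma, size=100_000)  # i.i.d. noise: mean 0, variance sigma^2

def sample_autocov(x, h):
    """Sample autocovariance at lag h >= 0, normalized by n."""
    n = len(x)
    xbar = x.mean()
    return float(np.sum((x[h:] - xbar) * (x[: n - h] - xbar)) / n)

# gamma(0) should be close to sigma^2 = 4; gamma(h) close to 0 for h != 0.
print(sample_autocov(x, 0), sample_autocov(x, 1), sample_autocov(x, 5))
```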
Stationarity
Example: Random walk, $S_t = \sum_{i=1}^{t} X_i$ for i.i.d., mean-zero $\{X_t\}$. We have $E[S_t] = 0$, $E[S_t^2] = t\sigma^2$, and, for $h \ge 0$,
$$\gamma_S(t+h, t) = \operatorname{Cov}(S_{t+h}, S_t) = \operatorname{Cov}\Bigl(S_t + \sum_{s=1}^{h} X_{t+s},\; S_t\Bigr) = \operatorname{Cov}(S_t, S_t) = t\sigma^2.$$
1. $\mu_t = 0$ is independent of $t$, but
2. $\gamma_S(t+h, t)$ is not.
So $\{S_t\}$ is not stationary.
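The nonstationarity shows up numerically as variance growing with $t$. A sketch (the $\pm 1$ step distribution and path count are illustrative choices; any i.i.d. mean-zero steps would do):

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, t_max = 20_000, 50
# i.i.d. steps with mean 0 and variance sigma^2 = 1.
steps = rng.choice([-1.0, 1.0], size=(n_paths, t_max))
paths = np.cumsum(steps, axis=1)  # paths[:, t-1] holds S_t

# Var(S_t) = t * sigma^2 depends on t, so {S_t} cannot be stationary.
var_10 = paths[:, 9].var()   # estimate of Var(S_10), roughly 10
var_40 = paths[:, 39].var()  # estimate of Var(S_40), roughly 40
print(var_10, var_40)
```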
An aside: covariances
$$\operatorname{Cov}(X + Y, Z) = \operatorname{Cov}(X, Z) + \operatorname{Cov}(Y, Z), \qquad \operatorname{Cov}(aX, Y) = a \operatorname{Cov}(X, Y).$$
Also, if $X$ and $Y$ are independent (e.g., $X = c$, a constant), then $\operatorname{Cov}(X, Y) = 0$.
Random walk

[Figure: simulated sample path of a random walk, $t = 0, \ldots, 50$.]
Stationarity
Example: MA(1) process (Moving Average): $X_t = W_t + \theta W_{t-1}$, where $\{W_t\} \sim WN(0, \sigma^2)$. We have $E[X_t] = 0$, and
$$\gamma_X(t+h, t) = E[X_{t+h} X_t] = E[(W_{t+h} + \theta W_{t+h-1})(W_t + \theta W_{t-1})] = \begin{cases} \sigma^2 (1 + \theta^2) & \text{if } h = 0, \\ \sigma^2 \theta & \text{if } |h| = 1, \\ 0 & \text{otherwise.} \end{cases}$$
Thus, $\{X_t\}$ is stationary.
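These autocovariances can be checked by simulation (a sketch; $\theta = 0.5$, $\sigma = 1$, and the sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
theta, sigma = 0.5, 1.0
n = 200_000
w = rng.normal(0.0, sigma, size=n + 1)  # white noise W_t
x = w[1:] + theta * w[:-1]              # X_t = W_t + theta * W_{t-1}

def sample_autocov(x, h):
    xbar = x.mean()
    return float(np.sum((x[h:] - xbar) * (x[: len(x) - h] - xbar)) / len(x))

# Theory: gamma(0) = sigma^2 (1 + theta^2) = 1.25, gamma(1) = sigma^2 theta = 0.5,
# and gamma(h) = 0 for |h| > 1.
print(sample_autocov(x, 0), sample_autocov(x, 1), sample_autocov(x, 2))
```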
ACF of the MA(1) process
[Figure: ACF of the MA(1) process $X_t = Z_t + \theta Z_{t-1}$; $\rho(0) = 1$, $\rho(\pm 1) = \theta / (1 + \theta^2)$, and $\rho(h) = 0$ for $|h| > 1$.]
Stationarity
Example: AR(1) process (AutoRegressive): $X_t = \phi X_{t-1} + W_t$, $\{W_t\} \sim WN(0, \sigma^2)$.
Assume that $X_t$ is stationary and $|\phi| < 1$. Then we have
$$E[X_t] = \phi E[X_{t-1}] = 0 \quad \text{(from stationarity)},$$
$$E[X_t^2] = \phi^2 E[X_{t-1}^2] + \sigma^2, \quad \text{so} \quad E[X_t^2] = \frac{\sigma^2}{1 - \phi^2} \quad \text{(from stationarity)}.$$
Stationarity
Example: AR(1) process, $X_t = \phi X_{t-1} + W_t$, $\{W_t\} \sim WN(0, \sigma^2)$. Assume that $X_t$ is stationary and $|\phi| < 1$. Then we have $E[X_t] = 0$, $E[X_t^2] = \sigma^2 / (1 - \phi^2)$, and
$$\gamma_X(h) = \operatorname{Cov}(\phi X_{t+h-1} + W_{t+h}, X_t) = \phi \operatorname{Cov}(X_{t+h-1}, X_t) = \phi\, \gamma_X(h-1) = \phi^{|h|} \gamma_X(0) = \frac{\phi^{|h|} \sigma^2}{1 - \phi^2}.$$
(Check for $h > 0$ and $h < 0$.)
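Simulation confirms the geometric decay of the autocovariances (a sketch; $\phi = 0.8$ is an arbitrary choice, and the recursion is started from the stationary distribution so the theory applies exactly):

```python
import numpy as np

rng = np.random.default_rng(3)
phi, sigma = 0.8, 1.0
n = 200_000
w = rng.normal(0.0, sigma, size=n)
x = np.empty(n)
x[0] = w[0] / np.sqrt(1 - phi**2)  # Var(X_0) = sigma^2 / (1 - phi^2): start in stationarity
for t in range(1, n):
    x[t] = phi * x[t - 1] + w[t]   # X_t = phi X_{t-1} + W_t

def sample_autocov(x, h):
    xbar = x.mean()
    return float(np.sum((x[h:] - xbar) * (x[: len(x) - h] - xbar)) / len(x))

gamma0 = sigma**2 / (1 - phi**2)  # theoretical gamma(0) ~ 2.78
for h in range(4):
    # Sample autocovariance vs. theoretical phi^|h| * gamma(0).
    print(h, sample_autocov(x, h), phi**h * gamma0)
```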
ACF of the AR(1) process
[Figure: ACF of the AR(1) process $X_t = \phi X_{t-1} + Z_t$; $\rho(h) = \phi^{|h|}$ decays geometrically from $\rho(0) = 1$.]
Linear Processes
An important class of stationary time series:
$$X_t = \mu + \sum_{j=-\infty}^{\infty} \psi_j W_{t-j},$$
where $\{W_t\} \sim WN(0, \sigma_w^2)$ and $\mu$, $\psi_j$ are parameters satisfying
$$\sum_{j=-\infty}^{\infty} |\psi_j| < \infty.$$
Linear Processes
$$X_t = \mu + \sum_{j=-\infty}^{\infty} \psi_j W_{t-j}.$$
We have
$$\mu_X = \mu, \qquad \gamma_X(h) = \sigma_w^2 \sum_{j=-\infty}^{\infty} \psi_j \psi_{h+j}. \qquad \text{(why?)}$$
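When only finitely many $\psi_j$ are nonzero, the autocovariance formula can be evaluated directly. As a sketch (the MA(1) weights below are just one choice), plugging in $\psi_0 = 1$, $\psi_1 = \theta$ recovers the MA(1) autocovariances computed earlier:

```python
sigma_w, theta = 1.0, 0.5
psi = {0: 1.0, 1: theta}  # psi_0 = 1, psi_1 = theta, all other psi_j = 0

def gamma(h):
    """gamma_X(h) = sigma_w^2 * sum_j psi_j * psi_{h+j} (finite support here)."""
    h = abs(h)
    return sigma_w**2 * sum(p * psi.get(h + j, 0.0) for j, p in psi.items())

print(gamma(0))  # sigma_w^2 (1 + theta^2) = 1.25
print(gamma(1))  # sigma_w^2 theta = 0.5
print(gamma(2))  # 0.0
```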
Examples of Linear Processes: White noise
$$X_t = \mu + \sum_{j=-\infty}^{\infty} \psi_j W_{t-j}.$$
Choose
$$\psi_j = \begin{cases} 1 & \text{if } j = 0, \\ 0 & \text{otherwise.} \end{cases}$$
Then $\{X_t\} \sim WN(\mu, \sigma_W^2)$. (why?)
Examples of Linear Processes: MA(1)
$$X_t = \mu + \sum_{j=-\infty}^{\infty} \psi_j W_{t-j}.$$
Choose $\mu = 0$ and
$$\psi_j = \begin{cases} 1 & \text{if } j = 0, \\ \theta & \text{if } j = 1, \\ 0 & \text{otherwise.} \end{cases}$$
Then $X_t = W_t + \theta W_{t-1}$. (why?)
Examples of Linear Processes: AR(1)
$$X_t = \mu + \sum_{j=-\infty}^{\infty} \psi_j W_{t-j}.$$
Choose $\mu = 0$ and
$$\psi_j = \begin{cases} \phi^j & \text{if } j \ge 0, \\ 0 & \text{otherwise.} \end{cases}$$
Then for $|\phi| < 1$, we have $X_t = \phi X_{t-1} + W_t$. (why?)
Estimating the ACF: Sample ACF
Recall: Suppose that $\{X_t\}$ is a stationary time series. Its mean is $\mu = E[X_t]$. Its autocovariance function is
$$\gamma(h) = \operatorname{Cov}(X_{t+h}, X_t) = E[(X_{t+h} - \mu)(X_t - \mu)].$$
Its autocorrelation function is
$$\rho(h) = \frac{\gamma(h)}{\gamma(0)}.$$
Estimating the ACF: Sample ACF
For observations $x_1, \ldots, x_n$ of a time series, the sample mean is
$$\bar{x} = \frac{1}{n} \sum_{t=1}^{n} x_t.$$
The sample autocovariance function is
$$\hat\gamma(h) = \frac{1}{n} \sum_{t=1}^{n - |h|} (x_{t+|h|} - \bar{x})(x_t - \bar{x}), \qquad \text{for } -n < h < n.$$
The sample autocorrelation function is
$$\hat\rho(h) = \frac{\hat\gamma(h)}{\hat\gamma(0)}.$$
Estimating the ACF: Sample ACF
Sample autocovariance function:
$$\hat\gamma(h) = \frac{1}{n} \sum_{t=1}^{n - |h|} (x_{t+|h|} - \bar{x})(x_t - \bar{x}).$$
This is the sample covariance of $(x_1, x_{h+1}), \ldots, (x_{n-h}, x_n)$, except that we normalize by $n$ instead of $n - h$, and we subtract the full sample mean.
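A direct implementation of these definitions (a minimal sketch; the function name is ours):

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample ACF rho_hat(h) = gamma_hat(h) / gamma_hat(0), where gamma_hat
    is normalized by n and uses the full sample mean."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()
    gamma = np.array([np.sum((x[h:] - xbar) * (x[: n - h] - xbar)) / n
                      for h in range(max_lag + 1)])
    return gamma / gamma[0]

print(sample_acf([1.0, 2.0, 3.0, 4.0, 5.0], 2))  # [1.0, 0.4, -0.1]
```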
Introduction to Time Series Analysis. Lecture 2.
1. Stationarity
2. Autocovariance, autocorrelation
3. MA, AR, linear processes
4. Sample autocorrelation function