
Time Series Analysis and Forecasting
Time Series Analysis Lecture Notes
MA(4030) Prepared By TMJA Cooray
Introduction
A time series is a set of observations generated
sequentially in time
Continuous vs. discrete time series
The observations from a discrete time series, made
at some fixed interval h, at times t_1, t_2, …, t_N, may
be denoted by x(t_1), x(t_2), …, x(t_N)

Introduction (cont.)
Discrete time series may arise in two ways:
1- By sampling a continuous time series
2- By accumulating a variable over a period of time

Characteristics of time series
Time periods are of equal length
No missing values

Components of a time series
trend pattern
seasonal pattern
cyclic pattern
statistical pattern

A time series is the sum of a pattern component F_t
and a random (error) component e_t:
x_t = F_t + e_t

Areas of application
Forecasting
Determination of a transfer function of a system
Design of simple feed-forward and feedback
control schemes
Forecasting
Applications
Economic and business planning
Inventory and production control
Control and optimization of industrial processes
Lead time of the forecasts
is the period over which forecasts are needed
Degree of sophistication
Simple ideas
Moving averages
Simple regression techniques
Complex statistical concepts
Box-Jenkins methodology
Approaches to forecasting
Self-projecting approach



Cause-and-effect approach

Approaches to forecasting (cont.)
Self-projecting approach
Advantages
Quickly and easily applied
A minimum of data is required
Reasonable short- to medium-term forecasts
They provide a basis against which forecasts
developed through other models can be measured
Disadvantages
Not useful for forecasting into the far future
Do not take into account external factors

Cause-and-effect approach
Advantages
Bring more information
More accurate medium- to long-term forecasts
Disadvantages
Forecasts of the explanatory time series are required



Some traditional self-projecting models
Overall trend models
The trend could be linear, exponential, parabolic, etc.
A linear trend has the form Trend_t = A + Bt
Short-term changes are difficult to track

Smoothing models
Respond to the most recent behavior of the series
Employ the idea of weighted averages
They range in the degree of sophistication
The simple exponential smoothing method:
F_{t+1} = A·z_t + (1 − A)·F_t,  0 < A < 1
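This recursion is easy to sketch in code (an illustrative implementation; the smoothing constant A and the choice of initial forecast are assumptions, not taken from the notes):

```python
def simple_exponential_smoothing(z, A=0.3):
    """One-step-ahead forecasts with F[t+1] = A*z[t] + (1 - A)*F[t].

    The initial forecast is set to the first observation, a common but
    arbitrary convention.
    """
    F = [z[0]]
    for t in range(len(z)):
        F.append(A * z[t] + (1 - A) * F[t])
    return F  # F[t] is the forecast of z[t] made one period earlier

series = [10, 12, 11, 13, 14, 13]
forecasts = simple_exponential_smoothing(series, A=0.5)
```

A larger A makes the forecasts respond more quickly to the most recent behavior of the series, at the cost of smoothing less noise away.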


Some traditional self-projecting
models (cont.)
Seasonal models
Very common
Most seasonal time series also contain long- and short-term trend patterns

Decomposition models
The series is decomposed into its separate patterns
Each pattern is modeled separately

Drawbacks of the use of traditional
models
There is no systematic approach for the
identification and selection of an appropriate
model, and therefore, the identification process is
mainly trial-and-error
There is difficulty in verifying the validity of the
model
Most traditional methods were developed from intuitive
and practical considerations rather than from a
statistical foundation
Too narrow to deal efficiently with all time series
ARIMA models
Autoregressive Integrated Moving-average
Can represent a wide range of time series
A stochastic modeling approach that can be used
to calculate the probability of a future value lying
between two specified limits



ARIMA models (Cont.)
In the 1960s Box and Jenkins recognized the
importance of these models in the area of
economic forecasting
Time Series Analysis: Forecasting and Control
George E. P. Box and Gwilym M. Jenkins
1st edition was in 1970
Often called the Box-Jenkins approach

Box-Jenkins models
Univariate and multivariate (transfer function)
Transfer function modeling
Y_t = v(B)X_t

where v(B) = v_0 + v_1B + v_2B² + …

B is the backshift operator: B^m X_t = X_{t−m}

Transfer function modeling (cont.)
The study of process dynamics can achieve:
Better control
Improved design

Methods for estimating transfer function models
Classical methods
Based on deterministic perturbations
Uncontrollable disturbances (noise) are not accounted for,
and hence, these methods have not always been successful
Statistical methods
Make allowance for noise
The Box-Jenkins methodology


Process control
Feed-forward control Feedback control

[Block diagrams: in each scheme a control equation computes the adjustment
to the compensating variable X_t needed to cancel the effect of the
disturbance N_t on the deviation of the output from its target.]
Process control (cont.)
Disturbances
Measured disturbances → feed-forward control
Unmeasured disturbances → feedback control
Process control (cont.)
The Box-Jenkins approach to control is to typify
the disturbance by a suitable time series or
stochastic model, and the inertial characteristics of
the system by a suitable transfer function model
The control equation allows the action that should
be taken at any given time to be calculated from the
present and previous states of the system
Various ways, corresponding to various levels of
technological sophistication, can be used to execute
a control action called for by the control equation
The Box-Jenkins model building
process
Model identification → Model estimation → Is the model adequate?
If no: modify the model and return to identification
If yes: generate forecasts
The Box-Jenkins model building
process (cont.)
Model identification
Autocorrelations
Partial-autocorrelations
Model estimation
The objective is to minimize the sum of squares of
errors
Model validation
Certain diagnostics are used to check the validity of
the model
Model forecasting
The estimated model is used to generate forecasts and
confidence limits of the forecasts



Important Fundamentals
A Normal process
Stationarity
Regular differencing
Autocorrelations (ACs)
The white noise process
The linear filter model
Invertibility
A Normal process (A Gaussian
process)
The Box-Jenkins methodology analyzes a time
series as a realization of a stochastic process.
The observation z_t at a given time t can be regarded as a
realization of a random variable z_t with probability
density function p(z_t)
The observations at any two times t_1 and t_2 may be
regarded as realizations of two random variables z_{t_1}, z_{t_2}
with joint probability density function p(z_{t_1}, z_{t_2})
If the probability distribution associated with any set of
times is a multivariate Normal distribution, the process is
called a normal or Gaussian process
Stationary stochastic processes
In order to model a time series with the Box-
Jenkins approach, the series has to be stationary
In practical terms, the series is stationary if it tends to
wander more or less uniformly about some fixed
level
In statistical terms, a stationary process is assumed
to be in a particular state of statistical equilibrium,
i.e., p(x_t) is the same for all t
Stationary stochastic processes (cont.)
The process is called strictly stationary
if the joint probability distribution of any m
observations made at times t_1, t_2, …, t_m is the same as
that associated with m observations made at times
t_{1+k}, t_{2+k}, …, t_{m+k}

When m = 1, the stationarity assumption implies
that the probability distribution p(z_t) is the same
for all times t


Stationary stochastic processes (cont.)
In particular, if z_t is a stationary process, then the
first difference ∇z_t = z_t − z_{t−1} and higher
differences ∇^d z_t are stationary

Most time series are nonstationary

Achieving stationarity
Regular differencing (RD)
(1st order) ∇x_t = (1 − B)x_t = x_t − x_{t−1}

(2nd order) ∇²x_t = (1 − B)²x_t = x_t − 2x_{t−1} + x_{t−2}

B is the backward shift operator
It is unlikely that more than two regular
differencings would ever be needed
Sometimes regular differencing by itself is not
sufficient and prior transformation is also needed
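Regular differencing is straightforward to implement; a minimal sketch (the sample series is made up for illustration):

```python
def difference(x, d=1):
    """Apply d-th order regular differencing, i.e. (1 - B)**d applied to x."""
    for _ in range(d):
        x = [x[t] - x[t - 1] for t in range(1, len(x))]
    return x

x = [1, 4, 9, 16, 25]        # a series with a quadratic trend
first = difference(x)        # still trending after one differencing
second = difference(x, d=2)  # constant mean after two differencings
```

Note that each pass shortens the series by one observation, and that a quadratic trend needs exactly two regular differencings to reach a constant level.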
Some nonstationary series
Some nonstationary series (cont.)
Some nonstationary series (cont.)
How can we determine the number of regular
differencings needed?
Autocorrelations (ACs)
Autocorrelations are statistical measures that indicate
how a time series is related to itself over time

The autocorrelation at lag 1 is the correlation between
the original series z_t and the same series moved
forward one period (represented as z_{t−1})
Autocorrelations (cont.)
The theoretical autocorrelation function:

ρ_k = E[(z_t − μ)(z_{t+k} − μ)] / σ_z²

The sample autocorrelation:

r_k = Σ_{t=1}^{N−k} (z_t − z̄)(z_{t+k} − z̄) / Σ_{t=1}^{N} (z_t − z̄)²,  k = 0, 1, 2, …

Autocorrelations (cont.)
A graph of the correlation values is called a
correlogram
In practice, to obtain a useful estimate of the
autocorrelation function, at least 50 observations
are needed
The estimated autocorrelations r_k would be
calculated up to a lag no larger than N/4
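The sample autocorrelation formula can be computed directly; a minimal sketch (the short example series is arbitrary, and real use would follow the guidance above of at least 50 observations):

```python
def sample_acf(z, max_lag):
    """Sample autocorrelations r_k, matching the formula above:
    r_k = sum_t (z_t - zbar)(z_{t+k} - zbar) / sum_t (z_t - zbar)**2."""
    N = len(z)
    zbar = sum(z) / N
    denom = sum((v - zbar) ** 2 for v in z)
    return [
        sum((z[t] - zbar) * (z[t + k] - zbar) for t in range(N - k)) / denom
        for k in range(max_lag + 1)
    ]

r = sample_acf([1, 2, 3, 4], max_lag=2)   # r[0] is always 1
```

Plotting r_k against k gives the correlogram discussed above.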
A correlogram of a nonstationary
time series
After one RD
After two RD
The white noise process
The Box-Jenkins models are based on the idea that
a time series can be usefully regarded as generated
from (driven by) a series of uncorrelated
independent shocks e_t:

E[e_t] = 0,  var[e_t] = σ_e²,  ρ_k = 1 for k = 0 and ρ_k = 0 for k ≠ 0

Such a sequence e_t, e_{t−1}, e_{t−2}, … is called a white
noise process

The linear filter model
A linear filter is a model that transforms the
white noise process e_t into the process that generated
the time series x_t:

White noise e_t → [ Linear filter ψ(B) ] → x_t

x_t = e_t + ψ₁e_{t−1} + ψ₂e_{t−2} + … = ψ(B)e_t
The linear filter model (cont.)
ψ(B) = 1 + ψ₁B + ψ₂B² + … = Σ_{j=0}^∞ ψ_j B^j,  with ψ₀ = 1

ψ(B) is the transfer function of the filter
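The action of a linear filter can be sketched as a convolution of the shocks with the ψ weights (an illustration; the geometrically decaying weights are made up for the example):

```python
def linear_filter(e, psi):
    """Compute x_t = sum_{j>=0} psi[j] * e_{t-j} over the available shocks.

    psi[0] should be 1, matching the convention psi_0 = 1 above.
    """
    x = []
    for t in range(len(e)):
        x.append(sum(psi[j] * e[t - j] for j in range(min(t + 1, len(psi)))))
    return x

# A unit shock at t = 0 recovers the psi weights (the impulse response)
shock = [1.0, 0.0, 0.0, 0.0]
psi = [1.0, 0.5, 0.25, 0.125]
x = linear_filter(shock, psi)
```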
The linear filter model (cont.)
The linear filter can be put in another form:

x_t = π₁x_{t−1} + π₂x_{t−2} + … + e_t = Σ_{j=1}^∞ π_j x_{t−j} + e_t

This form can be written π(B)x_t = e_t, where

π(B) = 1 − Σ_{j=1}^∞ π_j B^j = ψ^{-1}(B)
Stationarity and invertibility
conditions for a linear filter
For a linear process to be stationary,
ψ(B) must converge for |B| ≤ 1

If the current observation x_t depends on past
observations with weights that decrease as we go
back in time, the series is called invertible

For a linear process to be invertible,
π(B) must converge for |B| ≤ 1
Model building blocks
Autoregressive (AR) models
Moving-average (MA) models
Mixed ARMA models
Nonstationary models (ARIMA models)
The mean parameter
The trend parameter

Autoregressive (AR) models
An autoregressive model of order p:

x_t = φ₁x_{t−1} + φ₂x_{t−2} + … + φ_p x_{t−p} + e_t

i.e., φ(B)x_t = e_t, where φ(B) = 1 − φ₁B − φ₂B² − … − φ_pB^p

The autoregressive process can be thought of as
the output from a linear filter with a transfer
function φ^{-1}(B), when the input is white noise e_t
The equation φ(B) = 0 is called the characteristic
equation
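The stationarity condition (all roots of the characteristic equation outside the unit circle) can be checked numerically; a sketch for the AR(2) case, with parameter values chosen purely as examples:

```python
import cmath

def ar2_is_stationary(phi1, phi2):
    """Check AR(2) stationarity: both roots of the characteristic equation
    1 - phi1*B - phi2*B**2 = 0 must lie outside the unit circle."""
    if phi2 == 0:
        # Degenerates to AR(1); the single root is 1/phi1
        return abs(phi1) < 1
    # Solve -phi2*B**2 - phi1*B + 1 = 0 with the quadratic formula
    a, b, c = -phi2, -phi1, 1.0
    disc = cmath.sqrt(b * b - 4 * a * c)
    roots = [(-b + disc) / (2 * a), (-b - disc) / (2 * a)]
    return all(abs(root) > 1 for root in roots)

stationary = ar2_is_stationary(0.5, 0.3)   # True
explosive = ar2_is_stationary(0.5, 0.6)    # False
```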
Moving-average (MA) models
A moving-average model of order q:

x_t = e_t − θ₁e_{t−1} − θ₂e_{t−2} − … − θ_q e_{t−q}

i.e., x_t = θ(B)e_t, where θ(B) = 1 − θ₁B − θ₂B² − … − θ_qB^q

The moving-average process can be thought of as
the output from a linear filter with a transfer
function θ(B), when the input is white noise e_t
The equation θ(B) = 0 is called the characteristic
equation
Mixed AR and MA (ARMA) models
A moving-average process of 1st order can be
written as

x_t = (1 − θ₁B)e_t  ⇒  (1 − θ₁B)^{-1} x_t = e_t

(1 + θ₁B + θ₁²B² + θ₁³B³ + …) x_t = e_t

x_t = −θ₁x_{t−1} − θ₁²x_{t−2} − θ₁³x_{t−3} − … + e_t

Hence, if the process were really MA(1), we
would obtain a non-parsimonious representation in
terms of an autoregressive model
Mixed AR and MA (ARMA) models
(cont.)
In order to obtain a parsimonious model,
sometimes it will be necessary to include both AR
and MA terms in the model
An ARMA(p, q) model:

φ(B)x_t = θ(B)e_t,  i.e.,  x_t = (θ(B)/φ(B))·e_t

The ARMA process can be thought of as the
output from a linear filter with a transfer function
θ(B)/φ(B), when the input is white noise e_t

The Box-Jenkins model building
process
Model identification
Autocorrelations
Partial-autocorrelations
Model estimation
Model validation
Certain diagnostics are used to check the validity of the
model
Model forecasting

Partial-autocorrelations (PACs)
Partial-autocorrelations are another set of statistical
measures used to identify time series models

PACs are similar to ACs, except that, when calculating
them, the ACs of all the elements within the lag are
partialled out (Box & Jenkins, 1976)
Partial-autocorrelations (cont.)
PACs can be calculated from the values of the
ACs, where each PAC is obtained from a different
set of linear equations that describe a pure
autoregressive model of an order equal to the lag
of the partial-autocorrelation being computed

The PAC at lag k is denoted by φ_kk

The double subscript kk emphasizes that φ_kk is the
autoregressive parameter φ_k of the autoregressive model
of order k
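One standard way to obtain the φ_kk values from the ACs is the Durbin-Levinson recursion, which solves exactly these autoregressive systems lag by lag; a minimal sketch (the input autocorrelations are those of an AR(1) with φ₁ = 0.5, chosen for illustration):

```python
def pacf_from_acf(rho):
    """Partial autocorrelations phi_kk from autocorrelations rho[0..K]
    (rho[0] = 1), via the Durbin-Levinson recursion."""
    K = len(rho) - 1
    pacf, phi_prev = [], []
    for k in range(1, K + 1):
        if k == 1:
            phi_kk = rho[1]
            phi = [phi_kk]
        else:
            num = rho[k] - sum(phi_prev[j] * rho[k - 1 - j] for j in range(k - 1))
            den = 1 - sum(phi_prev[j] * rho[j + 1] for j in range(k - 1))
            phi_kk = num / den
            # Update the AR(k) coefficients from the AR(k-1) ones
            phi = [phi_prev[j] - phi_kk * phi_prev[k - 2 - j] for j in range(k - 1)]
            phi.append(phi_kk)
        pacf.append(phi_kk)
        phi_prev = phi
    return pacf

rho_ar1 = [1.0, 0.5, 0.25, 0.125]  # theoretical ACs of an AR(1), phi1 = 0.5
pacf = pacf_from_acf(rho_ar1)      # cut-off after lag 1, as expected
```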
Model identification
The sample ACs and PACs are computed for the
series and compared with the theoretical
autocorrelation and partial-autocorrelation
functions of the candidate models being investigated
Theoretical ACs and PACs
Stationarity and invertibility conditions
Stationarity and invertibility
conditions
For a linear process to be stationary,
ψ(B) must converge for |B| ≤ 1

For a linear process to be invertible,
π(B) must converge for |B| ≤ 1
Stationarity requirements for AR(1)
model
For an AR(1) to be stationary:
−1 < φ₁ < 1
i.e., the root of the characteristic equation
1 − φ₁B = 0 lies outside the unit circle
For an AR(1) it can be shown that:
ρ_k = φ₁ρ_{k−1}, which with ρ₀ = 1 has
the solution ρ_k = φ₁^k, k ≥ 0
i.e., for a stationary AR(1) model,
the theoretical autocorrelation
function decays exponentially to
zero, while the theoretical
partial-autocorrelation function has
a cut-off after the 1st lag
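The exponential decay ρ_k = φ₁^k can be checked against a simulated series (an illustrative experiment; the parameter value, sample size, and seed are arbitrary choices):

```python
import random

def simulate_ar1(phi, n, seed=0):
    """Simulate x_t = phi * x_{t-1} + e_t with standard normal shocks e_t."""
    rng = random.Random(seed)
    x, prev = [], 0.0
    for _ in range(n):
        prev = phi * prev + rng.gauss(0, 1)
        x.append(prev)
    return x

def acf1(x):
    """Sample lag-1 autocorrelation."""
    n = len(x)
    m = sum(x) / n
    denom = sum((v - m) ** 2 for v in x)
    return sum((x[t] - m) * (x[t + 1] - m) for t in range(n - 1)) / denom

x = simulate_ar1(0.7, 5000)
# For a stationary AR(1) with phi1 = 0.7, r_1 should be close to 0.7
```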
Invertibility requirements for an MA(1) model
For an MA(1) to be invertible:
−1 < θ₁ < 1
i.e., the root of the characteristic equation 1 − θ₁B = 0
lies outside the unit circle
For an MA(1) it can be shown that:

ρ₁ = −θ₁ / (1 + θ₁²),  ρ_k = 0 for k > 1

i.e., for an invertible MA(1) model, the theoretical
autocorrelation function has a cut-off after the 1st lag,
while the theoretical partial-autocorrelation function
decays exponentially to zero
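The lag-1 formula is easy to tabulate (a small illustration; note that θ₁ and 1/θ₁ yield the same autocorrelations, which is one reason the invertibility condition is needed to single out a unique model):

```python
def ma1_acf(theta, max_lag=5):
    """Theoretical autocorrelations of an MA(1):
    rho_1 = -theta / (1 + theta**2), rho_k = 0 for k > 1."""
    return [1.0, -theta / (1 + theta ** 2)] + [0.0] * (max_lag - 1)

rho = ma1_acf(0.5)   # rho[1] = -0.4, all later lags zero
same = ma1_acf(2.0)  # the non-invertible value 1/theta gives the same ACF
```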
Higher order models
For an AR model of order p > 1:
The autocorrelation function consists of a mixture of
damped exponentials and damped sine waves
The partial-autocorrelation function has a cut-off after
lag p
For an MA model of order q > 1:
The autocorrelation function has a cut-off after lag q
The partial-autocorrelation function consists of a
mixture of damped exponentials and damped sine
waves
Permissible regions for the AR and
MA parameters
Theoretical ACs and PACs (cont.)
Theoretical ACs and PACs (cont.)
Model identification
Model estimation

Model verification

