# Econometric models and economic forecasts


The model could, of course, be used to perform other forecasting experiments. For example, the effects of a change in the propensity to consume could be measured by changing the coefficient multiplying disposable income net of transfer payments in the consumption equation, and then simulating the model. Readers might want to use this model to perform their own simulation experiments.


TIME-SERIES MODELS

EXERCISES
13.1 Show that if A_1 λ_1^t and A_2 λ_2^t are both transient solutions to a model, i.e., both satisfy an equation such as (13.5), then the sum A_1 λ_1^t + A_2 λ_2^t must also be a solution.
13.2 Consider the following simple multiplier-accelerator macroeconomic model:

    C_t = a_1 + a_2 Y_{t-1}
    I_t = b_1 + b_2 (C_t − C_{t-1})
    Y_t = C_t + I_t + G_t

Note that investment is now a function of changes in consumption, rather than of changes in total GNP.
(a) Determine the characteristic equation for this model, and find the associated characteristic roots.
(b) Find the relationships between values of a_2 and b_2 that determine what kind of solution the model will have. Draw a diagram that corresponds to that of Fig. 13.1.
(c) What is the impact multiplier corresponding to a change in G_t? What is the total long-run multiplier corresponding to a change in G_t?
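The multipliers asked about in part (c) are easy to explore numerically. The sketch below simulates the model of Exercise 13.2 with made-up coefficient values (a_1, a_2, b_1, b_2 are assumptions chosen only so that the solution converges):

```python
# Simulation sketch of the multiplier-accelerator model in Exercise 13.2.
# All coefficient values below are made up for illustration.
a1, a2 = 10.0, 0.6   # consumption: C[t] = a1 + a2 * Y[t-1]
b1, b2 = 5.0, 0.3    # investment:  I[t] = b1 + b2 * (C[t] - C[t-1])

def simulate(G, periods=40, y0=50.0, c0=40.0):
    """Simulate Y[t] for a constant level of government spending G."""
    Y, C = [y0], [c0]
    for _ in range(1, periods):
        c = a1 + a2 * Y[-1]
        i = b1 + b2 * (c - C[-1])
        C.append(c)
        Y.append(c + i + G)
    return Y

base = simulate(G=20.0)
shocked = simulate(G=21.0)   # government spending raised by one unit

# Impact multiplier: first-period response of Y to the change in G.
print(shocked[1] - base[1])        # 1.0, since C[1] depends only on Y[0]
# Long-run multiplier: difference in steady states, which is 1/(1 - a2).
print(shocked[-1] - base[-1])      # approaches 1/(1 - 0.6) = 2.5
```

With these values the characteristic roots are complex with modulus less than 1, so the simulated path is a damped oscillation, which is why 40 periods are enough for the steady-state comparison.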
13.3 The following equations describe a simple "cobweb" model of a competitive market:

    Demand:    Q_t^D = a_1 + a_2 P_t        a_2 < 0
    Supply:    Q_t^S = b_1 + b_2 P_{t-1}    b_2 > 0

When the market is in equilibrium, Q_t^D = Q_t^S. Now suppose that the market is temporarily out of equilibrium, i.e., that Q_t^D ≠ Q_t^S temporarily.
(a) Show that the price will converge stably to an equilibrium value if |b_2/a_2| < 1.
(b) Show that the path to equilibrium will be oscillatory if b_2 > 0 and will not be oscillatory if b_2 < 0.
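The price dynamics in Exercise 13.3 can be checked numerically. Setting supply equal to demand one period later gives the recursion P_t = (b_1 − a_1)/a_2 + (b_2/a_2)P_{t-1}; the sketch below uses made-up coefficients satisfying a_2 < 0, b_2 > 0, and |b_2/a_2| < 1:

```python
# Numerical sketch of the cobweb model in Exercise 13.3 (made-up numbers).
def price_path(a1, a2, b1, b2, p0, periods=30):
    """Iterate P[t] = (b1 - a1)/a2 + (b2/a2) * P[t-1]."""
    path = [p0]
    for _ in range(periods):
        path.append((b1 - a1) / a2 + (b2 / a2) * path[-1])
    return path

path = price_path(a1=100.0, a2=-2.0, b1=10.0, b2=1.0, p0=40.0)
p_star = (10.0 - 100.0) / (-2.0 - 1.0)   # equilibrium price = 30.0

print(path[-1])                                      # close to 30.0
# Deviations alternate in sign because b2/a2 < 0, so the path oscillates:
print((path[1] - p_star) * (path[2] - p_star) < 0)   # True
```

Here |b_2/a_2| = 0.5, so each deviation from equilibrium is halved (and flipped in sign) every period, which is exactly the stable oscillatory case of parts (a) and (b).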

In the first two parts of this book we saw how econometric models, both single- and multi-equation, can be constructed and used to explain and forecast the behavior of one or more variables. In Part Three we are again interested in constructing models and using them for forecasting, but the models will differ from those developed earlier. We no longer predict the variable of interest by relating it to a set of other variables in a causal framework; instead, the prediction is based solely on the past behavior of the variable itself.
Consider the time series y(t) drawn in the figure on page 414, which might represent the historical performance of some economic or business variable: a stock market index, an interest rate, perhaps a production index, or the price or sales volume of some commodity. y(t) might have moved up or down partly in response to changes in prices, personal income, and interest rates (or so we might believe). However, much of its movement may have been due to factors that we cannot explain, such as the weather, changes in taste, or simply seasonal (or aseasonal) cycles in spending.
It may be difficult or impossible to explain the movement of y(t) through the use of a structural model. This might happen if, for example, data are not available for those explanatory variables which are believed to affect y(t), or if, even with data available, the estimation of a regression model for y(t) were to yield estimated coefficients with standard errors so large as to make most of them insignificant and the standard error of forecast unacceptably large.

Even if we could estimate a statistically significant regression equation, it might not be usable for forecasting purposes. To obtain a forecast for y(t) from a regression equation, explanatory variables that are not lagged
must themselves be forecasted, and this may be more difficult than forecasting y(t) itself. The standard error of forecast for y(t) with future values of the explanatory variables known may be small. However, when the future values of the explanatory variables are unknown, their forecast errors may be so large as to make the total forecast error for y(t) too large to be acceptable.
Thus there are situations where we seek an alternative means of obtaining a forecast of y(t). Can we observe the time series in the figure and draw some inference about its probable future behavior? For example, is there some kind of overall upward trend in y(t) which, because it has dominated the past behavior of the series, might dominate its future behavior? Or does the series exhibit cyclical behavior which we could extrapolate into the future? If systematic behavior of this type is present, we can attempt to construct a model for the time series which does not offer a structural explanation for its behavior in terms of other variables but does replicate its past behavior in a way that might help us forecast its future behavior. A time-series model accounts for patterns in the past movements of a variable and uses that information to predict its future movements. In a sense a time-series model is just a sophisticated method of extrapolation. Yet, as we will see in this part of the book, it sometimes provides an effective tool for forecasting.
In this book we have divided forecasting models into three general classes, each of which involves a different level of comprehension about the real-world processes that one is trying to model. In Part One we discussed single-equation regression models, where the variable of interest is explained by a single function (linear or nonlinear) of explanatory variables. In Part Two we examined multi-equation models, where two or more endogenous variables are related to each other (and perhaps to one or more exogenous variables) through a set of equations, which can be solved simultaneously to produce forecasts over time. In this third part of the book we focus on time-series models, in which we have no structural knowledge about the real-world causal relationships that affect the variable we are trying to forecast.
Often a choice must be made as to which type of model should be developed to best make a forecast. This choice may be difficult and will depend not only on how much we know about the workings of the real-world process but also on how much time and money can be spent on the modeling effort. The construction of a multi-equation structural model might yield a deeper understanding of the process, as well as the ability to make a better forecast; however, in some cases these gains may be outweighed by the costs of building and estimating such a model. A time-series model will usually be chosen when little is known about the determinants of the variable of primary concern and a sufficiently large amount of data is available. If, for example, one needed to forecast the monthly production of a commodity, one choice would be to build a regression model; another choice would be to build a time-series model. As we will see, one can also combine time-series analysis with regression analysis, by constructing a regression model that relates the variable of concern to a set of economic variables and then constructing a time-series model to describe the behavior of the residual term of the regression.
The following chapters will be an introduction to the science and art of time-series analysis for purposes of forecasting. The models and techniques that we discuss include, for example, the models developed by Box and Jenkins,¹ which have found wide application to economic and business forecasting. Since time-series analysis builds on the development of the single-equation regression model, we have placed it in the last part of the book, even though this class of models offers no explanation of the variable's behavior in terms of a structural model of the real world.
To forecast a short-term interest rate, for example, one might use a regression model that relates the interest rate to other economic variables, or one might instead construct a time-series model that describes the random nature of the variable's movements. A time-series model, like most regression models, is an equation containing a number of coefficients that must be estimated. The time-series model, however, is usually nonlinear in the coefficients, so that a nonlinear version of ordinary least squares is necessary for estimation.

Chapter 14 begins with a brief survey of smoothing methods and of simple methods for extrapolating time series. In effect deterministic models of time series, these extrapolation techniques have been used widely for many years, and for some applications they provide a simple and yet acceptable means of forecasting.

¹ G. E. P. Box and G. M. Jenkins, Time Series Analysis (San Francisco: Holden-Day, 1970).

In Chapter 15 we present a brief introduction to the nature of stochastic time series. We discuss how stochastic processes are generated, what they look like, and, most important, how they are described. We also discuss some of the characteristics of stochastic processes and in particular develop the concept of stationarity. Then we describe autocorrelation functions and show how they can be used as a means of describing time series and as a tool for testing their properties. Finally, we discuss methods of testing for stationarity, and we discuss the concept of co-integrated time series. The concepts and tools developed in this chapter are essential to the discussion of time-series models in the chapters that follow.
Chapter 16 develops linear models for time series, including moving average models, autoregressive models, and mixed autoregressive-moving average models for stationary time series. We show how some nonstationary time series can be differenced one or more times so as to produce a stationary series. This enables us to develop a general integrated autoregressive-moving average model (ARIMA model). Finally, we show how autocorrelation functions can be used to specify and characterize a time-series model.
Chapters 17 and 18 deal with the use of time-series models to make forecasts. Chapter 17 explains how the parameters of a time-series model are estimated and how a specification of the model can be verified. Chapter 18 discusses how the model can then be used to produce a forecast. We also show how forecasts of time series can be updated to account for newly arriving information. The last part of Chapter 18 deals with forecast errors and shows how confidence intervals can be determined for forecasts.
The last chapter of Part Three develops some examples of applications of time-series models, taking the reader step by step through the construction of several time-series models and their application to forecasting problems.

CHAPTER 14

SMOOTHING AND EXTRAPOLATION OF TIME SERIES

As explained in the introduction to Part Three, a time-series model is a sophisticated method of extrapolating time-series data. There are times, however, when less sophisticated methods of extrapolation can be used for forecasting purposes. For example, a forecast of a time series might be needed so quickly that time and resources do not permit the use of formal modeling techniques, or the series may follow its trend so closely that a simple extrapolation is adequate, thus obviating the need for a more elaborate model. In the first section of this chapter we discuss some simple (and not so simple) methods of extrapolation. These extrapolation techniques represent deterministic models of time series.

There are also situations when it is desirable to smooth a time series and thereby eliminate some of the more volatile short-term fluctuations. Smoothing might be done before making a forecast or simply to make the time series easier to interpret and analyze. Smoothing might also be done to remove seasonal fluctuations, i.e., to deseasonalize a time series. We will discuss smoothing and seasonal adjustment in the second section of this chapter.

14.1 SIMPLE EXTRAPOLATION MODELS

We begin with simple models that can be used to forecast a time series on the basis of its own past behavior. These models are deterministic in that no reference is made to the sources or nature of the underlying randomness in the series. Essentially the models involve extrapolation techniques that have been standard tools of forecasting for many years. Although they usually do not provide as much forecasting accuracy as the modern stochastic time-series models, they often provide a simple, inexpensive, and still quite acceptable means of forecasting.

Most of the series that we encounter are regular time series.¹ A typical time series might be given by Fig. 14.1. We denote the series itself by y_t, so that y_1 represents the first observation, y_2 the second, and y_T the last observation for the series. Our objective is to model the series y_t and use the model to forecast y_t beyond the last observation y_T. We denote the forecast one period ahead (period T + 1) by ŷ_{T+1}.

If the number of observations is not too large, the simplest and most complete representation of y_t would be given by a polynomial of degree one less than the number of observations; i.e., we could describe the series as a function of time f(t), where

    f(t) = a_0 + a_1 t + a_2 t^2 + ⋯ + a_{T-1} t^{T-1}                         (14.1)

Such a polynomial (if the a's are chosen correctly) will pass through every point of the series; i.e., with T observations we can be sure that f(t) will equal y_t at every time t from 1 to T. But how useful would this representation be for forecasting? For example, will the forecast

    f(T + 1) = a_0 + a_1(T + 1) + a_2(T + 1)^2 + ⋯ + a_{T-1}(T + 1)^{T-1}      (14.2)

be close to the actual future value y_{T+1}? Unfortunately, we have no way of knowing. The polynomial, although it passes through every observed point, need bear no relation to the process that generated the series, and extrapolating it beyond the sample can give wild results; an exact polynomial fit is therefore of little use for forecasting.

14.1.1 Simple Extrapolation Models

One basic characteristic of a series y_t is its long-run growth pattern. If we believe that an upward trend exists and will continue (and there may not be any reason why we should), we can construct a simple model that describes that trend and can be used to forecast y_t.

The simplest extrapolation model is the linear trend model. If we believe that a series y_t will increase in constant absolute amounts each time period, we can predict y_t by fitting the trend line

    ŷ_t = c_1 + c_2 t                                                          (14.3)

where t is time and ŷ_t is the value of ŷ at time t. Here t is usually chosen to equal 0 in the base period (first observation) and to increase by 1 during each successive period. For example, if we determine by regression that

    ŷ_t = 27.5 + 1.2t

we can predict that the value of ŷ in period t + 1 will be 1.2 units higher than the previous value.

It may be more realistic to assume that the series y_t grows with constant percentage increases, rather than constant absolute increases. This assumption implies that y_t follows an exponential growth curve:²

    y_t = Ae^{rt}                                                              (14.4)

Here A and r would be chosen to maximize the correlation between f(t) and y_t. A forecast one period ahead would then be given by

    ŷ_{T+1} = f(T + 1)                                                         (14.5)

that is,

    ŷ_{T+1} = Ae^{r(T+1)}                                                      (14.6)

This is illustrated in Fig. 14.2. The parameters A and r can be estimated by taking the logarithms of both sides of Eq. (14.4) and fitting the log-linear regression equation

    log y_t = c_1 + c_2 t                                                      (14.7)

where c_1 = log A and c_2 = r.

¹ In Part Three of the book we use small letters, for example, y_t, to denote time series.
² Note that in the exponential growth model the logarithm of y_t is assumed to grow at a constant rate: if y_t = Ae^{rt}, then ln y_t = ln A + rt, and log y_{t+1} − log y_t = r.
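Both the linear trend model and the log-linear regression for the exponential growth curve can be fitted with the closed-form simple-regression formulas. A minimal sketch, with sample data invented for illustration:

```python
import math

def ols_line(t, y):
    """Closed-form simple regression of y on t; returns (intercept, slope)."""
    n = len(t)
    tb, yb = sum(t) / n, sum(y) / n
    slope = sum((ti - tb) * (yi - yb) for ti, yi in zip(t, y)) / \
            sum((ti - tb) ** 2 for ti in t)
    return yb - slope * tb, slope

# Linear trend: fit y_hat = c1 + c2 * t on an exactly linear series.
t = list(range(10))
y = [27.5 + 1.2 * ti for ti in t]
c1, c2 = ols_line(t, y)
print(round(c1, 4), round(c2, 4))        # 27.5 1.2

# Exponential growth: regress log y on t, so that c1 = log A and c2 = r.
y_exp = [2.0 * math.exp(0.05 * ti) for ti in t]
c1e, c2e = ols_line(t, [math.log(v) for v in y_exp])
A, r = math.exp(c1e), c2e
forecast = A * math.exp(r * len(t))      # one period beyond the sample
print(round(forecast, 4))                # 2 * exp(0.5), about 3.2974
```

Because the sample series are exact trend lines, the regressions recover the generating parameters exactly; with real data the fitted coefficients would of course carry sampling error.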

[FIGURE 14.2  Exponential growth curve.]

A third extrapolation method is based on the autoregressive trend model

    ŷ_t = c_1 + c_2 y_{t-1}                                                    (14.8)

In using such an extrapolation procedure, one has the option of fixing c_1 = 0, in which case c_2 represents the rate of change of the series y. If, on the other hand, c_2 is set equal to 1, with c_1 not equal to 0, the extrapolated series will increase by the same absolute amount each time period. The autoregressive trend model is illustrated in Fig. 14.3 for three different values of c_2 (in all cases c_1 = 1).

A variation of this model is the logarithmic autoregressive trend model

    log ŷ_t = c_1 + c_2 log y_{t-1}                                            (14.9)

If c_1 is fixed to be 0, then the value of c_2 is the compounded rate of growth of the series y. Compound growth extrapolations based on the autoregressive trend models are commonly used as a simple means of forecasting.

Note that the models described above basically involve regressing y_t (or log y_t) against a function of time (linear or exponential) and/or itself lagged. Slightly more sophisticated models can be developed by making that function slightly more complicated. As examples, let us examine two other simple extrapolation models, the quadratic trend model and the logistic growth curve.

The quadratic trend model is a simple extension of the linear trend model and involves adding a term in t^2:

    ŷ_t = c_1 + c_2 t + c_3 t^2                                                (14.10)

If c_2 and c_3 are both positive, y_t will always be increasing, and even more rapidly as time goes on. If c_2 is negative and c_3 positive, y_t will at first decrease but later increase; if both c_2 and c_3 are negative, y_t will always decrease. The various cases are illustrated in Fig. 14.4 (with c_1 > 0 in each case). Note that even if the data show that y_t has generally been increasing over time, estimation of Eq. (14.10) might yield a positive value for c_3 but a negative value for c_2. This can occur (as shown in Fig. 14.4) because the data usually span only a portion of the trend curve.

A somewhat more complicated model, at least in terms of its estimation, is the logistic curve, given by

    y_t = 1 / (k + ab^t)        b > 0                                          (14.11)

[FIGURE 14.4  Quadratic trend curves (including the case c_2 < 0, c_3 > 0).]
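The autoregressive trend model of Eq. (14.8) extrapolates by feeding each forecast back in as the lagged value. A short sketch (the starting value and coefficients are made up for illustration):

```python
# Extrapolation with the autoregressive trend model, Eq. (14.8):
# y_hat[t] = c1 + c2 * y[t-1], iterated forward from the last observation.
def extrapolate_ar(y_last, c1, c2, periods):
    path, y = [], y_last
    for _ in range(periods):
        y = c1 + c2 * y
        path.append(y)
    return path

# c2 = 1: the series grows by the same absolute amount c1 each period.
print(extrapolate_ar(100.0, c1=5.0, c2=1.0, periods=3))
# [105.0, 110.0, 115.0]

# c1 = 0: the series grows by a constant factor c2 (compound growth).
print([round(v, 2) for v in extrapolate_ar(100.0, c1=0.0, c2=1.02, periods=2)])
# [102.0, 104.04]
```

The same function applied to log y reproduces the logarithmic autoregressive model of Eq. (14.9), since that model is the autoregression of Eq. (14.8) written in logarithms.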

This equation is nonlinear in the parameters (k, a, and b) and therefore must be estimated using a nonlinear estimation procedure.³ While this can add computational expense, there are some cases in which it is worth it. As shown in Fig. 14.5, Eq. (14.11) represents an S-shaped curve which might be used to represent the sales of a product that will someday saturate the market (so that the total stock of the good in circulation will approach some plateau, or, equivalently, so that the flow of sales will approach some fixed level).

Other S-shaped curves can be used in addition to the logistic curve. One very simple function with an S shape that can be used to model sales saturation patterns is given by

    y_t = e^{a − b/t}                                                          (14.12)

Note that if we take the logarithms of both sides, we have an equation linear in the parameters a and b that can be estimated using ordinary least squares:

    log y_t = a − b/t                                                          (14.13)

This curve is also shown in Fig. 14.5. Note that it begins at the origin and rises more steeply than the logistic curve.

[FIGURE 14.5  S-shaped curves.]

Example 14.1 Forecasting Department Store Sales

In this example simple extrapolation models are used to forecast monthly retail sales of department stores. The time series is listed below, where monthly observations are seasonally adjusted and cover the period from January 1968 to March 1974. The units of measurement are millions of dollars, and the source of the data is the U.S. Department of Commerce.

                 1968    1969    1970    1971    1972    1973    1974
    January     2,582   2,839   3,034   3,287   3,578   4,121   4,456
    February    2,621   2,876   3,045   3,336   3,650   4,233   4,436
    March       2,690   2,881   3,066   3,427   3,664   4,439   4,699
    April       2,635   2,967   3,077   3,413   3,643   4,167
    May         2,676   2,944   3,046   3,503   3,838   4,326
    June        2,714   2,939   3,094   3,472   3,792   4,329
    July        2,834   3,014   3,053   3,511   3,899   4,423
    August      2,789   3,031   3,071   3,618   3,845   4,351
    September   2,768   2,995   3,186   3,554   4,007   4,406
    October     2,785   2,998   3,167   3,641   4,092   4,357
    November    2,886   3,012   3,230   3,607   3,937   4,485
    December    2,842   3,031     —       —     4,008   4,445

One might wish to forecast monthly sales for April, May, and the months following in 1974. For this example, we extrapolate sales for April 1974. The results of four regressions associated with four of the trend models described above are listed below. Standard regression statistics are shown, with t statistics in parentheses.

Linear trend model:

    SALES_t = 2,463.1 + 26.70t                                                 (14.14)
              (84.9)    (39.5)

    R² = .955    F(1/73) = 1,557    s = 126.9    DW = .38

Logarithmic linear trend model (exponential growth):

    log SALES_t = 7.849 + .0077t                                               (14.15)
                  (1,000)  (52.6)

    R² = .974    F(1/73) = 2,750    s = .027    DW = .56

Autoregressive trend model:

    SALES_t = 4.918 + 1.007 SALES_{t-1}                                        (14.16)
              (.09)    (65.05)

    R² = .983    F(1/72) = 3,829    s = 78.07    DW = 2.82

Logarithmic autoregressive trend model:

    log SALES_t = .0188 + .9987 log SALES_{t-1}                                (14.17)
                  (.16)    (70.97)

    R² = .985    F(1/72) = 4,524    s = .021    DW = 2.80

³ A discrete-time approximation to the logistic curve can be estimated using ordinary least squares; the approximating equation is a discrete-time version of the differential equation dy/dt = c_2 y(k − y), whose solution has the form of Eq. (14.11). The parameter c_2 should always be less than 1 and would typically be in the vicinity of .05 to .5.
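Each fitted equation is turned into a one-period extrapolation by inserting t = 75 (April 1974) or the March 1974 value of 4,699. The sketch below does this with the rounded coefficients printed above; because of that rounding, the results only approximate the forecasts quoted in the text:

```python
import math

# One-period extrapolations from the four fitted trend equations (14.14)
# to (14.17), using the rounded published coefficients.
march_1974 = 4699.0          # last observation, at t = 74

linear = 2463.1 + 26.70 * 75                              # Eq. (14.14)
loglin = math.exp(7.849 + .0077 * 75)                     # Eq. (14.15)
ar = 4.918 + 1.007 * march_1974                           # Eq. (14.16)
logar = math.exp(.0188 + .9987 * math.log(march_1974))    # Eq. (14.17)

print(round(linear, 1))   # 4465.6
```

The linear-trend value lands within rounding error of the text's forecast; the other three are more sensitive to the rounded coefficients, so they should be read only as illustrations of the mechanics.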

In the first regression, a time variable running from 0 to 74 was constructed and then used as the independent variable. When t = 75 is placed in the right-hand side of the equation

    SALES = 2,463.1 + 26.70t                                                   (14.18)

the resulting forecast is 4,465.8. The use of the second, log-linear equation yields a forecast of 4,751.7. The third regression, based on the autoregressive model, yields an extrapolated value for April 1974 of 4,716.17. If the constant term were dropped from Eq. (14.16), the extrapolated value would be 4,728.24. The fourth regression result is based on the logarithmic autoregressive model; the extrapolated value in this case is 4,715.6. If one were to calculate a compounded growth rate for the series and extrapolate on the basis that the growth rate remains unchanged, the extrapolated value would be 4,719.3.

The simulated and actual series are plotted for each of the four extrapolation models in Fig. 14.6a and b. One can see from the figure that the two autoregressive models are closer to the actual series at the end of the period. Any of the other trend models could be used to extrapolate the data; for example, the reader might try to calculate a forecast based on the quadratic trend model (see Exercise 14.-).

[FIGURE 14.6  Simulated and actual sales.]
Simple extrapolation methods such as those used in the preceding example are frequently the basis for making casual long-range forecasts of variables ranging from GNP to population to pollution indices. Although they can be useful as a way of quickly formulating initial forecasts, they usually provide little forecasting accuracy. The analyst who estimates an extrapolation model is at least advised to calculate a standard error of forecast and a forecast confidence interval following the methods presented in Chapter 8. More important, one should realize that there are alternative models that can be used to obtain forecasts with smaller standard errors.
14.1.2 Moving Average Models

Another class of deterministic models that are often used for forecasting consists of moving average models. As a simple example, assume that we are forecasting a monthly time series. We might use the model

    ŷ_t = (1/12)(y_{t-1} + y_{t-2} + ⋯ + y_{t-12})                             (14.19)

Then, a forecast one period ahead would be given by

    ŷ_{T+1} = (1/12)(y_T + y_{T-1} + ⋯ + y_{T-11})                             (14.20)

The moving average model is useful if we believe that a likely value for our series next month is a simple average of its values over the past 12 months. It may be unrealistic, however, to assume that a good forecast of y would be given by a simple average of its past values.
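The forecast of Eq. (14.20) is just the mean of the most recent twelve observations. A minimal sketch, on an invented two-year monthly series:

```python
# Simple moving average forecast, Eq. (14.20): the next-period forecast is
# the average of the last n observations (n = 12 for monthly data).
def moving_average_forecast(y, n=12):
    return sum(y[-n:]) / n

y = [100 + m for m in range(24)]      # 24 months of made-up data
print(moving_average_forecast(y))     # average of 112..123, i.e. 117.5
```

Note that on this steadily rising series the forecast (117.5) sits well below the latest observation (123), which previews the bias discussed next for trending data.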

It is often more reasonable to have more recent values play a greater role than earlier values, in which case recent values should be weighted more heavily in the moving average. A simple model that accomplishes this is the exponentially weighted moving average (EWMA) model:

    ŷ_{T+1} = αy_T + α(1 − α)y_{T-1} + α(1 − α)²y_{T-2} + ⋯                    (14.21)

Here α is a number between 0 and 1 that indicates how heavily we weight recent values relative to older ones. With α = 1, for example, our forecast becomes

    ŷ_{T+1} = y_T                                                              (14.22)

so that we ignore all values of y that occurred before y_T. As α becomes smaller, we place greater emphasis on more distant values of y. Note that Eq. (14.21) represents a true average, since

    α Σ_{τ=0}^∞ (1 − α)^τ = α / [1 − (1 − α)] = 1                              (14.23)

so that the weights indeed sum to unity.

If a series has an upward (downward) trend, the EWMA model will underpredict (overpredict) future values of y. This will indeed be the case, since the model averages past values to produce a forecast. If y_t has been growing steadily, ŷ_{T+1} must be smaller than the most recent value y_T, and if the series continues to grow steadily in the future, ŷ_{T+1} will be an underestimate. For this reason it is usually best to remove any trend from the data before using the EWMA technique. Once an untrended initial forecast has been obtained, the trend can be added back in to obtain a final forecast.

One can also forecast more than one period ahead using an exponentially weighted model. The logical extension of the EWMA replaces the unknown intermediate values by their own forecasts:

    ŷ_{T+l} = αŷ_{T+l-1} + α(1 − α)ŷ_{T+l-2} + ⋯ + α(1 − α)^{l-1}ŷ_{T+1}
              + α(1 − α)^l y_T + α(1 − α)^{l+1}y_{T-1} + ⋯                     (14.24)

As an example, consider a forecast two periods ahead (l = 2), which would be given by

    ŷ_{T+2} = αŷ_{T+1} + α(1 − α)y_T + α(1 − α)²y_{T-1} + ⋯
            = α[αy_T + α(1 − α)y_{T-1} + ⋯] + α(1 − α)y_T + α(1 − α)²y_{T-1} + ⋯
            = α Σ_{τ=0}^∞ (1 − α)^τ y_{T−τ} = ŷ_{T+1}                          (14.25)

Note that the two-period forecast is the same as the one-period forecast. The weightings on y_T, y_{T-1}, . . . in the EWMA model are the same as they were before, but we are now extrapolating the average ahead an extra period. In fact, it is not difficult to show (see Exercise 14.4) that the l-period forecast ŷ_{T+l} is also given by Eq. (14.25).
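The EWMA forecast of Eq. (14.21), and the result that the two-period forecast of Eq. (14.24) collapses back to it, can be checked numerically. The sketch below truncates the infinite sum at the start of an invented sample, which is harmless because the omitted weights are tiny:

```python
# EWMA forecast, Eq. (14.21), truncated at the start of the sample, plus a
# check that the two-period forecast of Eq. (14.24) equals the one-period
# forecast, as shown in Eq. (14.25).
def ewma_one_step(y, alpha):
    return sum(alpha * (1 - alpha) ** tau * v
               for tau, v in enumerate(reversed(y)))

def ewma_two_step(y, alpha):
    f1 = ewma_one_step(y, alpha)
    # Eq. (14.24) with l = 2: the unknown y_{T+1} is replaced by f1, and
    # every remaining weight picks up one extra factor of (1 - alpha).
    tail = sum(alpha * (1 - alpha) ** (tau + 1) * v
               for tau, v in enumerate(reversed(y)))
    return alpha * f1 + tail

y = [50.0, 52.0, 51.0, 53.0, 54.0]
print(round(ewma_one_step(y, alpha=1.0), 4))   # 54.0: only y_T gets weight
print(round(ewma_one_step(y, alpha=0.5), 4))   # 51.4375
print(abs(ewma_two_step(y, 0.4) - ewma_one_step(y, 0.4)) < 1e-12)   # True
```

The last line is the algebra of Eq. (14.25) in numerical form: the tail equals (1 − α) times the one-period forecast, so the two-period forecast is α f_1 + (1 − α) f_1 = f_1.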
The moving average forecasts represented by Eqs. (14.20), (14.21), and (14.25) are adaptive forecasts. By "adaptive" we mean that they automatically adjust themselves to the most recently available data. Consider, for example, a simple four-period moving average, and suppose y_20 in Fig. 14.7 represents the most recent data point. Then our forecast will be given by

    ŷ_21 = ¼(y_20 + y_19 + y_18 + y_17)                                        (14.26)

and a forecast two periods ahead will be given by

    ŷ_22 = ¼(ŷ_21 + y_20 + y_19 + y_18)
         = (5/16)y_20 + (5/16)y_19 + (5/16)y_18 + (1/16)y_17                   (14.27)

[FIGURE 14.7  Adaptive forecasts from a four-period moving average.]

These forecasts are represented by crosses in Fig. 14.7. If y_21 were known, we would instead compute

    ŷ_22 = ¼(y_21 + y_20 + y_19 + y_18)

This forecast is represented by a circled cross in Fig. 14.7. Now suppose that the actual value of y_21 turns out to be larger than the predicted value, i.e.,

    y_21 > ŷ_21
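The weights stated in Eq. (14.27) follow from substituting Eq. (14.26) for the unknown y_21, and they can be verified directly (illustrative values are made up):

```python
# Check of the two-period-ahead four-period moving average, Eq. (14.27):
# substituting the forecast of Eq. (14.26) for the unknown y_21 gives
# weights of 5/16 on the three most recent points and 1/16 on the fourth.
y17, y18, y19, y20 = 8.0, 6.0, 7.0, 9.0

f21 = (y20 + y19 + y18 + y17) / 4                 # Eq. (14.26)
f22 = (f21 + y20 + y19 + y18) / 4                 # Eq. (14.27), first form
direct = (5 * y20 + 5 * y19 + 5 * y18 + y17) / 16 # Eq. (14.27), second form

print(abs(f22 - direct) < 1e-12)   # True
```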

The actual value of y_22 is, of course, not known, but we would expect that the forecast based on the observed y_21 would provide a better forecast than ŷ_22 because of the extra information used in its construction.

Although the moving average models described above are certainly useful, they do not provide us with information about forecast confidence. The reason is that no regression is used to estimate the model, so that we cannot calculate standard errors, nor can we describe or explain the stochastic (or unexplained) component of the time series. It is this stochastic component that creates the error in our forecast. Unless the stochastic component is explained through the modeling process, little can be said about the kinds of forecast errors that might be expected.

14.2 SMOOTHING AND SEASONAL ADJUSTMENT

Smoothing techniques provide a means of removing, or at least reducing, volatile short-term fluctuations in a time series. This can be useful, since it is often easier to discern trends and cyclical patterns and otherwise visually analyze a smoothed series. Seasonal adjustment is a special form of smoothing; it removes seasonal (cyclical) oscillations from the series rather than irregular short-term fluctuations.

14.2.1 Smoothing Techniques

In the last section we discussed moving average models (simple and exponentially weighted) in the context of forecasting, but these models also provide a basis for smoothing time series. For example, one of the simplest ways to smooth a series is to take an n-period moving average. Denoting the original series by y_t and the smoothed series by ỹ_t, we have

    ỹ_t = (1/n)(y_t + y_{t-1} + ⋯ + y_{t-n+1})                                 (14.28)

Of course, the larger the n, the smoother the ỹ_t will be. One problem with this moving average is that it uses only past (and current) values of y_t to obtain each value of ỹ_t. This problem is easily remedied by using a centered moving average; for example, a five-period centered moving average is given by

    ỹ_t = (1/5)(y_{t+2} + y_{t+1} + y_t + y_{t-1} + y_{t-2})                   (14.29)

Exponential smoothing simply involves the use of the exponentially weighted moving average to assign greater weights to recent values of y_t:

    ỹ_t = αy_t + α(1 − α)y_{t-1} + α(1 − α)²y_{t-2} + ⋯                        (14.30)

where the summation in Eq. (14.30) extends back through all the available data. In fact, ỹ_t can be calculated much more easily if we write Eq. (14.30) for ỹ_{t-1} and multiply through by (1 − α):

    (1 − α)ỹ_{t-1} = α(1 − α)y_{t-1} + α(1 − α)²y_{t-2} + ⋯                    (14.31)

Now, subtracting Eq. (14.31) from Eq. (14.30), we obtain a recursive formula for the computation of ỹ_t:

    ỹ_t = αy_t + (1 − α)ỹ_{t-1}                                                (14.32)

Note that the closer α is to 1, the more heavily the current value of y_t is weighted; smaller values of α imply a more heavily smoothed series.

One might wish to heavily smooth a series but not give very much weight to past data points. In that case the use of Eq. (14.32) with a small value of α (say .1) would not be acceptable. Instead, one can apply double exponential smoothing. As the name implies, the singly smoothed series ỹ_t from Eq. (14.32) is just smoothed again:

    ỹ̃_t = αỹ_t + (1 − α)ỹ̃_{t-1}                                              (14.33)

In this way a larger value of α can be used, and the resulting series ỹ̃_t will still be heavily smoothed.
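The recursive smoother and its doubly smoothed variant can be sketched in a few lines. The recursion needs a starting value, which the text leaves open; initializing with the first observation is an assumption made here for simplicity:

```python
# Single and double exponential smoothing (the recursive form above).
def exp_smooth(y, alpha):
    """s[t] = alpha*y[t] + (1-alpha)*s[t-1]; s[0] = y[0] by assumption."""
    s = [y[0]]
    for val in y[1:]:
        s.append(alpha * val + (1 - alpha) * s[-1])
    return s

y = [10.0, 12.0, 11.0, 13.0, 12.0, 14.0]
single = exp_smooth(y, alpha=0.5)
double = exp_smooth(single, alpha=0.5)   # smooth the smoothed series again

print(single)   # [10.0, 11.0, 11.0, 12.0, 12.0, 13.0]
# The doubly smoothed series varies less than the singly smoothed one:
rng = lambda s: max(s) - min(s)
print(rng(double) < rng(single) < rng(y))   # True
```

The comparison of ranges on the last line illustrates the point of double smoothing: the same α = .5 gives a visibly calmer series the second time through.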
*ffi
be modismoolhingformula of E+ (-1412)ranralso
til;;;onential
ttr
in dre.long-run trend lsecularincreasc
fied by incorporating
smoolh
exponenti'l
^'"'ug' "o'g'i s for H,lt's two-parameter
decline)of the series.rnis is tne.oa'si

smoothed;:ru.t;:,fl'l'#:;':'ff i,f."H::i'il:
the
i,s;;,h"a'Now
:
anddepends
9".ry" :-9:TlT"ifi;;;^ilfie'heavier
between 0 and I (again,the sm
9t = alt I

h :;U r+ l tt' l

Y''r Y' t* Y'-z\

the exponc'ntiallywciShlc(l
smoothingsimply involvesthe use of
Exponential

jt = slt * a(I

The actual valte of y22is, of course,not known, but \ryewould expectthat
r2
would provide a betterforecastthanf22 becauseof the exfia information usedin
behavior.
Although the moving averagemodels describedabove are certainly useful,
,
they do not provide us with information about
forccastconf.dence.The reason is
that no regressionis used to estimatethe model, so thai we cannot calculate
standarderrors,nor can we describeor explain the stochastic(or unexplained)
component of the time series.It is this stochasticcomponent that createsthe
enol in our forecast.Unlessthe stochasticcomponenti; explainedthrough
the
modeling process,little can be said about the kinds of forecist errorsthathight
be expected.

14.2.1

dv(r(41(
by ttsing.ace,ntercd,novlkfl
valuc ol t,.'f'ltfs lrrrrlrlctttlr catlly lenlcdled
l)y
glvel)
ls
livc'pet.kxlccllteicdmovlngavcragc
i,"t.-tiitr",,r

h=t(fu-

thesmoothins

( l - d) ( it - r + r r - r )

\t4.J4l

l1 - :ilrFl

(14.)5

i-ll+

Avera

",,"tioi\$1!."1?f',\$'J.::T#1|*\$ff:Ti,X1,'J#':ff-qJ'ffif:ied

Here rr is a slnoothcd scrics rcprcscnlitUllllc lrcn(|,
i,c,, ,tvcr.tgcralc ot

illcrcasc,
in rhe smoothedseriesi,. This trendis atldcdi,,r
*ir.u .i,nipuli,f tn" ,nluu,1.,",i
i: Eq. (1a.la), therebypreventing from Oeuiating
l
corii,;e.abiy{!onr
::l.t.t
recent
valuesof the odginal seriesy,. This is pani."tu.ty
ure"t t iiit.,e smoothing
method is going to be usedas a basisfor forecasting.
Ari f_p..ioJfo...u* .un 0.,
generated
from Eqs.(14.34)and (14.35)using
lrt

: !7 1 lr7

3.:ltl". r4.?qonlhry

nousing startsin the United states
u gooa a*u*pl. for the application of smoothing and seasonatuajrrt_.r.,t-,_,r'.tiid;;;;;;",
-provides
flucruares
considerablyand also exhibits strong seasonal
smooth the series usinq rhe
_.! movrng averageand exponential
smoothing
methods_
we,begin by using three- and seven-period
centeredmoving averagesto
smo^oththe series;i.e., we generatethe smoothed
seriesy, from the original
senes/r using

t' = iZ!r+tttzt,-t)-i

{14.}7)

wnere n : 3 or 7. Note that sincethe moving
averageis centered,there is no
need to detrend the seriesbeforesmoothinf
ir. T#;;;gt;;i;;ries,
together
with rhe rwo smoothedseries,is shown in
r-ig. r+.8. d"r;;1;"
the use of
t For a detailed treatment
of some other smoothrnglechniques,see C. W. J. cranger
and p.
Newbold, ForecaslingE.onomicTlhe Series
and,s
Makridakii

*n",tu^sn. iii;,L,;s";;'r;;:;;;;;;i::,";:iii#,,i",il,.*ll'":r.

-- --....,.,

| pal l l rdmov l nl l ral al !
7 pc rl t mov l nl .l Gtl l a

(r4.36)

Thus the l-period forecast takes the most recent smoothed value ỹ_T and adds to it an expected increase l r_T based on the (smoothed) long-run trend. (If the data have been detrended, the trend should be added back into the forecast.)

Smoothing methods tend to be ad hoc, particularly when they are used to generate forecasts. One problem is that we have no way of determining the "correct" values of the smoothing parameters, so that their choice becomes arbitrary. If our objective is simply to smooth the series somewhat to make it easier to interpret or analyze, then this is not really a problem, since we can choose the smoothing parameters to give us the extent of smoothing desired. We must be careful, however, when using an equation like Eq. (14.36) for forecasting and recognize that the resulting forecast will be somewhat arbitrary.5
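The two-parameter smoothing recursions behind the forecast of Eq. (14.36) can be sketched as follows. This is a minimal illustration: the toy data, the parameter values, and the initialization of the level and trend are our own choices, not the text's.

```python
# A minimal sketch of two-parameter (Holt) exponential smoothing and the
# l-period forecast of Eq. (14.36): last smoothed level plus l times the
# smoothed trend. Initialization below is our own choice.

def holt_smooth(y, alpha, gamma):
    """Return smoothed levels S_t and smoothed trends r_t."""
    S = [y[0]]           # start the level at the first observation
    r = [y[1] - y[0]]    # start the trend at the first difference
    for t in range(1, len(y)):
        s_new = alpha * y[t] + (1 - alpha) * (S[-1] + r[-1])
        r_new = gamma * (s_new - S[-1]) + (1 - gamma) * r[-1]
        S.append(s_new)
        r.append(r_new)
    return S, r

def holt_forecast(S, r, l):
    """Forecast l periods ahead: last level plus l times last trend."""
    return S[-1] + l * r[-1]

series = [10.0, 12.0, 13.5, 15.2, 16.8, 18.1]
S, r = holt_smooth(series, alpha=0.2, gamma=0.4)
print(holt_forecast(S, r, 1), holt_forecast(S, r, 3))
```

Note that, exactly as the text warns, different (arbitrary) choices of the smoothing parameters produce different forecasts from the same data.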

* The original data series is in thousands of units per month and is not seasonally adjusted.

FIGURE 14.8
Smoothing using moving averages.

Observe that the seven-period moving average heavily smoothes the series and even eliminates some of the seasonal variation.
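The centered moving average of Eq. (14.37) is short enough to sketch directly; the alternating toy series below is our own, chosen to make the smoothing effect visible.

```python
# Centered moving average of Eq. (14.37): the smoothed value at t is the
# average of the n observations centered on t (n odd). Values are produced
# only where the full window fits inside the series.

def centered_ma(y, n):
    k = (n - 1) // 2
    return [sum(y[t - k:t + k + 1]) / n for t in range(k, len(y) - k)]

y = [3.0, 6.0, 3.0, 6.0, 3.0, 6.0, 3.0]
print(centered_ma(y, 3))  # -> [4.0, 5.0, 4.0, 5.0, 4.0]
```

A wider window (larger n) smooths more heavily, just as the seven-period average does relative to the three-period average in Fig. 14.8.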
We next use the exponential smoothing method. Since the original series is growing over time and the exponentially weighted moving average is not centered, the smoothed series will underestimate the original series unless we first detrend the series. To detrend the series we assumed a linear trend (we could of course test alternative nonlinear time trends) and ran the regression
    ŷ_t = −156.81 + 1.208t        R² = .360        (14.38)
          (−1.36)    (5.37)

(t statistics in parentheses). The residuals u_t from this regression, that is, u_t = y_t + 156.81 − 1.208t, provide the detrended series.
We apply exponential smoothing to this detrended series. We use two alternative values of the smoothing parameter, α = .8 (light smoothing) and α = .2 (heavy smoothing). Finally, we take the smoothed detrended series ũ_t and add the trend back in; i.e., we compute ŷ_t = ũ_t − 156.81 + 1.208t.
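The detrend / smooth / retrend sequence just described can be sketched as below. The toy data and parameter value are our own, and the trend coefficients are estimated from those data rather than taken from Eq. (14.38).

```python
# Sketch of the detrend / smooth / retrend steps: fit a linear trend by
# least squares, exponentially smooth the residuals, then restore the
# trend. Illustrative data only.

def linear_trend(y):
    """OLS fit of y_t = a + b*t with t = 0, 1, 2, ..."""
    T = len(y)
    tbar = (T - 1) / 2
    ybar = sum(y) / T
    b = (sum((t - tbar) * (y[t] - ybar) for t in range(T))
         / sum((t - tbar) ** 2 for t in range(T)))
    return ybar - b * tbar, b

def ewma(u, alpha):
    """Simple exponential smoothing of the sequence u."""
    s = [u[0]]
    for x in u[1:]:
        s.append(alpha * x + (1 - alpha) * s[-1])
    return s

y = [10.0, 13.0, 12.0, 16.0, 15.0, 19.0, 18.0, 22.0]
a, b = linear_trend(y)
u = [y[t] - a - b * t for t in range(len(y))]        # detrended residuals
u_sm = ewma(u, alpha=0.2)                            # heavy smoothing
y_sm = [u_sm[t] + a + b * t for t in range(len(y))]  # trend restored
print(y_sm)
```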
The original series and the smoothed series are shown in Fig. 14.9. Observe from the figure that the seasonal variations, while reduced, are pushed forward by heavy exponential smoothing. This occurs because the exponentially weighted moving average is not centered. Thus if a series shows strong seasonality, exponential smoothing should be used only after the series has been deseasonalized.

where L = long-run (secular) trend component of the series
      S = seasonal component
      C = (long-term) cyclical component
      I = irregular component

The objective is to eliminate the seasonal component S. To do this we first try to isolate the combined long-term trend and cyclical components L × C. This cannot be done exactly; a smoothing procedure is used to remove (as much as possible) the combined seasonal and irregular components S × I from the original series y_t. For example, suppose y_t consists of monthly data. Then a 12-month average ȳ_t is computed:

    ȳ_t = (1/12)(y_{t+6} + y_{t+5} + · · · + y_t + · · · + y_{t−5})        (14.40)

Presumably ȳ_t is relatively free of seasonal and irregular fluctuations and is thus an estimate of L × C.

We now divide the original data by this estimate of L × C to obtain an estimate of the combined seasonal and irregular components S × I:

    z_t = (L × S × C × I)/(L × C) = S × I

FIGURE 14.9
Smoothing using exponentially weighted moving averages.

14.2.2 Seasonal Adjustment

Seasonal adjustment involves first computing seasonal indices (that attempt to measure the seasonal variation in the series) and then using those indices to deseasonalize the series, i.e., to smooth out the series by removing those seasonal variations. National economic data in the United States are deseasonalized using the Census II method (and its variants), developed by the Bureau of the Census of the U.S. Department of Commerce. The Census II method is a rather detailed and complicated procedure (and in many ways is amazingly ad hoc), and we therefore will not describe it here.† Instead, we discuss the basic idea that lies behind seasonal adjustment methods (including Census II) and present a simple method based on that idea. Different techniques are based on the idea that a time series y_t can be represented as the product of four components:

    y_t = L × S × C × I        (14.39)

† The Census II method is described in detail in L. Salzman, Computerized Economic Analysis (New York: McGraw-Hill).

    S × I = y_t/ȳ_t = z_t        (14.41)

The next step is to eliminate the irregular component I as completely as possible in order to obtain the seasonal index. To do this, we average the values of S × I corresponding to the same month. In other words, suppose that z_1 (and hence y_1) corresponds to January, z_2 to February, etc., and there are 48 months of data. We thus compute

    z̄_1 = ¼(z_1 + z_13 + z_25 + z_37)
    z̄_2 = ¼(z_2 + z_14 + z_26 + z_38)        (14.42)
    · · ·
    z̄_12 = ¼(z_12 + z_24 + z_36 + z_48)

The rationale here is that when the seasonal-irregular percentages z_t are averaged for each month (each quarter if the data are quarterly), the irregular fluctuations will be largely smoothed out.

The 12 averages z̄_1, . . . , z̄_12 will then be estimates of the seasonal indices. They should sum close to 12 but will not do so exactly if there is any long-run trend in the data. Final seasonal indices are computed by multiplying the indices in Eq. (14.42) by a factor that brings their sum to 12. (For example, if z̄_1, . . . , z̄_12 add to 12.7, multiply each one by 12.0/12.7 so that the revised indices will add to 12.) We denote these final seasonal indices by ẑ_1, . . . , ẑ_12.
The deseasonalization of the original series y_t is now straightforward; we just divide each value in the series by its corresponding seasonal index, thereby removing the seasonal component while leaving the other three components:

    y*_1 = y_1/ẑ_1,  y*_2 = y_2/ẑ_2,  . . . ,  y*_12 = y_12/ẑ_12,  y*_13 = y_13/ẑ_1,  and so on.
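The whole procedure of Eqs. (14.40) to (14.42), plus the final division, can be sketched compactly. A quarterly period (4 seasons) replaces the 12 months to keep the toy series short; the function and variable names are our own.

```python
# Sketch of the seasonal-index procedure: a centered one-year average
# (the analog of Eq. 14.40), ratios z_t = y_t / ybar_t, same-season
# averages (Eq. 14.42), rescaling so the indices sum to the number of
# seasons, and division by the index to deseasonalize.

def seasonal_indices(y, period):
    k = period // 2
    ybar = {t: sum(y[t - k + 1:t + k + 1]) / period   # centered average
            for t in range(k - 1, len(y) - k)}
    z = {t: y[t] / ybar[t] for t in ybar}             # seasonal x irregular
    zbar = []
    for m in range(period):                           # average each season
        vals = [z[t] for t in z if t % period == m]
        zbar.append(sum(vals) / len(vals))
    scale = period / sum(zbar)                        # force sum to `period`
    return [v * scale for v in zbar]

def deseasonalize(y, idx):
    return [y[t] / idx[t % len(idx)] for t in range(len(y))]

# flat quarterly series with a repeating seasonal pattern
y = [80.0, 110.0, 120.0, 90.0] * 4
idx = seasonal_indices(y, period=4)
print(idx)                        # close to [0.8, 1.1, 1.2, 0.9]
print(deseasonalize(y, idx)[:4])  # close to [100.0, 100.0, 100.0, 100.0]
```

Because the toy series has no trend or irregular component, the indices recover the seasonal pattern exactly and the adjusted series is flat, which is the ideal outcome the text describes.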

Let us apply this technique to our series for monthly housing starts (see Example 14.2). To do this we first compute a 12-month average ȳ_t of the original series using Eq. (14.40) and then divide y_t by ȳ_t; that is, we compute z_t = y_t/ȳ_t. Note that z_t contains (roughly) the seasonal and irregular components of the original series. We remove the irregular component by averaging the values of z_t that correspond to the same month, i.e., we compute the averages z̄_1, . . . , z̄_12 using Eq. (14.42). We then compute the final seasonal indices by multiplying the z̄_t by a factor that brings their sum to 12. The final seasonal indices are as follows:

FIGURE 14.10
Housing starts: seasonal indices.

FIGURE 14.11
Seasonal adjustment of housing starts data.

Seasonal Indices

January     .5552        July        1.1900
February    .7229        August      1.1454
March       .9996        September   1.0675
April      1.1951        October     1.0823
May        1.2562        November     .8643
June       1.2609        December     .6584

These seasonal indices have been plotted in Fig. 14.10. To deseasonalize the original series y_t we just divide each value in the series by its corresponding seasonal index, thereby removing the seasonal component. The original series y_t together with the seasonally adjusted series y*_t are shown in Fig. 14.11. Observe that the seasonal variation has been eliminated in the adjusted series, while the long-run trend and short-run irregular fluctuations remain.

EXERCISES
14.1 Go back to Example 14.1 and use the data for monthly department store sales to estimate a model; then use the estimated model to obtain an extrapolated value for sales for April 1974. Try to evaluate your model in comparison with the others that were estimated in Example 14.1, and explain how your forecast compares with the other forecasts in the example.
14.2 Which (if any) of the simple extrapolation models presented in Section 14.1 do you think might be suitable for forecasting the GNP? The Consumer Price Index? A short-term interest rate? Annual production of wheat? Explain.
14.3 Show that the exponentially weighted moving average (EWMA) model, in which the forecast is a weighted sum of terms (1 − α)^τ y_{T−τ}, will generate a forecast l periods ahead that is the same as the forecast one period ahead.

14.5 Monthly data for the Standard & Poor 500 Common Stock Price Index are shown in Table 14.1. The data are also plotted in Fig. 14.12.
(a) Using all but the last three data points (i.e., April, May, and June of 1988), exponentially smooth the data using a value of .9 for the smoothing parameter α. Repeat for a different value of the smoothing parameter, and compare the results.
(b) Again using all but the last three data points, smooth the data using Holt's two-parameter exponential smoothing method. Set α = .2 and a value of your choice for γ. Discuss how and why the results differ from those in (a) above. Now use Eq. (14.36) to forecast the series out 1, 2, and 3 months. How close is your forecast to the actual values of the S&P 500 index for April to June 1988?
14.6 Monthly data for retail auto sales are shown in Table 14.2 on page 416. The data are also plotted in Fig. 14.13.
(a) Use a 6-month centered moving average to smooth the series. Is a seasonal pattern still evident? How would you choose the length of the moving average?

TABLE 14.1
STANDARD & POOR 500 COMMON STOCK PRICE INDEX

1979.01
1979.07
1980.0 1
1e8007
1981.01
1981.0 7
1982.01
1982.07
1983.01
1983.07
1984.01
1984.07
1985.01
1985.07
1986.01
1986.07
1987.01
1987.07
1988.01

98 23
99.71
10271 10736
110. 87 115. 34
119.83 123.50
132.97 124.40
129. 13 129. 63
117.2A 114.50
109.38 109.65
144.27 146.80
166.96 162.42
166.39 157.25
151.08 164.42
17',1.61 180.88
192.54 188.31
208.19 219.37
2401A 245.00
264.51 280.93
310.09 329.36
250.48 258.13

10011
10860
104.69
126.51
133.19
118.27
110. 84
122.43
151. 88
167. 16
157.44
166.11
179.42
184. 06
232.93
238.27
|
292.4
318.66
265.74

102.07
104.47
102.97
1302?
134.43
11980
116. 31
132.66
157. 71
167. 65
157.60
1e/.82
180.62
186. 18
237.9A
237.36
2a932
280.16
26261

Source: Citibase, Series FSPCOM.

FIGURE 14.12
Standard & Poor 500 Common Stock Price Index.

(b) Using the original data in Table 14.2, apply the seasonal adjustment procedure described in the text. Plot the 12 final seasonal indices as a function of time and try to interpret the curve. Also plot the seasonally adjusted series, and compare it to the original series.


99 73
103.66
10769
13565
131.73
12292
116. 35
138.10
164. 10
165. 23
15655
t6627
18490
19/45
238.46
245.09
2a9.12
245.01
25612

10173
107.78
11455
13346
13228
12379
10970
13937
16639
164. 36
15312
16448
18889
20726
24530
24861
30138
24096
2/0 68

TABLE 14.2
RETAIL AUTO SALES (thousands of units)

1979.01
1979.07
1980.01
1980.07
1981.01
1981.07
1982.01
1982.07
1983.01
'1983,07
1984.01
1984.07
1985.01
1985.07
1986.01
1986.07
1987.01
1987.07
1988.01

774.00
832.00 1,104.00 976.00 1,042.00 894.00
876.00
908.00
767.00 892.00
768.00
726.00
806.00
a12.OO 895.00 743.00
697.00
702.00
773.00
686.00
672.00 848.00
698.00
649.00
648.00
764.00
963.00 751.00
734.00
724.00
707.OO 801.00
687.00 649.00
585.00
523.00
535.00
632.00
777.00 669.00
774.00
651.00
630.00
609.00
671.00 656.00
743.OO 632.00
596.00
628.00
821.00 762.00
837.00
904.00
792.00
741.00
705.00 861.00
7A2.OO 752.00
778.00
841.00
964.00 896.00 1,047.00 958.00
890.00
814.00
744.00 900.00
802.00
759.00
835.00
839.00
970.00 988.00 '1,075.00 925.00
899.00 1,001.00 1,068.00 864,00
762.00
812.00
870.00
832.00
897.00 972.00 1,072.00 1,001.00
954.00
952.00 1,217.00 906.00
783.00
992.00
626,00
781.00
936.00 938.00
887.00
943.00
913.00
968.00
905.00 802.00
737.00
843.00
765.00
888.00 t,006.00 901.00
974.00 1,010.00

Source: Citibase, Series RCAR6T.


CHAPTER 15
PROPERTIES OF STOCHASTIC TIME SERIES

In the last chapter we discussed a number of simple extrapolation techniques. In this chapter we begin our treatment of the construction and use of time-series models. Such models provide a more sophisticated method of extrapolating time series, in that they are based on the notion that the series to be forecasted has been generated by a stochastic (or random) process, with a structure that can be characterized and described. In other words, a time-series model provides a description of the random nature of the (stochastic) process that generated the sample of observations under study. The description is given not in terms of a cause-and-effect relationship (as would be the case in a regression model) but in terms of how that randomness is embodied in the process.

This chapter begins with an introduction to the nature of stochastic time-series models and shows how those models characterize the stochastic structure of the underlying process that generated the particular series. The chapter then turns to the properties of stochastic time series, focusing on the concept of stationarity. This material is important for the discussion of model construction in the following chapters. We next present a statistical test (the Dickey-Fuller test) for stationarity. Finally, we discuss co-integrated time series, i.e., series which are nonstationary but can be combined to form a stationary series.
15.1 INTRODUCTION TO STOCHASTIC TIME-SERIES MODELS

The time-series models developed in this and the following chapters are all based on an important assumption: that the series to be forecasted has been generated by a stochastic process. In other words, we assume that each value y_1, y_2,

. . . , y_T in the series is drawn randomly from a probability distribution. In modeling such a process, we attempt to describe the characteristics of its randomness. This should help us to infer something about the probabilities associated with alternative future values of the series.

To be completely general, we could assume that the observed series y_1, . . . , y_T is drawn from a set of jointly distributed random variables; i.e., we could attempt to numerically specify the probability distribution function for our series. Then we could actually determine the probability of one or another future outcome. Unfortunately, the complete specification of the probability distribution function for a time series is usually impossible. However, it is usually possible to construct a simplified model of the time series which explains its randomness in a manner that is useful for forecasting purposes. For example, we might believe that the values of y_1, . . . , y_T are normally distributed and are correlated with each other according to a simple first-order autoregressive process. The true distribution might be more complicated, but this simple model may be a reasonable approximation. Of course, the usefulness of such a model depends on how closely it captures the true probability distribution and thus the true random behavior of the series. Note that the model need not (and usually will not) match the true behavior of the series exactly, since the series and the model are stochastic; the model should simply capture the characteristics of the series' randomness.
15.1.1 Random Walks

Our first (and simplest) example of a stochastic time series is the random walk process.1 In the simplest random walk process, each successive change in y_t is drawn independently from a probability distribution with 0 mean. Thus, y_t is determined by

    y_t = y_{t−1} + ε_t        (15.1)

with E(ε_t) = 0 and E(ε_t ε_s) = 0 for t ≠ s. Such a process could be generated by successive flips of a coin, where a head receives a value of +1 and a tail receives a value of −1.
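The coin-flip walk just described is easy to simulate; the series length and seed below are our own illustrative choices.

```python
# The coin-flip random walk: each epsilon_t is +1 (head) or -1 (tail),
# and y_t = y_{t-1} + epsilon_t.

import random

def coin_flip_walk(T, seed=0):
    rng = random.Random(seed)
    y = [0.0]
    for _ in range(T):
        eps = 1.0 if rng.random() < 0.5 else -1.0
        y.append(y[-1] + eps)
    return y

y = coin_flip_walk(200)
# Per Eq. (15.3) below, the best forecast of every future value is simply
# the last observation y_T, however far ahead we look.
print(y[-1])
```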
Suppose that we wanted to make a forecast for such a random walk process. The one-period forecast is given by

    ŷ_{T+1} = E(y_{T+1}|y_T, . . . , y_1)        (15.2)

But y_{T+1} = y_T + ε_{T+1}, and ε_{T+1} is independent of y_T, . . . , y_1. Thus, the forecast is

    ŷ_{T+1} = y_T + E(ε_{T+1}|y_T, . . . , y_1) = y_T        (15.3)

1 The random walk process has often been used as a model for the movement of stock market prices. See, for example, E. F. Fama, "Random Walks in Stock Market Prices," Financial Analysts Journal, September-October 1965.

Similarly, the forecast two periods ahead is

    ŷ_{T+2} = E(y_{T+2}|y_T, . . . , y_1) = E(y_{T+1} + ε_{T+2}) = E(y_T + ε_{T+1} + ε_{T+2}) = y_T        (15.4)

Although the forecast ŷ_{T+l} will be the same no matter how large l is, the variance of the forecast error will grow as l becomes larger. For the one-period forecast, the forecast error is given by

    e_1 = y_{T+1} − ŷ_{T+1} = y_T + ε_{T+1} − y_T = ε_{T+1}        (15.5)

and its variance is just E(ε²_{T+1}) = σ²_ε. For the two-period forecast, the forecast error is

    e_2 = y_{T+2} − ŷ_{T+2} = y_T + ε_{T+1} + ε_{T+2} − y_T = ε_{T+1} + ε_{T+2}        (15.6)

and its variance is

    E[(ε_{T+1} + ε_{T+2})²] = E(ε²_{T+1}) + E(ε²_{T+2}) + 2E(ε_{T+1}ε_{T+2})        (15.7)

Since ε_{T+1} and ε_{T+2} are independent, the third term in Eq. (15.7) is 0 and the error variance is 2σ²_ε. Similarly, for the l-period forecast, the error variance is lσ²_ε. Thus, the standard error of forecast increases with the square root of l. We can thus obtain confidence intervals for our forecasts, and these intervals will become wider as the forecast horizon increases. This is illustrated in Fig. 15.1. Note that the forecasts are all equal to the last observation y_T, but the confidence intervals, represented by 1 standard deviation in the forecast error, increase as the square root of l.

The ability to calculate confidence intervals is an important advantage of stochastic time-series models: policy makers need to know the margin of error associated with a particular forecast, so that they can decide how much reliance to place on the forecasts themselves.

A simple extension of the random walk process discussed above is the random walk with drift. This process accounts for a trend (upward or downward) in the series y_t and thereby allows us to embody that trend in our forecast. In this process, y_t is determined by

    y_t = y_{t−1} + d + ε_t        (15.8)

so that on the average the process will tend to move upward (for d > 0). Now the one-period forecast is

    ŷ_{T+1} = E(y_{T+1}|y_T, . . . , y_1) = y_T + d        (15.9)

and the l-period forecast is

    ŷ_{T+l} = y_T + ld        (15.10)

The standard error of forecast will be the same as before. For one period,
FIGURE 15.1
Forecasting a random walk.

    e_1 = y_{T+1} − ŷ_{T+1} = y_T + d + ε_{T+1} − y_T − d = ε_{T+1}        (15.11)

as before. The process, together with forecasts and forecast confidence intervals, is illustrated in Fig. 15.2. As can be seen in that figure, the forecasts increase linearly with l, and the standard error of forecast increases with the square root of l.

In the next chapter we examine a general class of stochastic time-series models. Later, we will see how that class of models can be used to make forecasts for a wide variety of time series. First, however, it is necessary to introduce some basic concepts about stochastic processes and their properties.
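The square-root growth of the forecast-error standard deviation can be checked by simulation. With the ±1 coin-flip shocks used earlier, σ_ε = 1, so the l-period error standard deviation should be close to √l; the trial count and seed are our own choices.

```python
# Monte Carlo check that the l-period forecast-error standard deviation
# of a random walk grows as sqrt(l), per Eqs. (15.5) to (15.7).

import math
import random

def forecast_error_std(l, trials=20000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        # the l-period forecast error is the sum of l independent shocks
        e = sum(1.0 if rng.random() < 0.5 else -1.0 for _ in range(l))
        total += e * e
    return math.sqrt(total / trials)

for l in (1, 4, 16):
    print(l, forecast_error_std(l))  # roughly 1, 2, 4
```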
15.1.2 Stationary and Nonstationary Time Series

As we begin to develop models for time series, we want to know whether or not the underlying stochastic process that generated the series can be assumed to be invariant with respect to time. If the characteristics of the stochastic process change over time, i.e., if the process is nonstationary, it will often be difficult to represent the time series over past and future intervals of time by a simple algebraic model.2 On the other hand, if the stochastic process is fixed in time, i.e., if it is

2 The random walk with drift is one example of a nonstationary process for which a simple forecasting model can be constructed.

set of data points y_1, . . . , y_T represents one particular outcome of the joint probability distribution function p(y_1, . . . , y_T).3 Similarly, a future observation y_{T+1} can be thought of as being generated by a conditional probability distribution function p(y_{T+1}|y_1, . . . , y_T), that is, a probability distribution for y_{T+1} given the past observations y_1, . . . , y_T. We define a stationary process, then, as one whose joint distribution and conditional distribution both are invariant with respect to displacement in time. In other words, if the series y_t is stationary, then

    p(y_t, . . . , y_{t+k}) = p(y_{t+m}, . . . , y_{t+k+m})        (15.12)

and

    p(y_t) = p(y_{t+m})        (15.13)

for any t, k, and m.
Note that if the series y_t is stationary, the mean of the series, defined as
FIGURE 15.2
Forecasting a random walk with drift.
stationary, then one can model the process via an equation with fixed coefficients that can be estimated from past data. This is analogous to the single-equation regression model in which one economic variable is related to other economic variables, with coefficients that are estimated under the assumption that the structural relationship described by the equation is invariant over time (i.e., is stationary). If the structural relationship changed over time, we could not apply the techniques of Chapter 8 in using a regression model to forecast.

The models developed in detail in the next chapter of the book represent stochastic processes that are assumed to be in equilibrium about a constant mean level. The probability of a given fluctuation in the process from that mean level is assumed to be the same at any point in time. In other words, the stochastic properties of the stationary process are assumed to be invariant with respect to time.

One would suspect that many of the time series that one encounters in practice are nonstationary. The GNP, for example, has for the most part been growing steadily, and for this reason alone its stochastic properties in 1980 are different from those in 1933. Although it can be difficult to model nonstationary processes, we will see that nonstationary processes can often be transformed into stationary or approximately stationary processes.

15.1.3 Properties of Stationary Processes

We have said that any stochastic time series y_1, . . . , y_T can be thought of as having been generated by a set of jointly distributed random variables; i.e., the

    μ_y = E(y_t)        (15.14)

must also be stationary, so that E(y_t) = E(y_{t+m}) for any t and m. Furthermore, the variance of the series,

    σ²_y = E[(y_t − μ_y)²]        (15.15)

must be stationary, so that E[(y_t − μ_y)²] = E[(y_{t+m} − μ_y)²], and finally, for any lag k, the covariance of the series,

    γ_k = Cov (y_t, y_{t+k}) = E[(y_t − μ_y)(y_{t+k} − μ_y)]        (15.16)

must be stationary, so that Cov (y_t, y_{t+k}) = Cov (y_{t+m}, y_{t+m+k}).4

If a stochastic process is stationary, the probability distribution p(y_t) is the same for all time t and its shape (or at least some of its properties) can be inferred by looking at a histogram of the observations y_1, . . . , y_T that make up the observed series. Also, an estimate of the mean μ_y of the process can be obtained from the sample mean of the series

    ȳ = (1/T) Σ_{t=1}^{T} y_t        (15.17)

3 This outcome is called a realization. Thus y_1, . . . , y_T represent one particular realization of the stochastic process represented by the probability distribution p(y_1, . . . , y_T).
4 It is possible for the mean, variance, and covariances of the series to be stationary but not the joint probability distribution. If the probability distributions are stationary, we term the series strict-sense stationary. If the mean, variance, and covariances are stationary, we term the series wide-sense stationary. Note that strict-sense stationarity implies wide-sense stationarity but that the converse is not true.

and an estimate of the variance

    σ̂²_y = (1/T) Σ_{t=1}^{T} (y_t − ȳ)²        (15.18)

15.2 CHARACTERIZING TIME SERIES: THE AUTOCORRELATION FUNCTION

While it is usually impossible to obtain a complete description of a stochastic process (i.e., actually specify the underlying probability distributions), the autocorrelation function is extremely useful because it provides a partial description of the process for modeling purposes. The autocorrelation function tells us how much correlation there is (and by implication how much interdependency there is) between neighboring data points in the series y_t. We define the autocorrelation with lag k as

    ρ_k = Cov (y_t, y_{t+k}) / √(E[(y_t − μ_y)²] E[(y_{t+k} − μ_y)²]) = Cov (y_t, y_{t+k}) / (σ_{y_t} σ_{y_{t+k}})        (15.19)

For a stationary process the variance at time t in the denominator of Eq. (15.19) is the same as the variance at time t + k; thus the denominator is just the variance of the stochastic process, and

    ρ_k = E[(y_t − μ_y)(y_{t+k} − μ_y)] / σ²_y        (15.20)

Note that the numerator of Eq. (15.20) is the covariance between y_t and y_{t+k}, γ_k, so that ρ_k = γ_k/γ_0, and thus ρ_0 = 1 for any stochastic process.

Thus if the autocorrelation function is zero (or close to zero) for all k > 0, there is little or no value in using a model to forecast the series. Of course the autocorrelation function in Eq. (15.20) is purely theoretical, in that it describes a stochastic process for which we have only a limited number of observations. In practice, then, we must calculate an estimate of the autocorrelation function, called the sample autocorrelation function:

    ρ̂_k = [Σ_{t=1}^{T−k} (y_t − ȳ)(y_{t+k} − ȳ)] / [Σ_{t=1}^{T} (y_t − ȳ)²]        (15.21)
Suppose that the stochastic process is simply

    y_t = ε_t        (15.22)

where ε_t is an independently distributed random variable with zero mean. Then it is easy to see from Eq. (15.20) that the autocorrelation function for this process is given by ρ_0 = 1 and ρ_k = 0 for k > 0. The process of Eq. (15.22) is called white noise, and there is no model that can provide a forecast any better than ŷ_{T+l} = 0.

It is easy to see from their definitions that both the theoretical and estimated autocorrelation functions are symmetrical, i.e., that the value for a positive displacement is the same as that for a negative displacement, so that

    ρ_k = ρ_{−k}        (15.24)

Then, when plotting an autocorrelation function (i.e., plotting ρ_k for different values of k), one need consider only positive values of k.
It is often useful to determine whether a particular value of the sample autocorrelation function ρ̂_k is close enough to zero to permit assuming that the true value of the function, ρ_k, is indeed equal to zero. It is also useful to know whether all the values of the autocorrelation function for k > 0 are equal to zero (if they are, we know that we are dealing with white noise). Fortunately, simple statistical tests exist that can be used to test the hypothesis that ρ_k = 0 for a particular k, or to test the hypothesis that ρ_k = 0 for all k > 0.
To determine whether a particular value of the autocorrelation function ρ_k is equal to zero, we can use a result obtained by Bartlett. He showed that if a time series has been generated by a white noise process, the sample autocorrelation coefficients (for k > 0) are approximately distributed according to a normal distribution with mean 0 and standard deviation 1/√T (where T is the number of observations in the series).5 Thus, if a particular series consists of, say, 100 data points, we can attach a standard error of .1 to each autocorrelation coefficient. Therefore, if a particular coefficient was greater in magnitude than .2, we could be 95 percent sure that the true autocorrelation coefficient is not zero.
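Bartlett's result is easy to demonstrate: compute the sample autocorrelations of Eq. (15.21) for a simulated white noise series and count how many fall inside a ±2/√T band. The Gaussian test series, its length, and the seed are our own illustration.

```python
# Sample autocorrelation function of Eq. (15.21), with Bartlett's
# approximate standard error 1/sqrt(T) used to form a white-noise band.

import math
import random

def sample_acf(y, max_lag):
    T = len(y)
    ybar = sum(y) / T
    denom = sum((v - ybar) ** 2 for v in y)
    return [sum((y[t] - ybar) * (y[t + k] - ybar) for t in range(T - k)) / denom
            for k in range(1, max_lag + 1)]

rng = random.Random(42)
white = [rng.gauss(0.0, 1.0) for _ in range(400)]
acf = sample_acf(white, 10)
band = 2.0 / math.sqrt(len(white))   # roughly a 95 percent band
inside = sum(1 for r in acf if abs(r) < band)
print(inside, "of", len(acf), "lags fall inside the +/-", round(band, 3), "band")
```

For true white noise, about 95 percent of the sample autocorrelations should fall inside this band, which is exactly the criterion the text applies.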
To test the joint hypothesis that all the autocorrelation coefficients are zero, we can use the Q statistic introduced by Box and Pierce. We will discuss this statistic in Chapter 17 in the context of performing diagnostic checks on

5 See M. S. Bartlett, "On the Theoretical Specification of Sampling Properties of Autocorrelated Time Series," Journal of the Royal Statistical Society, Series B8, vol. 27, 1946. Also see G. E. P. Box and G. M. Jenkins, Time Series Analysis (San Francisco: Holden-Day, 1970).

estimated time-series models, so here we only mention it in passing. Box and Pierce show that the statistic

    Q = T Σ_{k=1}^{K} ρ̂²_k        (15.25)

is (approximately) distributed as chi square with K degrees of freedom. Thus if the calculated value of Q is greater than, say, the critical 5 percent level, we can be 95 percent sure that the true autocorrelation coefficients ρ_1, . . . , ρ_K are not all zero.

In practice people tend to use the critical 10 percent level as a cutoff for this test. For example, if Q turned out to be 18.5 for a total of K = 15 lags, we would observe that this is below the critical level of 22.31 and accept the hypothesis that the time series was generated by a white noise process.
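The Q test just described can be sketched directly from Eq. (15.25); the simulated white noise series and seed are our own.

```python
# The Box-Pierce statistic of Eq. (15.25): Q = T times the sum of the
# first K squared sample autocorrelations. Under the white-noise null it
# is approximately chi-square with K degrees of freedom; for K = 15 the
# 10 percent critical value cited in the text is 22.31.

import random

def sample_acf(y, max_lag):
    T = len(y)
    ybar = sum(y) / T
    denom = sum((v - ybar) ** 2 for v in y)
    return [sum((y[t] - ybar) * (y[t + k] - ybar) for t in range(T - k)) / denom
            for k in range(1, max_lag + 1)]

def box_pierce_q(y, K):
    return len(y) * sum(r * r for r in sample_acf(y, K))

rng = random.Random(7)
white = [rng.gauss(0.0, 1.0) for _ in range(300)]
q = box_pierce_q(white, K=15)
print(q, "-> accept white noise" if q < 22.31 else "-> reject white noise")
```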
Let us now turn to an example of an estimated autocorrelation function for a stationary economic time series. We have calculated ρ̂_k for quarterly data on real nonfarm inventory investment (measured in billions of 1982 dollars). The time series itself (covering the period 1952 through the first two quarters of 1988) is shown in Fig. 15.3, and the sample autocorrelation function is shown in Fig. 15.4. Note that the autocorrelation function falls off rather quickly as the lag k increases. This is typical of a stationary time series, such as inventory investment.
FIGURE 15.3
Nonfarm inventory investment (in 1982 constant dollars).


FIGURE 15.4
Nonfarm inventory investment: sample autocorrelation function.
In fact, as we will see, the autocorrelation function can be used to test whether a series is stationary. If ρ̂_k does not fall off quickly as k increases, this is an indication of nonstationarity. We discuss formal tests of nonstationarity ("unit root" tests) in Section 15.3.
If a time series is stationary, there exist certain analytical conditions which place bounds on the values that can be taken by the individual points of the autocorrelation function. However, the derivation of these conditions is somewhat complicated and will not be presented at this point. Furthermore, the conditions themselves are rather cumbersome and of limited usefulness in applied time-series modeling. Therefore, we have relegated them to Appendix 15.1. We turn our attention now to the properties of those time series which are nonstationary but which can be transformed into stationary series.

15.2.1 Homogeneous Nonstationary Processes

Probably very few of the time series one meets in practice are stationary. Fortunately, however, many of the nonstationary time series encountered (and this includes most of those that arise in economics and business) have the desirable property that if they are differenced one or more times, the resulting series will be stationary. Such a nonstationary series is termed homogeneous. The number of times that the original series must be differenced before a stationary series results is called the order of homogeneity. Thus, if y_t is first-order homogeneous nonstationary, the series

    w_t = y_t − y_{t−1} = Δy_t        (15.26)

is stationary. If y_t happened to be second-order homogeneous, the series

    w_t = Δ²y_t = Δy_t − Δy_{t−1}        (15.27)

would be stationary.
As an example of a first-order homogeneous nonstationary process, consider the simple random walk process that we introduced earlier:

    y_t = y_{t−1} + ε_t        (15.28)

Let us examine the variance of this process:

    γ_0 = E(y²_t) = E[(y_{t−1} + ε_t)²] = E(y²_{t−1}) + 2E(y_{t−1}ε_t) + E(ε²_t)        (15.29)

or, continuing the recursion n periods back,

    γ_0 = E(y²_{t−n}) + nσ²_ε        (15.30)

Observe from this recursive relation that the variance is infinite and hence undefined. The same is true for the covariances, since, for example,

    γ_1 = E(y_t y_{t−1}) = E[y_{t−1}(y_{t−1} + ε_t)] = E(y²_{t−1})        (15.31)

Now let us look at the series that results from differencing the random walk process, i.e., the series

    w_t = y_t − y_{t−1} = ε_t        (15.32)

Since the ε_t are assumed independent over time, w_t is clearly a stationary process. Thus, we see that the random walk process is first-order homogeneous. In fact, w_t is just a white noise process, and it has the autocorrelation function ρ_0 = 1, but ρ_k = 0 for k > 0.
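This first-order homogeneity is easy to verify numerically: the lag-1 sample autocorrelation of the random walk levels is close to 1, while that of the first differences is close to 0. The simulated series and seed are our own.

```python
# Differencing the random walk: w_t = y_t - y_{t-1} = epsilon_t is white
# noise, so its lag-1 sample autocorrelation should be near zero, while
# the level series y_t stays highly autocorrelated.

import random

def lag1_acf(y):
    T = len(y)
    ybar = sum(y) / T
    denom = sum((v - ybar) ** 2 for v in y)
    return sum((y[t] - ybar) * (y[t + 1] - ybar) for t in range(T - 1)) / denom

rng = random.Random(3)
y = [0.0]
for _ in range(500):
    y.append(y[-1] + rng.gauss(0.0, 1.0))        # random walk levels
w = [y[t] - y[t - 1] for t in range(1, len(y))]  # first differences

print("levels lag-1 acf:     ", lag1_acf(y))
print("differences lag-1 acf:", lag1_acf(w))
```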

15.2.2 Stationarity and the Autocorrelation Function

The GNP or a series of sales figures for a firm are both likely to be nonstationary. Each has been growing (on average) over time, so that the mean of each series is time-dependent. It is quite likely, however, that if the GNP or company sales figures are first-differenced one or more times, the resulting series will be stationary. Thus, if we want to build a time-series model to forecast the GNP, we can difference the series one or two times, construct a model for this new series, make our forecasts, and then integrate (i.e., undifference) the model and its forecasts to arrive back at GNP.

FIGURE 15.5
Stationary series.
How can we decide whether a series is stationary or determine the appropriate number of times it should be differenced to arrive at a stationary series? We can begin by looking at a plot of the autocorrelation function (called a correlogram). Figures 15.5 and 15.6 show autocorrelation functions for stationary and nonstationary series. The autocorrelation function for a stationary series drops off as k, the number of lags, becomes large, but this is usually not the case for a nonstationary series. If we are differencing a nonstationary series, we can test each succeeding difference by looking at the autocorrelation function. If, for example, the second round of differencing

FIGURE 15.6
Nonstationary series.

results in a series whose autocorrelation function drops off rapidly, we can determine that the original series is second-order homogeneous. If the resulting series is still nonstationary, the autocorrelation function will remain large even for long lags.

Example 15.1 Interest Rate Often in applied work it is not clear how many times a nonstationary series should be differenced to yield a stationary one, and one must make a judgment based on experience and intuition. As an example, we will examine the interest rate on 3-month government Treasury bills. This series, consisting of monthly data from the beginning of 1950 through June 1988, is shown in Fig. 15.7, and its autocorrelation function is shown in Fig. 15.8. The autocorrelation function does decline as the number of lags becomes large, but only very slowly. In addition, the series exhibits an upward trend (so that the mean is not constant over time). We would therefore suspect that this series has been generated by a homogeneous nonstationary process. To check, we difference the series and recalculate the sample autocorrelation function.
The differencedseriesis shown in Fig. 15.9. Note that the mean of the
seriesis now about constant,although the variancebecomesunusually high
FIGURE 15.7  Three-month Treasury bill rate.

FIGURE 15.8  Three-month Treasury bill rate: sample autocorrelation function.

FIGURE 15.9  Three-month Treasury bill rate: first differences.

FIGURE 15.10  Interest rate, first differences: sample autocorrelation function.

during the early 1980s (a period when the Federal Reserve targeted the money supply, allowing interest rates to fluctuate). The sample autocorrelation function for the differenced series is shown in Fig. 15.10. It declines rapidly, consistent with a stationary series. We also tried differencing the series a second time. The twice-differenced series, Δ²R_t = ΔR_t − ΔR_{t−1}, is shown in Fig. 15.11, and its sample autocorrelation function in Fig. 15.12. The results do not seem qualitatively different from the previous case. Our conclusion, then, would be that differencing once should be sufficient to ensure stationarity.
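The judgment call in this example can be mimicked mechanically. The following is a rough sketch of such a rule (our own heuristic, not a procedure from the text): difference the series until the sample autocorrelation at a moderate lag has dropped below a cutoff.

```python
import numpy as np

def acf_at(y, lag):
    """Sample autocorrelation of y at a single lag."""
    dev = y - y.mean()
    return (dev[lag:] @ dev[:len(y) - lag]) / (dev @ dev)

def choose_d(y, max_d=2, lag=10, cutoff=0.5):
    """Return the smallest number of differences d for which the ACF at `lag`
    falls below `cutoff` -- a crude stand-in for eyeballing Figs. 15.8 to 15.12."""
    y = np.asarray(y, dtype=float)
    for d in range(max_d + 1):
        if abs(acf_at(y, lag)) < cutoff:
            return d
        y = np.diff(y)
    return max_d  # give up; differencing further rarely helps

rng = np.random.default_rng(1)
noise = rng.standard_normal(400)
rate_like = np.cumsum(noise)    # behaves like the bill-rate series: one difference suffices
```

Here `choose_d(noise)` returns 0 and `choose_d(rate_like)` returns 1, matching the conclusion of Example 15.1 for an integrated series; the lag and cutoff values are arbitrary choices, which is why the text stresses judgment.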

Example 15.2 Daily Hog Prices⁶  As a second example, let us examine a time series for the daily market price of hogs. If a forecasting model could be developed for this series, one could conceivably make money by speculating on the futures market for hogs and using the model to outperform the market.

⁶ This example is from a paper by R. Leuthold, A. MacCormick, A. Schmitz, and D. Watts, "Forecasting Daily Hog Prices and Quantities: A Study of Alternative Forecasting Techniques," Journal of the American Statistical Association, March 1976, Applications Section, pp. 90–107.

FIGURE 15.11  Three-month Treasury bill rate: second differences.

FIGURE 15.12  Interest rate, second differences: sample autocorrelation function.

15.2.3 Seasonality and the Autocorrelation Function

We have just seen that the autocorrelation function can tell us a lot about the stationarity of a time series. In the remaining chapters of this book we will see that other information about a time series can be obtained from its autocorrelation function. We continue here by examining the relationship between the autocorrelation function and the seasonality of a time series.

As discussed in the previous chapter, seasonality is just a cyclical behavior that occurs on a regular calendar basis. An example of a highly seasonal time series is the monthly sales of a product such as ice cream or iced-tea mix, which peak regularly during the warmer months.

FIGURE 15.13  Sample autocorrelation functions of daily hog price data.

The series consists of 250 daily data points covering all the trading days in 1965. The price variable is the average price in dollars per hundredweight of all hogs sold in the eight regional markets in the United States on a particular day. The sample autocorrelation functions for the original price series and for the first difference of the series are shown in Fig. 15.13.

Observe that the original series is clearly nonstationary. The autocorrelation function barely declines, even after a 16-period lag. The series is, however, first-order homogeneous, since its first difference is clearly stationary. In fact, not only is the first-differenced series stationary, but it appears to resemble white noise, since the sample autocorrelation function ρ̂_k is close to zero for all k > 0. To determine whether the differenced series is indeed white noise, let us calculate the Q statistic for the first 15 lags. The value of this statistic is 14.62, which, with 15 degrees of freedom, is insignificant at the 10 percent level. We can therefore conclude that the differenced series is white noise and that the original price series can best be modeled as a random walk:

    p_t = p_{t−1} + ε_t                                            (15.33)
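The Q test used here can be reproduced directly. Below is a sketch of the Box-Pierce form of the statistic (assumed to be the version the text intends), Q = T Σ ρ̂_k², which is approximately chi-squared with K degrees of freedom under the white-noise null:

```python
import numpy as np

def box_pierce_q(y, n_lags):
    """Box-Pierce Q statistic: T * sum_{k=1}^{K} rho_hat_k^2.
    Under the null that y is white noise, Q ~ chi-squared(K) approximately."""
    y = np.asarray(y, dtype=float)
    dev = y - y.mean()
    denom = dev @ dev
    rho = np.array([dev[k:] @ dev[:len(y) - k] / denom
                    for k in range(1, n_lags + 1)])
    return len(y) * np.sum(rho ** 2)

rng = np.random.default_rng(2)
eps = rng.standard_normal(250)               # 250 points, like the hog-price sample
prices = np.cumsum(eps)                      # a random walk, as in Eq. (15.33)
q_diff = box_pierce_q(np.diff(prices), 15)   # small: consistent with white noise
q_level = box_pierce_q(prices, 15)           # huge: the level is not white noise
```

As in the example, `q_diff` should be compared with the chi-squared(15) critical value at the 10 percent level, about 22.3; values below it are consistent with white noise.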

Even fish production can be seasonal, with troughs brought about in years of decreased catch in response to cyclical changes in ocean currents. Seasonal peaks and troughs are often easy to spot by direct observation of the time series. However, if the time series fluctuates considerably, seasonal peaks and troughs may not be distinguishable from the other fluctuations in the series. Recognition of seasonality is important because it provides information about "regularity" in the series that can aid us in making a forecast. Fortunately, recognition is made easier with the help of the autocorrelation function.

If a monthly time series y_t exhibits annual seasonality, the data points in the series should show some degree of correlation with the corresponding data points that lead or lag by 12 months. In other words, we would expect some correlation between y_t and y_{t−12}. Since y_t and y_{t−12} will be correlated, as will y_{t−12} and y_{t−24}, we should also see correlation between y_t and y_{t−24}. Similarly, there will be correlation between y_t and y_{t−36}, y_t and y_{t−48}, etc. These correlations should manifest themselves in the sample autocorrelation function ρ̂_k, which will exhibit peaks at k = 12, 24, 36, 48, etc. Thus we can identify seasonality by observing regular peaks in the autocorrelation function, even if such seasonality cannot be discerned in the time series itself.

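The lag-12 signature described above is easy to see on synthetic data. The following is an illustrative monthly series of our own construction (not the hog data), with an annual sine cycle plus noise:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(240)                                   # 20 years of monthly data
y = 10 * np.sin(2 * np.pi * t / 12) + rng.standard_normal(240)

# Sample autocorrelation function out to lag 48.
dev = y - y.mean()
denom = dev @ dev
acf = np.array([dev[k:] @ dev[:len(y) - k] / denom for k in range(49)])

# The annual cycle shows up as peaks at k = 12, 24, 36, 48 and
# troughs at the half-period lags k = 6, 18, 30, 42.
peaks = [round(acf[k], 2) for k in (12, 24, 36, 48)]
```

Even when the noise partly obscures the cycle in a plot of y itself, the peaks at multiples of 12 stand out clearly in the correlogram.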
Example 15.3 Hog Production  As an example, let us examine the monthly production of hogs in the United States, shown in Fig. 15.14. It would take a sharp eye to discern seasonality in this series. However, the seasonality of the series is apparent in its sample autocorrelation function, shown in Fig. 15.15. Note the peaks that occur at k = 12, 24, 36, etc., indicating annual cycles. A crude method of removing the annual cycles ("deseasonalizing" the series) is to take a 12-month difference of the series:

As is the case for most stock market prices, our best forecast of p_t is its most recent value, and (sadly) there is no model that can help us outperform the market.

    z_t = y_t − y_{t−12}

The sample autocorrelation function for this 12-month difference, shown in Fig. 15.16, does not exhibit strong seasonality. Of course, the 12-month difference represents an extremely simple time-

FIGURE 15.14  Hog production (in thousands of hogs per month). Time bounds: January 1962 to December 1971.

FIGURE 15.16  Sample autocorrelation function of y_t − y_{t−12}.

series model for hog production, since it accounts only for the annual cycle. We can complete this example by observing that the autocorrelation function in Fig. 15.16 declines only slowly, so that there is some doubt as to whether z_t is a stationary series. We therefore first-differenced this series, to obtain w_t = Δz_t = Δ(y_t − y_{t−12}). The sample autocorrelation function of this series, shown in Fig. 15.17, declines rapidly and remains small, so that we can be confident that w_t is a stationary, nonseasonal time series.
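The two-step transform used in this example, z_t = y_t − y_{t−12} followed by w_t = Δz_t, can be sketched directly. The data below are synthetic (trend plus annual cycle plus noise, standing in for the hog-production series); only the transforms follow the text:

```python
import numpy as np

def seasonal_difference(y, period=12):
    """Crude deseasonalizing: z_t = y_t - y_{t-period}."""
    y = np.asarray(y, dtype=float)
    return y[period:] - y[:-period]

rng = np.random.default_rng(4)
t = np.arange(240)
y = 0.5 * t + 8.0 * np.sin(2 * np.pi * t / 12) + rng.standard_normal(240)

z = seasonal_difference(y)   # the sinusoid cancels exactly: z_t = 6 + e_t - e_{t-12}
w = np.diff(z)               # first difference removes the remaining constant drift
```

A correlogram of `w` (computed as in the earlier sketches) would show no seasonal peaks and would die out quickly, matching Fig. 15.17.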

FIGURE 15.15  Sample autocorrelation function for hog production series.

FIGURE 15.17  Sample autocorrelation function of Δ(y_t − y_{t−12}).

15.3 TESTING FOR RANDOM WALKS

An important question is whether macroeconomic variables like GNP are truly stationary after detrending, or whether only first-differencing will yield stationary series. The answer has implications for our understanding of the economy and for forecasting. If a variable like GNP follows a random walk, the effects of a temporary shock (such as an increase in oil prices or a drop in government spending) will not dissipate.
In a provocative study, Charles Nelson and Charles Plosser found evidence that GNP and other macroeconomic time series behave like random walks.⁷ The work spawned a series of studies that investigate whether economic and financial variables are random walks or trend-reverting. Several of these studies show that many economic time series do appear to be random walks, or at least have random walk components.⁸ Most of these studies use unit root tests introduced by David Dickey and Wayne Fuller.⁹
Suppose we believe that a variable Y_t, which has been growing over time, can be described by the following equation:

    Y_t = α + βt + ρY_{t−1} + ε_t                                  (15.34)

One possibility is that Y_t has been growing because it has a positive trend (β > 0), but would be stationary after detrending (i.e., ρ < 1). In this case, Y_t could be used in a regression, and all the results and tests discussed in Part One of this book would apply. Another possibility is that Y_t has been growing because it follows a random walk with a positive drift (i.e., α > 0, β = 0, and ρ = 1). In this case, one would want to work with ΔY_t. Detrending would not make the series stationary, and inclusion of Y_t in a regression (even if detrended) could lead to spurious results.

One might think that Eq. (15.34) could be estimated by OLS, and the t statistic on ρ̂ could then be used to test whether ρ is significantly different from 1. However, as we saw in Chapter 9, if the true value of ρ is indeed 1, then the OLS estimator is biased toward zero. Thus the use of OLS in this manner can lead one to incorrectly reject the random walk hypothesis.
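The downward bias of OLS under ρ = 1 is easy to verify by simulation. The following is a sketch; the design (T = 100, an AR(1) regression with intercept, 2,000 replications) is our own choice:

```python
import numpy as np

rng = np.random.default_rng(5)
T, n_reps = 100, 2000
rho_hats = np.empty(n_reps)

for i in range(n_reps):
    y = np.cumsum(rng.standard_normal(T + 1))    # true rho = 1 (random walk)
    X = np.column_stack([np.ones(T), y[:-1]])    # regress y_t on a constant and y_{t-1}
    beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    rho_hats[i] = beta[1]

# The average estimate sits noticeably below the true value of 1,
# so a conventional t test would reject rho = 1 far too often.
print(rho_hats.mean())
```

This is exactly why the Dickey-Fuller tabulations below, rather than the standard t and F tables, must be used.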
Dickey and Fuller derived the distribution for the estimator ρ̂ that holds when ρ = 1, and generated statistics for a simple F test of the random walk hypothesis, i.e., of the hypothesis that β = 0 and ρ = 1. The Dickey-Fuller test is easy to
⁷ C. R. Nelson and C. I. Plosser, "Trends and Random Walks in Macroeconomic Time Series: Some Evidence and Implications," Journal of Monetary Economics, vol. 10, pp. 139–162, 1982.
⁸ Examples of these studies include J. Y. Campbell and N. G. Mankiw, "Are Output Fluctuations Transitory?," Quarterly Journal of Economics, vol. 102, pp. 857–880, 1987; J. Y. Campbell and N. G. Mankiw, "Permanent and Transitory Components in Macroeconomic Fluctuations," American Economic Review Papers and Proceedings, vol. 77, pp. 111–117, 1987; and G. W. Gardner and K. P. Kimbrough, "The Behavior of U.S. Tariff Rates," American Economic Review, vol. 79, pp. 211–218, 1989.
⁹ D. A. Dickey and W. A. Fuller, "Distribution of the Estimators for Autoregressive Time Series with a Unit Root," Journal of the American Statistical Association, vol. 74, pp. 427–431, 1979; D. A. Dickey and W. A. Fuller, "Likelihood Ratio Statistics for Autoregressive Time Series with a Unit Root," Econometrica, vol. 49, pp. 1057–1072, 1981; and W. A. Fuller, Introduction to Statistical Time Series (New York: Wiley, 1976).

TABLE 15.1  DISTRIBUTION OF F(α, β, ρ; α, 0, 1)

                            Probability of a smaller value
Sample
size       .01     .025    .05     .10     .90     .95     .975    .99
25         .74     .90     1.08    1.33    5.91    7.24    8.65    10.61
50         .76     .93     1.11    1.37    5.61    6.73    7.81    9.31
100        .76     .94     1.12    1.38    5.47    6.49    7.44    8.73
250        .76     .94     1.13    1.39    5.39    6.34    7.25    8.43
500        .76     .94     1.13    1.39    5.36    6.30    7.20    8.34
∞          .77     .94     1.13    1.39    5.34    6.25    7.16    8.27
Standard
error      .004    .004    .003    .004    .015    .020    .032    .058

Source: D. A. Dickey and W. A. Fuller, op. cit., Table VI, p. 1063, 1981.

perform, and can be applied to a more general version of Eq. (15.34). It works as follows.

Suppose Y_t can be described by the following equation:

    Y_t = α + βt + ρY_{t−1} + λΔY_{t−1} + ε_t                      (15.35)

where ΔY_{t−1} = Y_{t−1} − Y_{t−2}. (Additional lags of ΔY_t can be included on the right-hand side; the test is the same.) Using OLS, one first runs the unrestricted regression

    Y_t − Y_{t−1} = α + βt + (ρ − 1)Y_{t−1} + λΔY_{t−1}            (15.36)

and then the restricted regression

    Y_t − Y_{t−1} = α + λΔY_{t−1}                                  (15.37)

Then one calculates the standard F ratio to test whether the restrictions (β = 0, ρ = 1) hold.¹⁰ This ratio, however, is not distributed as a standard F distribution under the null hypothesis. Instead, one must use the distributions tabulated by Dickey and Fuller; critical values for this statistic are shown in Table 15.1.

Note that these critical values are much larger than those in the standard F tables. For example, if the calculated F ratio turns out to be 5.2 and there are 100 observations, we could not reject the random walk hypothesis at the 5 percent level, since from Table 15.1 the critical value is 6.49, even though a value of 5.2 would appear highly significant by the standard F tables.

¹⁰ Recall that F is calculated as follows:

    F = (N − k)(ESS_R − ESS_UR) / q(ESS_UR)

where ESS_R and ESS_UR are the sums of squared residuals of the restricted and unrestricted regressions, respectively, N is the number of observations, k is the number of estimated parameters in the unrestricted regression, and q is the number of parameter restrictions.
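Putting Eqs. (15.35) to (15.37) and footnote 10 together, the test statistic can be computed with two OLS fits. The following is a sketch of our own implementation (the simulated series are illustrative); the result is to be compared against the Table 15.1 critical values, not the standard F tables:

```python
import numpy as np

def ess(X, y):
    """Sum of squared residuals from an OLS regression of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

def dickey_fuller_f(Y):
    """F ratio for the restrictions beta = 0, rho = 1 in Eq. (15.35),
    built from the unrestricted (15.36) and restricted (15.37) regressions."""
    Y = np.asarray(Y, dtype=float)
    dY = np.diff(Y)
    lhs = dY[1:]                          # Y_t - Y_{t-1}
    dy_lag = dY[:-1]                      # Delta Y_{t-1}
    y_lag = Y[1:-1]                       # Y_{t-1}
    trend = np.arange(2, len(Y), dtype=float)
    ones = np.ones_like(lhs)

    ess_ur = ess(np.column_stack([ones, trend, y_lag, dy_lag]), lhs)
    ess_r = ess(np.column_stack([ones, dy_lag]), lhs)
    n, k, q = len(lhs), 4, 2              # observations, unrestricted params, restrictions
    return (n - k) * (ess_r - ess_ur) / (q * ess_ur)

rng = np.random.default_rng(6)
walk = np.cumsum(0.1 + rng.standard_normal(200))           # random walk with drift
trendy = 0.5 * np.arange(200) + rng.standard_normal(200)   # trend-stationary series
f_walk = dickey_fuller_f(walk)     # typically small: cannot reject the random walk
f_trendy = dickey_fuller_f(trendy) # large: rejects beta = 0, rho = 1
```

For a sample of about 200 observations, `f_walk` would be compared with the critical values in the n = 250 row of Table 15.1 (6.34 at the 5 percent level).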
