13.3. Linear Dynamical Systems

Figure 13.22  An illustration of a linear dynamical system being used to track a moving object. The blue points indicate the true positions of the object in a two-dimensional space at successive time steps, the green points denote noisy measurements of the positions, and the red crosses indicate the means of the inferred posterior distributions of the positions obtained by running the Kalman filtering equations. The covariances of the inferred positions are indicated by the red ellipses, which correspond to contours having one standard deviation.
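As a concrete companion to Figure 13.22, the following is a minimal sketch of the forward (Kalman filtering) pass that produces such tracking estimates. It is an illustration rather than the book's own code: the function name kalman_filter and the array layout are assumptions, the updates follow the standard filtering equations referenced later in this section ((13.89)-(13.91) together with the Kalman gain), and the prior (μ_0, V_0) is treated as the prediction at the first step. The returned quantities μ_n, V_n, and P_n = A V_n A^T + Γ are exactly what the backward recursion below consumes.

import numpy as np

def kalman_filter(X, A, Gamma, C, Sigma, mu0, V0):
    """Forward (filtering) pass of a linear dynamical system.

    X        : (N, obs_dim) array of observations x_1, ..., x_N
    A, Gamma : latent transition matrix and transition noise covariance
    C, Sigma : emission matrix and emission noise covariance
    mu0, V0  : parameters of the Gaussian prior over z_1

    Returns filtered means mu_n, covariances V_n, and the matrices
    P_n = A V_n A^T + Gamma needed by the backward (smoothing) pass.
    """
    N = X.shape[0]
    d = mu0.shape[0]
    mu = np.zeros((N, d))
    V = np.zeros((N, d, d))
    P = np.zeros((N, d, d))

    # First step: the prior plays the role of the predictive distribution.
    K = V0 @ C.T @ np.linalg.inv(C @ V0 @ C.T + Sigma)
    mu[0] = mu0 + K @ (X[0] - C @ mu0)
    V[0] = (np.eye(d) - K @ C) @ V0
    P[0] = A @ V[0] @ A.T + Gamma

    # Subsequent steps: predict with (A, Gamma), correct with the Kalman gain.
    for n in range(1, N):
        Ppred = P[n - 1]                      # A V_{n-1} A^T + Gamma
        K = Ppred @ C.T @ np.linalg.inv(C @ Ppred @ C.T + Sigma)
        mu[n] = A @ mu[n - 1] + K @ (X[n] - C @ (A @ mu[n - 1]))
        V[n] = (np.eye(d) - K @ C) @ Ppred
        P[n] = A @ V[n] @ A.T + Gamma
    return mu, V, P

For a tracking problem like that of Figure 13.22, A might encode, for example, constant-velocity dynamics and C might project the latent state onto the observed two-dimensional position; these specific choices are illustrative and are not taken from the text.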

We start from the backward recursion for β̂(z_n), which, for continuous latent variables, can be written in the form

c_{n+1} \hat{\beta}(z_n) = \int \hat{\beta}(z_{n+1}) \, p(x_{n+1} \mid z_{n+1}) \, p(z_{n+1} \mid z_n) \, dz_{n+1}.    (13.99)

We now multiply both sides of (13.99) by α̂(z_n) and substitute for p(x_{n+1} | z_{n+1}) and p(z_{n+1} | z_n) using (13.75) and (13.76). Then we make use of (13.89), (13.90) and (13.91), together with (13.98), and after some manipulation we obtain (Exercise 13.29)


\hat{\mu}_n = \mu_n + J_n \left( \hat{\mu}_{n+1} - A \mu_n \right)    (13.100)

\hat{V}_n = V_n + J_n \left( \hat{V}_{n+1} - P_n \right) J_n^T    (13.101)

where we have defined

J_n = V_n A^T (P_n)^{-1}    (13.102)

and we have made use of A V_n = P_n J_n^T. Note that these recursions require that the forward pass be completed first so that the quantities μ_n and V_n will be available for the backward pass.
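A minimal sketch of the backward recursion (13.100)-(13.102) follows, assuming the filtered means μ_n, covariances V_n, and matrices P_n = A V_n A^T + Γ have been stored during the forward pass (for example by the kalman_filter sketch above); the function name and data layout are illustrative. The recursion runs backwards from n = N - 1 and is initialized with μ̂_N = μ_N and V̂_N = V_N, since β̂(z_N) = 1 and the smoothed distribution at the final step therefore coincides with the filtered one.

import numpy as np

def kalman_smoother(mu, V, P, A):
    """Backward (smoothing) pass implementing (13.100)-(13.102).

    mu, V : filtered means and covariances from the forward pass
    P     : matrices P_n = A V_n A^T + Gamma from the forward pass

    Returns smoothed means mu_hat, covariances V_hat, and the gain
    matrices J_n, which are reused when forming the pairwise
    statistics for EM (see (13.104)).
    """
    N, d = mu.shape
    mu_hat = np.zeros_like(mu)
    V_hat = np.zeros_like(V)
    J = np.zeros((N, d, d))

    # At n = N the smoothed and filtered distributions coincide.
    mu_hat[N - 1] = mu[N - 1]
    V_hat[N - 1] = V[N - 1]

    for n in range(N - 2, -1, -1):
        J[n] = V[n] @ A.T @ np.linalg.inv(P[n])                    # (13.102)
        mu_hat[n] = mu[n] + J[n] @ (mu_hat[n + 1] - A @ mu[n])     # (13.100)
        V_hat[n] = V[n] + J[n] @ (V_hat[n + 1] - P[n]) @ J[n].T    # (13.101)
    return mu_hat, V_hat, J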
For the EM algorithm, we also require the pairwise posterior marginals, which can be obtained from (13.65) in the form

\xi(z_{n-1}, z_n) = (c_n)^{-1} \, \hat{\alpha}(z_{n-1}) \, p(x_n \mid z_n) \, p(z_n \mid z_{n-1}) \, \hat{\beta}(z_n)
                  = \frac{\mathcal{N}(z_{n-1} \mid \mu_{n-1}, V_{n-1}) \, \mathcal{N}(z_n \mid A z_{n-1}, \Gamma) \, \mathcal{N}(x_n \mid C z_n, \Sigma) \, \mathcal{N}(z_n \mid \hat{\mu}_n, \hat{V}_n)}{c_n \, \hat{\alpha}(z_n)}.    (13.103)

Substituting for α̂(z_n) using (13.84) and rearranging, we see that ξ(z_{n-1}, z_n) is a Gaussian whose mean has components given by the means of γ(z_{n-1}) and γ(z_n), namely μ̂_{n-1} and μ̂_n, and whose covariance between z_n and z_{n-1} is given (Exercise 13.31) by

\operatorname{cov}[z_n, z_{n-1}] = J_{n-1} \hat{V}_n.    (13.104)
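To make the role of (13.104) in the E step concrete, the following sketch collects the cross-covariances J_{n-1} V̂_n and combines them with the smoothed means to form pairwise second moments. The function name is illustrative, and the orientation convention assumed here is that J_{n-1} V̂_n has z_{n-1} indexing rows and z_n indexing columns, so that adding the outer product μ̂_{n-1} μ̂_n^T yields E[z_{n-1} z_n^T]; its transpose gives E[z_n z_{n-1}^T].

import numpy as np

def pairwise_statistics(mu_hat, V_hat, J):
    """Pairwise E-step statistics from the smoothed quantities.

    mu_hat, V_hat : smoothed means and covariances
    J             : gain matrices J_n from the backward pass

    Returns the cross-covariances of (13.104) and the corresponding
    pairwise second moments used in the M step of EM.
    """
    N, d = mu_hat.shape
    cross_cov = np.zeros((N - 1, d, d))
    second_moment = np.zeros((N - 1, d, d))
    for n in range(1, N):
        # Cross-covariance between z_{n-1} and z_n, equation (13.104);
        # the row/column orientation is the assumption stated above.
        cross_cov[n - 1] = J[n - 1] @ V_hat[n]
        second_moment[n - 1] = cross_cov[n - 1] + np.outer(mu_hat[n - 1], mu_hat[n])
    return cross_cov, second_moment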