Pattern Recognition and Machine Learning

2. PROBABILITY DISTRIBUTIONS

Mahalanobis distance $\Delta$ is given by

$$V_D\,|\Sigma|^{1/2}\,\Delta^D \tag{2.286}$$

where $V_D$ is the volume of the unit sphere in $D$ dimensions, and the Mahalanobis
distance is defined by (2.44).
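
A quick way to convince yourself of (2.286) before deriving it is a Monte Carlo estimate: sample points uniformly in a box that encloses the ellipsoid of constant Mahalanobis distance, and compare the measured volume with $V_D|\Sigma|^{1/2}\Delta^D$. The sketch below is an illustrative NumPy check, not part of the exercise; it uses the standard unit-ball volume $V_D = \pi^{D/2}/\Gamma(D/2+1)$, and the choices of $D$, $\Delta$, and $\Sigma$ are arbitrary.

```python
# Monte Carlo sanity check of (2.286): the volume enclosed by the surface of
# constant Mahalanobis distance Delta should equal V_D |Sigma|^{1/2} Delta^D.
import numpy as np
from math import gamma, pi

rng = np.random.default_rng(0)
D, Delta, n = 2, 1.5, 2_000_000

# Arbitrary symmetric positive-definite covariance matrix.
A = rng.standard_normal((D, D))
Sigma = A @ A.T + D * np.eye(D)
Sigma_inv = np.linalg.inv(Sigma)

# The ellipsoid x^T Sigma^{-1} x <= Delta^2 lies inside a cube of
# half-width Delta * sqrt(lambda_max), since x^T Sigma^{-1} x >= |x|^2 / lambda_max.
R = Delta * np.sqrt(np.linalg.eigvalsh(Sigma).max())
x = rng.uniform(-R, R, size=(n, D))

# Fraction of cube samples falling inside the ellipsoid, scaled by cube volume.
inside = np.einsum('ni,ij,nj->n', x, Sigma_inv, x) <= Delta**2
mc_volume = inside.mean() * (2 * R)**D

V_D = pi**(D / 2) / gamma(D / 2 + 1)   # volume of the unit sphere in D dimensions
print(mc_volume, V_D * np.sqrt(np.linalg.det(Sigma)) * Delta**D)
```

The two printed numbers should agree to within the Monte Carlo error.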

2.24 ( ) www Prove the identity (2.76) by multiplying both sides by the matrix

$$\begin{pmatrix} A & B \\ C & D \end{pmatrix} \tag{2.287}$$

and making use of the definition (2.77).
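
Equations (2.76) and (2.77) are not reproduced here; the sketch below assumes their standard form, in which the inverse of the partitioned matrix (2.287) is expressed via the Schur complement $M = (A - BD^{-1}C)^{-1}$. A numerical spot check on random blocks (all sizes arbitrary) can catch sign errors before attempting the proof:

```python
# Numerical spot check for Exercise 2.24, assuming the standard
# partitioned-inverse identity with Schur complement M = (A - B D^{-1} C)^{-1}.
import numpy as np

rng = np.random.default_rng(1)
p, q = 3, 2
A, B = rng.standard_normal((p, p)), rng.standard_normal((p, q))
C, D = rng.standard_normal((q, p)), rng.standard_normal((q, q))

Dinv = np.linalg.inv(D)
M = np.linalg.inv(A - B @ Dinv @ C)          # assumed definition (2.77)

# Assumed right-hand side of (2.76), built block by block.
top = np.hstack([M, -M @ B @ Dinv])
bottom = np.hstack([-Dinv @ C @ M, Dinv + Dinv @ C @ M @ B @ Dinv])
rhs = np.vstack([top, bottom])

full = np.block([[A, B], [C, D]])
print(np.allclose(full @ rhs, np.eye(p + q)))   # True: rhs inverts (2.287)
```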

2.25 ( ) In Sections 2.3.1 and 2.3.2, we considered the conditional and marginal distributions for a multivariate Gaussian. More generally, we can consider a partitioning of the components of $x$ into three groups $x_a$, $x_b$, and $x_c$, with a corresponding partitioning of the mean vector $\mu$ and of the covariance matrix $\Sigma$ in the form

$$\mu = \begin{pmatrix} \mu_a \\ \mu_b \\ \mu_c \end{pmatrix}, \qquad
\Sigma = \begin{pmatrix} \Sigma_{aa} & \Sigma_{ab} & \Sigma_{ac} \\
\Sigma_{ba} & \Sigma_{bb} & \Sigma_{bc} \\
\Sigma_{ca} & \Sigma_{cb} & \Sigma_{cc} \end{pmatrix}. \tag{2.288}$$

By making use of the results of Section 2.3, find an expression for the conditional distribution $p(x_a|x_b)$ in which $x_c$ has been marginalized out.

2.26 ( ) A very useful result from linear algebra is the Woodbury matrix inversion formula, given by

$$(A + BCD)^{-1} = A^{-1} - A^{-1}B\,(C^{-1} + DA^{-1}B)^{-1}DA^{-1}. \tag{2.289}$$

By multiplying both sides by $(A + BCD)$, prove the correctness of this result.
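
Since every quantity in (2.289) is explicit, the identity is easy to spot-check numerically before proving it. The sketch below uses arbitrary random matrices of compatible shapes; the diagonal boosts are there only to keep $A$ and $C$ comfortably invertible:

```python
# Numerical check of the Woodbury identity (2.289) on random matrices.
import numpy as np

rng = np.random.default_rng(3)
n, k = 5, 2
A = rng.standard_normal((n, n)) + n * np.eye(n)   # well-conditioned n x n
B = rng.standard_normal((n, k))
C = rng.standard_normal((k, k)) + k * np.eye(k)   # invertible k x k
D = rng.standard_normal((k, n))

Ainv, Cinv = np.linalg.inv(A), np.linalg.inv(C)
lhs = np.linalg.inv(A + B @ C @ D)
rhs = Ainv - Ainv @ B @ np.linalg.inv(Cinv + D @ Ainv @ B) @ D @ Ainv
print(np.allclose(lhs, rhs))   # True
```

The practical appeal of (2.289) is that when $k \ll n$, the only new inverse required on the right-hand side is $k \times k$ rather than $n \times n$.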

2.27 ( ) Let $x$ and $z$ be two independent random vectors, so that $p(x, z) = p(x)p(z)$. Show that the mean of their sum $y = x + z$ is given by the sum of the means of each of the variables separately. Similarly, show that the covariance matrix of $y$ is given by the sum of the covariance matrices of $x$ and $z$. Confirm that this result agrees with that of Exercise 1.10.
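
A minimal simulation makes the claim concrete: draw independent samples of $x$ and $z$, form $y = x + z$, and compare the empirical mean and covariance of $y$ with the sums of the individual means and covariances. The Gaussian sampling distributions and all parameters below are arbitrary choices for the demonstration:

```python
# Monte Carlo illustration of Exercise 2.27: for independent x and z,
# E[x + z] = E[x] + E[z] and cov[x + z] = cov[x] + cov[z].
import numpy as np

rng = np.random.default_rng(4)
n, d = 1_000_000, 3
mu_x, mu_z = np.array([1., 2., 3.]), np.array([-1., 0., 2.])
Lx, Lz = rng.standard_normal((d, d)), rng.standard_normal((d, d))

x = mu_x + rng.standard_normal((n, d)) @ Lx.T   # cov[x] = Lx Lx^T
z = mu_z + rng.standard_normal((n, d)) @ Lz.T   # cov[z] = Lz Lz^T, independent
y = x + z

print(np.allclose(y.mean(axis=0), mu_x + mu_z, atol=1e-2))        # True
print(np.allclose(np.cov(y.T), Lx @ Lx.T + Lz @ Lz.T, atol=5e-2)) # True
```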

2.28 ( ) www Consider a joint distribution over the variable

$$z = \begin{pmatrix} x \\ y \end{pmatrix} \tag{2.290}$$

whose mean and covariance are given by (2.108) and (2.105) respectively. By making use of the results (2.92) and (2.93), show that the marginal distribution $p(x)$ is given by (2.99). Similarly, by making use of the results (2.81) and (2.82), show that the conditional distribution $p(y|x)$ is given by (2.100).
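
Equations (2.81)–(2.108) are not reproduced here; the sketch below assumes the book's linear-Gaussian setup, $p(x) = \mathcal{N}(x\,|\,\mu, \Lambda^{-1})$ and $p(y\,|\,x) = \mathcal{N}(y\,|\,Ax + b, L^{-1})$, under which (2.108) and (2.105) give the joint mean $(\mu,\, A\mu + b)$ and the block covariance constructed below. Applying the generic partitioned marginal and conditional formulas to this joint should recover exactly $\mathcal{N}(x\,|\,\mu, \Lambda^{-1})$ and $\mathcal{N}(y\,|\,Ax + b, L^{-1})$, which is what the exercise asks you to show analytically:

```python
# Numerical companion to Exercise 2.28 under the assumed linear-Gaussian setup.
import numpy as np

rng = np.random.default_rng(5)
dx, dy = 3, 2
mu, b = rng.standard_normal(dx), rng.standard_normal(dy)
A = rng.standard_normal((dy, dx))
Px = rng.standard_normal((dx, dx)); Lam = Px @ Px.T + dx * np.eye(dx)
Py = rng.standard_normal((dy, dy)); L = Py @ Py.T + dy * np.eye(dy)
Lam_inv, L_inv = np.linalg.inv(Lam), np.linalg.inv(L)

# Joint mean and covariance of z = (x, y), per the assumed (2.108)/(2.105).
m_z = np.concatenate([mu, A @ mu + b])
S_z = np.block([[Lam_inv,             Lam_inv @ A.T],
                [A @ Lam_inv, L_inv + A @ Lam_inv @ A.T]])

# Marginal p(x): keep the x-blocks of the mean and covariance.
print(np.allclose(m_z[:dx], mu), np.allclose(S_z[:dx, :dx], Lam_inv))

# Conditional p(y | x) via the generic partitioned conditional formulas:
x0 = rng.standard_normal(dx)                      # arbitrary observed x
K = S_z[dx:, :dx] @ np.linalg.inv(S_z[:dx, :dx])
m_cond = m_z[dx:] + K @ (x0 - m_z[:dx])
S_cond = S_z[dx:, dx:] - K @ S_z[:dx, dx:]
print(np.allclose(m_cond, A @ x0 + b), np.allclose(S_cond, L_inv))  # True True
```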