Pattern Recognition and Machine Learning


602    12. CONTINUOUS LATENT VARIABLES


12.22 (**) Write down an expression for the expected complete-data log likelihood function for the factor analysis model, and hence derive the corresponding M step equations (12.69) and (12.70).

12.23 (*) www  Draw a directed probabilistic graphical model representing a discrete mixture of probabilistic PCA models in which each PCA model has its own values of W, μ, and σ². Now draw a modified graph in which these parameter values are shared between the components of the mixture.

12.24 (***) We saw in Section 2.3.7 that Student's t-distribution can be viewed as an infinite mixture of Gaussians in which we marginalize with respect to a continuous latent variable. By exploiting this representation, formulate an EM algorithm for maximizing the log likelihood function for a multivariate Student's t-distribution given an observed set of data points, and derive the forms of the E and M step equations.
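As a hedged sketch of what the resulting algorithm looks like (not the book's full derivation), the following NumPy code runs EM for the mean and scale matrix of a multivariate Student's t, with the degrees of freedom ν held fixed for simplicity; the E-step weights follow from the Gaussian–Gamma representation of Section 2.3.7.

```python
import numpy as np

def student_t_em(X, nu=4.0, n_iter=50):
    """EM for the mean mu and scale matrix Sigma of a multivariate
    Student's t, with the degrees of freedom nu held fixed (an assumed
    simplification; nu itself can also be updated in the M step).
    Each x_n is modeled as Gaussian with a Gamma-distributed latent
    precision scaling, which when marginalized gives the t density."""
    N, D = X.shape
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False) + 1e-6 * np.eye(D)
    for _ in range(n_iter):
        # E step: expected latent scaling for each point,
        # E[eta_n] = (nu + D) / (nu + delta_n^2), where delta_n^2 is the
        # squared Mahalanobis distance of x_n from mu under Sigma.
        diff = X - mu
        d2 = np.einsum('ni,ij,nj->n', diff, np.linalg.inv(Sigma), diff)
        w = (nu + D) / (nu + d2)
        # M step: weighted mean and weighted scatter matrix.
        mu = (w[:, None] * X).sum(axis=0) / w.sum()
        diff = X - mu
        Sigma = (w[:, None, None]
                 * np.einsum('ni,nj->nij', diff, diff)).sum(axis=0) / N
    return mu, Sigma
```

Because the weights w downweight points far from the current mean, the resulting estimates are robust to outliers in a way that the Gaussian maximum likelihood solution is not.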

12.25 (**) www  Consider a linear-Gaussian latent-variable model having a latent space distribution p(z) = N(z|0, I) and a conditional distribution for the observed variable p(x|z) = N(x|Wz + μ, Φ) where Φ is an arbitrary symmetric, positive-definite noise covariance matrix. Now suppose that we make a nonsingular linear transformation of the data variables x → Ax, where A is a D × D matrix. If μ_ML, W_ML, and Φ_ML represent the maximum likelihood solution corresponding to the original untransformed data, show that Aμ_ML, AW_ML, and AΦ_ML Aᵀ will represent the corresponding maximum likelihood solution for the transformed data set. Finally, show that the form of the model is preserved in two cases: (i) A is a diagonal matrix and Φ is a diagonal matrix. This corresponds to the case of factor analysis. The transformed Φ remains diagonal, and hence factor analysis is covariant under component-wise re-scaling of the data variables; (ii) A is orthogonal and Φ is proportional to the unit matrix so that Φ = σ²I. This corresponds to probabilistic PCA. The transformed Φ matrix remains proportional to the unit matrix, and hence probabilistic PCA is covariant under a rotation of the axes of data space, as is the case for conventional PCA.
12.26 (**) Show that any vector aᵢ that satisfies (12.80) will also satisfy (12.79). Also, show that for any solution of (12.80) having eigenvalue λ, we can add any multiple of an eigenvector of K having zero eigenvalue, and obtain a solution to (12.79) that also has eigenvalue λ. Finally, show that such modifications do not affect the principal-component projection given by (12.82).

12.27 (**) Show that the conventional linear PCA algorithm is recovered as a special case of kernel PCA if we choose the linear kernel function given by k(x, x') = xᵀx'.
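The claimed equivalence is easy to check numerically. The sketch below (using synthetic data of my own choosing, and assuming the standard result that for unit eigenvectors aᵢ of the centered Gram matrix the projection of xₙ onto component i is √λᵢ aᵢₙ) compares the top-two projections from conventional PCA with those from kernel PCA under the linear kernel.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3)) @ rng.standard_normal((3, 3))
Xc = X - X.mean(axis=0)          # center the data
N = len(Xc)

# Conventional linear PCA: eigendecompose the sample covariance S.
S = Xc.T @ Xc / N
_, U = np.linalg.eigh(S)
proj_pca = Xc @ U[:, ::-1][:, :2]    # top-2 principal projections

# Kernel PCA with the linear kernel k(x, x') = x^T x': the centered
# Gram matrix K = Xc Xc^T shares its nonzero eigenvalues with N*S, and
# the projections come directly from its unit eigenvectors a_i, scaled
# by the square root of the corresponding eigenvalue.
K = Xc @ Xc.T
evals_K, A = np.linalg.eigh(K)
proj_kpca = (A[:, ::-1][:, :2]
             * np.sqrt(np.maximum(evals_K[::-1][:2], 0.0)))

# The two sets of projections agree up to an arbitrary sign per component.
err = max(
    min(np.max(np.abs(proj_pca[:, j] - s * proj_kpca[:, j]))
        for s in (1, -1))
    for j in range(2)
)
```

The sign ambiguity arises because an eigenvector is only defined up to its sign, in both the covariance and the Gram formulations.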

12.28 (**) www  Use the transformation property (1.27) of a probability density under a change of variable to show that any density p(y) can be obtained from a fixed density q(x) that is everywhere nonzero by making a nonlinear change of variable y = f(x) in which f(x) is a monotonic function so that 0 ≤ f'(x) < ∞. Write down the differential equation satisfied by f(x) and draw a diagram illustrating the transformation of the density.
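A minimal numerical sketch of this result, assuming a concrete example of my own choosing (mapping a uniform q onto an exponential p): the required f composes the CDF of q with the inverse CDF of p, and the code checks the differential equation p(f(x)) f'(x) = q(x) on a grid.

```python
import numpy as np

# For q(x) = 1 on (0, 1) and p(y) = exp(-y) on (0, inf) we have
# Q(x) = x and P(y) = 1 - exp(-y), so f(x) = P^{-1}(Q(x)) = -log(1 - x),
# with f'(x) = 1 / (1 - x) >= 0, i.e. f is monotonic as required.
x = np.linspace(0.01, 0.99, 99)
f = -np.log1p(-x)
f_prime = 1.0 / (1.0 - x)

# Check the differential equation p(f(x)) f'(x) = q(x) on the grid.
residual = np.exp(-f) * f_prime - 1.0
assert np.max(np.abs(residual)) < 1e-12

# Sampling check: pushing uniform draws through f yields
# exponentially distributed draws (this is inverse-CDF sampling).
rng = np.random.default_rng(1)
y = -np.log1p(-rng.random(200_000))
```

The same construction works for any target p with an invertible CDF, which is exactly the content of the exercise.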