Pattern Recognition and Machine Learning

where the $\{z_{ni}\}$ depend on the particular data point, whereas the $\{b_i\}$ are constants
that are the same for all data points. We are free to choose the $\{\mathbf{u}_i\}$, the $\{z_{ni}\}$, and
the $\{b_i\}$ so as to minimize the distortion introduced by the reduction in dimensionality.
As our distortion measure, we shall use the squared distance between the original data
point $\mathbf{x}_n$ and its approximation $\widetilde{\mathbf{x}}_n$, averaged over the data set, so that our goal
is to minimize

    J = \frac{1}{N} \sum_{n=1}^{N} \left\| \mathbf{x}_n - \widetilde{\mathbf{x}}_n \right\|^2 .        (12.11)
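A minimal NumPy sketch may help make this setup concrete. The variable names (X, U, Z, b) are illustrative and not from the text, and the particular choices made for $z_{ni}$ and $b_i$ anticipate the optimal values derived in (12.12) and (12.13) below.

    import numpy as np

    # Build the approximation of each data point in an M-dimensional subspace
    # spanned by the first M columns of an orthonormal basis U, then evaluate
    # the distortion J of (12.11). Illustrative data and basis only.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))                  # N = 100 data points, D = 3
    M = 2                                          # dimensionality of the principal subspace

    U, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # columns U[:, i] form an orthonormal basis
    x_bar = X.mean(axis=0)

    Z = X @ U[:, :M]                               # z_ni = x_n^T u_i   (cf. 12.12)
    b = x_bar @ U[:, M:]                           # b_i  = x_bar^T u_i (cf. 12.13)
    X_tilde = Z @ U[:, :M].T + b @ U[:, M:].T      # reconstruction of every data point

    J = np.mean(np.sum((X - X_tilde) ** 2, axis=1))  # distortion (12.11)
    print(J)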

Consider first of all the minimization with respect to the quantities $\{z_{ni}\}$. Substituting
for $\widetilde{\mathbf{x}}_n$, setting the derivative with respect to $z_{nj}$ to zero, and making use of
the orthonormality conditions, we obtain

    z_{nj} = \mathbf{x}_n^{\mathrm T} \mathbf{u}_j , \qquad j = 1, \ldots, M.        (12.12)

Similarly, setting the derivative of $J$ with respect to $b_i$ to zero, and again making
use of the orthonormality relations, gives

    b_j = \bar{\mathbf{x}}^{\mathrm T} \mathbf{u}_j , \qquad j = M+1, \ldots, D.        (12.13)
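For completeness, the following block spells out the two stationarity conditions behind (12.12) and (12.13). It is an editorial sketch rather than part of the original derivation, and it assumes the expansion $\widetilde{\mathbf{x}}_n = \sum_{i=1}^{M} z_{ni}\mathbf{u}_i + \sum_{i=M+1}^{D} b_i\mathbf{u}_i$ of (12.10) together with the orthonormality $\mathbf{u}_i^{\mathrm T}\mathbf{u}_j = \delta_{ij}$.

    \frac{\partial J}{\partial z_{nj}}
      = -\frac{2}{N}\,(\mathbf{x}_n - \widetilde{\mathbf{x}}_n)^{\mathrm T}\mathbf{u}_j
      = -\frac{2}{N}\bigl(\mathbf{x}_n^{\mathrm T}\mathbf{u}_j - z_{nj}\bigr) = 0
      \quad\Longrightarrow\quad z_{nj} = \mathbf{x}_n^{\mathrm T}\mathbf{u}_j ,

    \frac{\partial J}{\partial b_j}
      = -\frac{2}{N}\sum_{n=1}^{N}(\mathbf{x}_n - \widetilde{\mathbf{x}}_n)^{\mathrm T}\mathbf{u}_j
      = -2\bigl(\bar{\mathbf{x}}^{\mathrm T}\mathbf{u}_j - b_j\bigr) = 0
      \quad\Longrightarrow\quad b_j = \bar{\mathbf{x}}^{\mathrm T}\mathbf{u}_j .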


If we substitute for $z_{ni}$ and $b_i$, and make use of the general
expansion (12.9), we obtain

    \mathbf{x}_n - \widetilde{\mathbf{x}}_n = \sum_{i=M+1}^{D} \bigl\{ (\mathbf{x}_n - \bar{\mathbf{x}})^{\mathrm T} \mathbf{u}_i \bigr\} \mathbf{u}_i        (12.14)

from which we see that the displacement vector from $\mathbf{x}_n$ to $\widetilde{\mathbf{x}}_n$ lies in the space
orthogonal to the principal subspace, because it is a linear combination of $\{\mathbf{u}_i\}$ for
$i = M+1, \ldots, D$, as illustrated in Figure 12.2. This is to be expected because the
projected points $\widetilde{\mathbf{x}}_n$ must lie within the principal subspace, but we can move them
freely within that subspace, and so the minimum error is given by the orthogonal
projection.
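This orthogonality is easy to confirm numerically. The short NumPy sketch below (illustrative data and names, not from the text) builds the reconstruction with the optimal $z_{ni}$ and $b_i$ and checks that the residual has no component along the retained directions.

    import numpy as np

    # Check that x_n - x~_n of (12.14) is orthogonal to the principal subspace.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(50, 4))                   # N = 50, D = 4
    M = 2
    U, _ = np.linalg.qr(rng.normal(size=(4, 4)))   # orthonormal columns u_1, ..., u_D
    x_bar = X.mean(axis=0)

    X_tilde = (X @ U[:, :M]) @ U[:, :M].T + (x_bar @ U[:, M:]) @ U[:, M:].T
    residual = X - X_tilde

    # Projections onto the retained directions vanish up to rounding error.
    print(np.abs(residual @ U[:, :M]).max())       # of order 1e-15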
We therefore obtain an expression for the distortion measure $J$ as a function
purely of the $\{\mathbf{u}_i\}$ in the form

    J = \frac{1}{N} \sum_{n=1}^{N} \sum_{i=M+1}^{D} \bigl( \mathbf{x}_n^{\mathrm T} \mathbf{u}_i - \bar{\mathbf{x}}^{\mathrm T} \mathbf{u}_i \bigr)^2 = \sum_{i=M+1}^{D} \mathbf{u}_i^{\mathrm T} \mathbf{S} \mathbf{u}_i .        (12.15)
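The equivalence of the two forms in (12.15), with $\mathbf{S}$ the data covariance matrix, can be checked numerically. The sketch below uses illustrative names, and the final lines anticipate the eigenvector solution discussed next, under which the minimum of $J$ is the sum of the $D - M$ smallest eigenvalues of $\mathbf{S}$.

    import numpy as np

    # Verify numerically that the two forms of J in (12.15) agree for an
    # arbitrary orthonormal basis, using illustrative data.
    rng = np.random.default_rng(2)
    X = rng.normal(size=(200, 5))                    # N = 200, D = 5
    M = 2
    x_bar = X.mean(axis=0)
    S = np.cov(X, rowvar=False, bias=True)           # S = (1/N) sum_n (x_n - x_bar)(x_n - x_bar)^T

    U, _ = np.linalg.qr(rng.normal(size=(5, 5)))     # any orthonormal basis
    J_data = np.mean(np.sum(((X - x_bar) @ U[:, M:]) ** 2, axis=1))   # left-hand form of (12.15)
    J_cov = np.trace(U[:, M:].T @ S @ U[:, M:])                       # right-hand form of (12.15)
    print(np.isclose(J_data, J_cov))                 # True

    # With eigenvectors of S as the basis, J attains its minimum: the sum of
    # the D - M smallest eigenvalues (the result established later in the text).
    eigvals = np.linalg.eigvalsh(S)                  # ascending order
    J_min = eigvals[: 5 - M].sum()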

There remains the task of minimizing $J$ with respect to the $\{\mathbf{u}_i\}$, which must
be a constrained minimization otherwise we will obtain the vacuous result $\mathbf{u}_i = \mathbf{0}$.
The constraints arise from the orthonormality conditions and, as we shall see, the
solution will be expressed in terms of the eigenvector expansion of the covariance
matrix. Before considering a formal solution, let us try to obtain some intuition about
the result by considering the case of a two-dimensional data space $D = 2$ and a
one-dimensional principal subspace $M = 1$. We have to choose a direction $\mathbf{u}_2$ so as to
