Understanding Machine Learning: From Theory to Algorithms


C.2 Eigenvalues and Eigenvectors


Let A ∈ R^{d,d} be a matrix. A non-zero vector u is an eigenvector of A with a
corresponding eigenvalue λ if

Au = λu.
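As an illustrative sketch (not part of the text), the defining equation Au = λu can be checked numerically with NumPy; the matrix and eigenpair below are chosen for the example:

```python
import numpy as np

# A small symmetric matrix with a known eigenpair:
# u = (1, 1)/sqrt(2) is an eigenvector of A with eigenvalue 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
u = np.array([1.0, 1.0]) / np.sqrt(2.0)
lam = 3.0

# Verify the defining equation Au = lambda * u.
assert np.allclose(A @ u, lam * u)
```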

theorem C.1 (Spectral Decomposition) If A ∈ R^{d,d} is a symmetric matrix of
rank k, then there exists an orthonormal basis of R^d, u_1, ..., u_d, such that each
u_i is an eigenvector of A. Furthermore, A can be written as

A = ∑_{i=1}^d λ_i u_i u_i^⊤,

where each λ_i is the eigenvalue corresponding to the eigenvector u_i. This can
be written equivalently as A = U D U^⊤, where the columns of U are the vectors
u_1, ..., u_d, and D is a diagonal matrix with D_{i,i} = λ_i and D_{i,j} = 0 for i ≠ j.
Finally, the number of λ_i which are nonzero is the rank of the matrix, the
eigenvectors which correspond to the nonzero eigenvalues span the range of A,
and the eigenvectors which correspond to zero eigenvalues span the null space of
A.
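A sketch of the theorem's claims with NumPy (the rank-2 matrix below is an assumption chosen for illustration); `np.linalg.eigh` returns eigenvalues and an orthonormal eigenvector basis for a symmetric matrix:

```python
import numpy as np

# A symmetric matrix of rank 2 (the last row/column is zero).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 0.0]])

eigvals, U = np.linalg.eigh(A)  # eigenvalues ascending, columns of U orthonormal
D = np.diag(eigvals)

# A = U D U^T ...
assert np.allclose(A, U @ D @ U.T)

# ... equivalently, A = sum_i lambda_i u_i u_i^T.
A_sum = sum(lam * np.outer(U[:, i], U[:, i]) for i, lam in enumerate(eigvals))
assert np.allclose(A, A_sum)

# The number of nonzero eigenvalues equals the rank.
assert np.sum(~np.isclose(eigvals, 0.0)) == np.linalg.matrix_rank(A)
```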

C.3 Positive definite matrices


A symmetric matrix A ∈ R^{d,d} is positive definite if all its eigenvalues are positive.
A is positive semidefinite if all its eigenvalues are nonnegative.

theorem C.2 Let A ∈ R^{d,d} be a symmetric matrix. Then, the following are
equivalent definitions of positive semidefiniteness of A:


  • All the eigenvalues of A are nonnegative.

  • For every vector u, ⟨u, Au⟩ ≥ 0.

  • There exists a matrix B such that A = B B^⊤.
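The equivalence can be exercised numerically; a minimal sketch, assuming NumPy, that builds a matrix via the third characterization (A = B B^⊤) and spot-checks the other two (the small tolerance absorbs floating-point error):

```python
import numpy as np

rng = np.random.default_rng(0)

# Third characterization: A = B B^T is always positive semidefinite.
B = rng.standard_normal((4, 3))
A = B @ B.T

# First characterization: all eigenvalues nonnegative (up to float error).
eigvals = np.linalg.eigvalsh(A)
assert np.all(eigvals >= -1e-10)

# Second characterization: <u, Au> >= 0, spot-checked on random vectors u.
for _ in range(100):
    u = rng.standard_normal(4)
    assert u @ A @ u >= -1e-10
```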


C.4 Singular Value Decomposition (SVD)


Let A ∈ R^{m,n} be a matrix of rank r. When m ≠ n, the eigenvalue decomposition
given in Theorem C.1 cannot be applied. We will describe another decomposition
of A, which is called Singular Value Decomposition, or SVD for short.
Unit vectors v ∈ R^n and u ∈ R^m are called right and left singular vectors of
A with corresponding singular value σ > 0 if

Av = σu  and  A^⊤u = σv.

We first show that if we can find r orthonormal singular vectors with positive
singular values, then we can decompose A = U D V^⊤, with the columns of U and
V containing the left and right singular vectors, and D being a diagonal r×r
matrix with the singular values on its diagonal.
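The decomposition can be illustrated with NumPy (the rectangular matrix below is an assumption for the example); `np.linalg.svd` with `full_matrices=False` returns exactly the reduced form A = U D V^⊤ described above:

```python
import numpy as np

# A rectangular matrix (m != n), where Theorem C.1 does not apply.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 0.0]])  # shape (2, 3), rank 2

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# A = U D V^T with D = diag(sigma_1, ..., sigma_r).
assert np.allclose(A, U @ np.diag(s) @ Vt)

# Each triple (u_i, sigma_i, v_i) satisfies both defining equations:
# Av = sigma * u and A^T u = sigma * v.
for i in range(len(s)):
    u, v, sigma = U[:, i], Vt[i, :], s[i]
    assert np.allclose(A @ v, sigma * u)
    assert np.allclose(A.T @ u, sigma * v)
```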