Understanding Machine Learning: From Theory to Algorithms
22.8 Exercises
(might) find a solution whose k-means objective is at least t·OPT, where OPT is the minimum k-means objective. ...
Prove that for every k > 1 the k-diam clustering function defined in the previous exercise is not a center-based ...
23 Dimensionality Reduction
Dimensionality reduction is the process of taking data in a high dimensional space and mapping it in ...
23.1 Principal Component Analysis (PCA)
Let x_1, ..., x_m be m vectors in R^d. We would like to reduce the ...
We further simplify the optimization problem by using the following elementary algebraic ...
It is not hard to verify (see Exercise 2) that the right-hand side equals ∑_{j=1}^n D_{j,j}. We have th ...
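The derivation above reduces PCA to an eigenvalue problem: the optimal compression directions are the leading eigenvectors of X^T X. A minimal sketch in NumPy (the function name, data, and dimensions are illustrative assumptions, not from the text; the data is assumed already centered):

```python
import numpy as np

def pca(X, n_components):
    """Project rows of X onto the top n_components principal directions.

    X: (m, d) array of m examples in R^d (assumed already centered).
    Returns the (d, n_components) matrix of leading eigenvectors of X^T X
    and the reduced (m, n_components) representation of the data.
    """
    A = X.T @ X
    # eigh returns eigenvalues in ascending order, so the last columns
    # are the eigenvectors with the largest eigenvalues.
    eigvals, eigvecs = np.linalg.eigh(A)
    U = eigvecs[:, -n_components:]          # top principal directions
    return U, X @ U                         # compression: x -> U^T x

# Example: data lying exactly on a line in R^3 is captured by one component.
rng = np.random.default_rng(0)
t = rng.standard_normal(100)
X = np.outer(t, np.array([1.0, 2.0, 3.0]))  # rank-1 data matrix
U, Z = pca(X, 1)
X_rec = Z @ U.T                             # decompression: reconstruction
```

Because this toy data matrix has rank 1, a single component reconstructs it exactly; on real data the reconstruction error equals the sum of the discarded eigenvalues.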
[Figure 23.1: A set of vectors in R^2 ( ...]
[Figure 23.2: Images of faces extracted from the Yale data set. ...]
23.2 Random Projections
In this section we show that reducing the dimension by using a random linear ...
Then, with probability of at least 1 − δ over a choice of a random matrix W ∈ R^{n,d} such that each element ...
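The random matrix in this guarantee can be sampled directly. A short sketch (the dimensions, seed, and the N(0, 1/n) scaling of the entries are illustrative assumptions) that checks how well the norm of a single vector survives the projection:

```python
import numpy as np

# Each entry of W is an independent N(0, 1/n) variable, so that
# E[||W x||^2] = ||x||^2 for every fixed x in R^d.
rng = np.random.default_rng(42)
d, n = 10_000, 1_000                       # original and reduced dimension
W = rng.standard_normal((n, d)) / np.sqrt(n)

x = rng.standard_normal(d)
ratio = np.linalg.norm(W @ x) / np.linalg.norm(x)
# ||W x||^2 / ||x||^2 is a chi-squared_n variable divided by n, so it
# concentrates around 1 with fluctuations on the order of sqrt(2/n).
```

With n = 1000 the norm is typically distorted by only a few percent, which is the qualitative content of the concentration bound above.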
23.3 Compressed Sensing
Compressed sensing is a technique that simultaneously acquires and compresses the data. The key re ...
Proof  We assume, by way of contradiction, that x̃ ≠ x. Since x satisfies the constraints in the optimization ...
Theorem 23.9  Let U be an arbitrary fixed d×d orthonormal matrix, let ε, δ be scalars in (0,1), let s be an int ...
Combining these two claims with Equation (23.5) we get that ‖h‖_2 ≤ ‖h_{T_{0,1}}‖_2 + ‖h_{T_{0,1}^c}‖_2 ≤ ...
Since ‖h_{T_0}‖_2 + ‖h_{T_1}‖_2 ≤ √2 ‖h_{T_{0,1}}‖_2 we therefore get that ‖W h_{T_{0,1}}‖_2^2 ≤ √2 ‖h_{T_{0,1}} ...
where in the last inequality we used Stirling's approximation. Overall we obtained that |Q| ≤ (2k+1) ...
the condition given in the lemma, the following holds with probability of at least 1 − δ: sup_{v∈Q} | ...
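The ℓ1 relaxation at the heart of compressed sensing — recovering a sparse x from b = Wx by minimizing ‖x‖_1 subject to Wx = b — can be cast as a linear program by splitting x = u − v with u, v ≥ 0. A sketch using SciPy's `linprog` (the sizes, seed, and helper name `basis_pursuit` are assumptions for illustration, not from the text):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(W, b):
    """Solve min ||x||_1 subject to W x = b as a linear program.

    Standard reformulation: x = u - v with u, v >= 0, so that the
    objective sum(u) + sum(v) equals ||x||_1 at the optimum.
    """
    n, d = W.shape
    c = np.ones(2 * d)                      # minimize sum(u) + sum(v)
    A_eq = np.hstack([W, -W])               # enforce W u - W v = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
    u, v = res.x[:d], res.x[d:]
    return u - v

# Illustrative sizes and seed: a 2-sparse signal in R^40 observed
# through n = 15 random Gaussian measurements.
rng = np.random.default_rng(7)
d, n = 40, 15
W = rng.standard_normal((n, d)) / np.sqrt(n)
x0 = np.zeros(d)
x0[3], x0[17] = 1.0, -2.0
b = W @ x0
x_hat = basis_pursuit(W, b)
```

Since x0 itself is feasible, the LP optimum always satisfies ‖x̂‖_1 ≤ ‖x0‖_1; when W satisfies an RIP-type condition as in the theorems above, the minimizer typically coincides with the sparse x0.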
23.4 PCA or Compressed Sensing?
Suppose we would like to apply a dimensionality reduction technique ...
23.6 Bibliographic Remarks
PCA is equivalent to best subspace approximation using singular value ...
if an eigenvalue decomposition of some matrix C is required, verify that this decomposition can be computed ...
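One standard way to keep such an eigenvalue decomposition cheap when d ≫ m is to work with the m×m matrix X Xᵀ instead of the d×d matrix XᵀX: if X Xᵀ v = λ v with λ > 0, then u = Xᵀv/‖Xᵀv‖ is an eigenvector of XᵀX with the same eigenvalue. A sketch under illustrative dimensions and seed:

```python
import numpy as np

# Few examples in a high-dimensional space: eigendecompose the small
# m x m matrix B = X X^T rather than the large d x d matrix X^T X.
rng = np.random.default_rng(1)
m, d = 5, 200
X = rng.standard_normal((m, d))

lams, V = np.linalg.eigh(X @ X.T)          # m x m problem: cheap when m << d
v = V[:, -1]                               # eigenvector of the top eigenvalue
u = X.T @ v                                # lift back to R^d ...
u /= np.linalg.norm(u)                     # ... and normalize
# Check: (X^T X)(X^T v) = X^T (X X^T) v = lam * X^T v, so u is an
# eigenvector of X^T X with the same top eigenvalue.
```

This brings the cost down to that of an m×m eigenproblem plus a matrix-vector product, i.e. it depends on min{m, d} rather than d alone.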