$$\|A^{-1}\| = \sqrt{\lambda_{\max}\!\left[(A^{-1})^{T}A^{-1}\right]} = \sqrt{\frac{28831}{277}} = 10.2021.$$
$$\Rightarrow\; x_2 = \begin{bmatrix} \tfrac{111}{100} \\[3pt] \tfrac{111}{10} \\[3pt] \tfrac{111}{100} \end{bmatrix}
\;\Rightarrow\; \Delta x = x_2 - x_1 = \begin{bmatrix} \tfrac{11}{100} \\[3pt] \tfrac{101}{10} \\[3pt] \tfrac{11}{100} \end{bmatrix}
\;\Rightarrow\; \|\Delta x\| = \frac{\sqrt{1020342}}{100} = 10.1012, \text{ and}$$
$$\|\Delta A\| = \sqrt{\lambda_{\max}\!\left[\Delta A^{T}\Delta A\right]} = \sqrt{\frac{14963}{146}} = 10.1236.$$
Since $x_1 = x_2 - \Delta x = (1,\,1,\,1)^{T}$, we have $\|x_1\| = \sqrt{3}$, and comparing the relative change in the solution with the relative change in $A$ gives
$$\frac{\|\Delta x\|\,/\,\|x_1\|}{\|\Delta A\|\,/\,\|A\|} = \frac{10.1012/\sqrt{3}}{10.1236/100.5099} = 57.9 \;<\; c = \|A\|\,\|A^{-1}\| = 100.5099\,(10.2021) = 1025.412.$$
The relative amplification in this instance is 57.9, whereas the theoretical upper bound is 1025.412.
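The same kind of check can be carried out numerically. The sketch below, a minimal illustration assuming NumPy, uses a stand-in ill-conditioned matrix $A$ and perturbation $\Delta A$ (not the matrices of the example above); it measures the change relative to the perturbed solution $x_2$, for which the bound by $c$ holds exactly.

```python
import numpy as np

# Numerical check of the amplification bound  ||dx|| / ||x2||  <=  c * ||dA|| / ||A||.
# A and dA below are stand-ins chosen for illustration, not the matrices of the example.
A  = np.array([[100.0, 1.0],
               [  1.0, 0.1]])          # assumed ill-conditioned test matrix
dA = np.array([[0.0, 0.0],
               [0.1, 0.0]])            # assumed perturbation of A
b  = A @ np.ones(2)                    # right-hand side chosen so that x1 = (1, 1)

x1 = np.linalg.solve(A, b)             # unperturbed system  A x1 = b
x2 = np.linalg.solve(A + dA, b)        # perturbed system    (A + dA) x2 = b
dx = x2 - x1

norm_A    = np.linalg.norm(A, 2)                  # ||A||    = sigma_max(A)
norm_Ainv = np.linalg.norm(np.linalg.inv(A), 2)   # ||A^-1|| = 1 / sigma_min(A)
c = norm_A * norm_Ainv                            # condition number (same as np.linalg.cond(A, 2))

amplification = (np.linalg.norm(dx) / np.linalg.norm(x2)) / (np.linalg.norm(dA, 2) / norm_A)
print(f"c = {c:.1f},  observed amplification = {amplification:.1f}")   # amplification <= c
```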
Remark 6.1.18 The following are the main guidelines in practice:
$c$ and $\|A\|$ are never computed exactly, only estimated.
$c$ explains why the normal equations $A^{T}Ax = A^{T}b$ are so hard to solve in least squares problems: $c(A^{T}A) = [c(A)]^{2}$, where $c(\cdot)$ is the condition number. The remedy is to use Gram-Schmidt orthogonalization or the singular value decomposition $A = Q_{1}\Sigma Q_{2}^{T}$ (see the sketch below). The entries $\sigma_{i}$ of $\Sigma$ are the singular values of $A$, and $\sigma_{i}^{2}$ are the eigenvalues of $A^{T}A$. Thus $\|A\| = \sigma_{\max}$. Recall that $\|Ax\| = \|Q_{1}\Sigma Q_{2}^{T}x\| = \|\Sigma Q_{2}^{T}x\|$.
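The following sketch, assuming NumPy and an arbitrary tall test matrix (not taken from the text), illustrates the squaring of the condition number and the SVD-based remedy.

```python
import numpy as np

# The normal equations square the condition number, c(A^T A) = c(A)^2,
# so an orthogonalization or SVD approach is preferred for least squares.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3)) @ np.diag([1.0, 1e-3, 1e-6])   # columns of very different scales
b = rng.standard_normal(50)

print("c(A)     =", np.linalg.cond(A))          # condition number of A
print("c(A^T A) =", np.linalg.cond(A.T @ A))    # roughly the square of c(A)

# Least squares via the normal equations (ill-advised when c(A) is large):
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# Least squares via the SVD A = Q1 * Sigma * Q2^T (numpy's lstsq is SVD-based):
x_svd, *_ = np.linalg.lstsq(A, b, rcond=None)

print("difference between the two solutions:", np.linalg.norm(x_normal - x_svd))
```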
6.2 Computation of eigenvalues
There is no best way to compute the eigenvalues of a matrix. But there are some terrible ways. In this section, a method recommended for large sparse matrices, the power method, will be introduced.
Let $u_0$ be an initial guess. Then $u_{k+1} = Au_k = A^{k+1}u_0$. Assume $A$ has a full set of eigenvectors $x_1, x_2, \ldots, x_n$; then $u_k = a_1\lambda_1^{k}x_1 + \cdots + a_n\lambda_n^{k}x_n$. Assume further that $\lambda_1 \le \lambda_2 \le \cdots \le \lambda_{n-1} < \lambda_n$; that is, the last eigenvalue is not repeated.
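A minimal sketch of this iteration, assuming NumPy and an arbitrary symmetric test matrix and stopping tolerance (none of which come from the text):

```python
import numpy as np

# Power method: repeatedly multiply by A and normalize, so that the component
# along the dominant eigenvector takes over as k grows.
def power_method(A, u0, num_iters=1000, tol=1e-10):
    u = u0 / np.linalg.norm(u0)
    lam = 0.0
    for _ in range(num_iters):
        v = A @ u                          # u_{k+1} = A u_k
        lam_new = u @ v                    # Rayleigh-quotient estimate of lambda_n
        u = v / np.linalg.norm(v)          # normalize to avoid overflow/underflow
        if abs(lam_new - lam) < tol:
            break
        lam = lam_new
    return lam_new, u

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])            # assumed symmetric test matrix
lam, x = power_method(A, np.ones(3))
print("dominant eigenvalue ~", lam)        # compare with np.linalg.eigvalsh(A)[-1]
```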