7 Eigensystems
\[
\begin{aligned}
b_{ii} &= a_{ii}, \quad i \neq k,\; i \neq l \\
b_{ik} &= a_{ik}\cos\theta - a_{il}\sin\theta, \quad i \neq k,\; i \neq l \\
b_{il} &= a_{il}\cos\theta + a_{ik}\sin\theta, \quad i \neq k,\; i \neq l \\
b_{kk} &= a_{kk}\cos^2\theta - 2a_{kl}\cos\theta\sin\theta + a_{ll}\sin^2\theta \\
b_{ll} &= a_{ll}\cos^2\theta + 2a_{kl}\cos\theta\sin\theta + a_{kk}\sin^2\theta \\
b_{kl} &= (a_{kk} - a_{ll})\cos\theta\sin\theta + a_{kl}(\cos^2\theta - \sin^2\theta)
\end{aligned}
\]
The angle $\theta$ is arbitrary. The recipe is to choose $\theta$ so that the non-diagonal matrix elements $b_{kl}$ and $b_{lk}$ become zero.
The algorithm is then quite simple. We perform a number of iterations until the sum of the squared non-diagonal matrix elements is less than a prescribed tolerance (ideally zero). The algorithm is more or less foolproof for all real symmetric matrices, but becomes much slower than methods based on tridiagonalization for large matrices.
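This iteration can be sketched in code. The following is a minimal Python sketch, not the book's implementation: the function name, tolerance, and the strategy of zeroing the largest off-diagonal element per rotation are our own choices, and the rotation angle uses the $\tan\theta$ relation derived later in this section.

```python
import math

def jacobi_eigenvalues(a, tol=1.0e-12, max_rotations=1000):
    """Jacobi rotation method for a real symmetric matrix given as a
    list of lists.  Each step zeroes the largest off-diagonal element,
    iterating until the sum of squared off-diagonal elements drops
    below tol.  Returns the eigenvalues in ascending order."""
    n = len(a)
    a = [row[:] for row in a]                       # work on a copy
    for _ in range(max_rotations):
        off2 = sum(a[i][j] ** 2
                   for i in range(n) for j in range(n) if i != j)
        if off2 < tol:
            break
        # pick the off-diagonal element of largest magnitude
        k, l = max(((i, j) for i in range(n) for j in range(i + 1, n)),
                   key=lambda ij: abs(a[ij[0]][ij[1]]))
        # rotation angle: smaller root of t^2 + 2*tau*t - 1 = 0
        tau = (a[l][l] - a[k][k]) / (2.0 * a[k][l])
        if tau >= 0:
            t = 1.0 / (tau + math.sqrt(1.0 + tau * tau))
        else:
            t = -1.0 / (-tau + math.sqrt(1.0 + tau * tau))
        c = 1.0 / math.sqrt(1.0 + t * t)
        s = t * c
        # similarity transformation: the element updates quoted above
        for i in range(n):
            if i != k and i != l:
                aik, ail = a[i][k], a[i][l]
                a[i][k] = a[k][i] = aik * c - ail * s
                a[i][l] = a[l][i] = ail * c + aik * s
        akk, all_, akl = a[k][k], a[l][l], a[k][l]
        a[k][k] = akk * c * c - 2.0 * akl * c * s + all_ * s * s
        a[l][l] = all_ * c * c + 2.0 * akl * c * s + akk * s * s
        a[k][l] = a[l][k] = 0.0
    return sorted(a[i][i] for i in range(n))
```

Choosing the root of smallest magnitude for $t$ keeps $|\theta|\le\pi/4$, which is important for the numerical stability of the method.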
The main idea is thus to reduce systematically the norm of the off-diagonal matrix elements of a matrix $A$,
\[
\mathrm{off}(A) = \sqrt{\sum_{i=1}^{n}\sum_{j=1,\, j\neq i}^{n} a_{ij}^2}.
\]
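This quantity is straightforward to compute; a small Python helper (the name `off_norm` is ours, not from the text) might look like:

```python
import math

def off_norm(a):
    """off(A): square root of the sum of squared off-diagonal
    elements of the square matrix a (given as a list of lists)."""
    n = len(a)
    return math.sqrt(sum(a[i][j] ** 2
                         for i in range(n) for j in range(n) if j != i))
```

For a diagonal matrix `off_norm` returns zero, which is exactly the (ideal) termination criterion of the Jacobi iteration.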
To demonstrate the algorithm, we consider the simple $2\times 2$ similarity transformation of the full matrix. The matrix is symmetric; we single out $1 \le k < l \le n$ and use the abbreviations $c=\cos\theta$ and $s=\sin\theta$ to obtain
\[
\begin{pmatrix} b_{kk} & 0 \\ 0 & b_{ll} \end{pmatrix}
=
\begin{pmatrix} c & -s \\ s & c \end{pmatrix}
\begin{pmatrix} a_{kk} & a_{kl} \\ a_{lk} & a_{ll} \end{pmatrix}
\begin{pmatrix} c & s \\ -s & c \end{pmatrix}.
\]
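This matrix product can be checked numerically against the element formulas for $b_{kk}$, $b_{ll}$ and $b_{kl}$ quoted earlier; the sample angle and matrix entries below are arbitrary choices of ours.

```python
import math

# Arbitrary sample angle and symmetric 2x2 entries (a_lk = a_kl)
theta = 0.3
c, s = math.cos(theta), math.sin(theta)
akk, akl, all_ = 2.0, 1.0, 5.0

# B = S^T A S with S^T = ((c, -s), (s, c)) and S = ((c, s), (-s, c))
m = [[c * akk - s * akl, c * akl - s * all_],     # M = S^T A
     [s * akk + c * akl, s * akl + c * all_]]
bkk = m[0][0] * c - m[0][1] * s                   # B = M S
bkl = m[0][0] * s + m[0][1] * c
bll = m[1][0] * s + m[1][1] * c

# The product reproduces the closed-form matrix elements
assert abs(bkk - (akk*c*c - 2*akl*c*s + all_*s*s)) < 1e-12
assert abs(bll - (all_*c*c + 2*akl*c*s + akk*s*s)) < 1e-12
assert abs(bkl - ((akk - all_)*c*s + akl*(c*c - s*s))) < 1e-12
```

Note that the trace is preserved, $b_{kk}+b_{ll}=a_{kk}+a_{ll}$, as expected for a similarity transformation.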
We require that the non-diagonal matrix elements $b_{kl}=b_{lk}=0$, implying that
\[
a_{kl}(c^2 - s^2) + (a_{kk} - a_{ll})cs = b_{kl} = 0.
\]
If $a_{kl}=0$ one sees immediately that $\cos\theta = 1$ and $\sin\theta = 0$.
The Frobenius norm of a matrix is always preserved under an orthogonal transformation. It is defined as
\[
\|A\|_F = \sqrt{\sum_{i=1}^{n}\sum_{j=1}^{n} |a_{ij}|^2}.
\]
This means that for our $2\times 2$ case we have
\[
2a_{kl}^2 + a_{kk}^2 + a_{ll}^2 = b_{kk}^2 + b_{ll}^2,
\]
which leads to
\[
\mathrm{off}(B)^2 = \|B\|_F^2 - \sum_{i=1}^{n} b_{ii}^2 = \mathrm{off}(A)^2 - 2a_{kl}^2,
\]
since
\[
\|B\|_F^2 - \sum_{i=1}^{n} b_{ii}^2 = \|A\|_F^2 - \sum_{i=1}^{n} a_{ii}^2 + \left(a_{kk}^2 + a_{ll}^2 - b_{kk}^2 - b_{ll}^2\right).
\]
This result means that the matrix $A$ moves closer to diagonal form for each transformation.
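The reduction $\mathrm{off}(B)^2 = \mathrm{off}(A)^2 - 2a_{kl}^2$ can be verified numerically by applying a single rotation; the $3\times 3$ matrix and the choice $k=0$, $l=1$ below are an arbitrary example of ours, and the helper `off2` is not from the text.

```python
import math

def off2(a):
    """Sum of squared off-diagonal elements."""
    n = len(a)
    return sum(a[i][j] ** 2 for i in range(n) for j in range(n) if j != i)

# One Jacobi rotation on a sample 3x3 symmetric matrix, zeroing a_kl
a = [[2.0, 1.0, 0.5],
     [1.0, 3.0, 0.2],
     [0.5, 0.2, 1.0]]
k, l = 0, 1
tau = (a[l][l] - a[k][k]) / (2.0 * a[k][l])
if tau >= 0:
    t = 1.0 / (tau + math.sqrt(1.0 + tau * tau))
else:
    t = -1.0 / (-tau + math.sqrt(1.0 + tau * tau))
c = 1.0 / math.sqrt(1.0 + t * t)
s = t * c

# apply the element updates quoted earlier, keeping b symmetric
b = [row[:] for row in a]
for i in range(3):
    if i != k and i != l:
        b[i][k] = b[k][i] = a[i][k] * c - a[i][l] * s
        b[i][l] = b[l][i] = a[i][l] * c + a[i][k] * s
b[k][k] = a[k][k]*c*c - 2*a[k][l]*c*s + a[l][l]*s*s
b[l][l] = a[l][l]*c*c + 2*a[k][l]*c*s + a[k][k]*s*s
b[k][l] = b[l][k] = 0.0

# off(B)^2 = off(A)^2 - 2 a_kl^2, up to rounding
assert abs(off2(b) - (off2(a) - 2.0 * a[k][l] ** 2)) < 1e-10
```

Since the decrease per rotation is $2a_{kl}^2$, zeroing the largest off-diagonal element gives the fastest guaranteed reduction of $\mathrm{off}(A)$.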
Defining the quantities $\tan\theta = t = s/c$ and
\[
\tau = \frac{a_{ll} - a_{kk}}{2a_{kl}},
\]
we obtain the quadratic equation
\[
t^2 + 2\tau t - 1 = 0,
\]