
4.3 Diagonal Form of a Matrix

Proposition 4.3.1 Eigenvectors associated with distinct eigenvalues form a linearly independent set.
Proof. Let $Av_i = \lambda_i v_i$, $i = 1, \dots, k$.
Consider $\sum_{i=1}^{k} \alpha_i v_i = \theta$. Multiply from the left by $\prod_{i=2}^{k}(A - \lambda_i I)$.
Since $(A - \lambda_i I)v_i = \theta$ and $(A - \lambda_i I)v_j = (\lambda_j - \lambda_i)v_j$ for $j \neq i$, this yields

$$\alpha_1(\lambda_1 - \lambda_2)(\lambda_1 - \lambda_3) \cdots (\lambda_1 - \lambda_k)\, v_1 = \theta.$$

Since $v_1 \neq \theta$ and $\lambda_1 - \lambda_2 \neq 0, \dots, \lambda_1 - \lambda_k \neq 0$, we get $\alpha_1 = 0$. Then, we have $\sum_{i=2}^{k} \alpha_i v_i = \theta$.
Repeat by multiplying by $\prod_{i=3}^{k}(A - \lambda_i I)$ to get $\alpha_2 = 0$, and so on. $\square$
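
As a quick numerical check of Proposition 4.3.1 (an illustration, not part of the text), the following Python sketch builds a small matrix with distinct eigenvalues and verifies that the matrix of eigenvectors has full rank; the particular matrix, the use of NumPy, and the rounding tolerance are assumptions made for this example.

```python
import numpy as np

# Hypothetical 3x3 upper-triangular matrix with distinct eigenvalues 1, 2, 3.
A = np.array([[1.0, 1.0, 1.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 3.0]])

eigvals, V = np.linalg.eig(A)   # columns of V are eigenvectors v_1, v_2, v_3

# Distinct eigenvalues => {v_1, v_2, v_3} is linearly independent,
# i.e., the matrix [v_1 | v_2 | v_3] has full rank.
assert len(np.unique(np.round(eigvals, 8))) == 3   # eigenvalues are distinct
assert np.linalg.matrix_rank(V) == 3               # eigenvectors are independent
```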

4.3.1 All Distinct Eigenvalues

$d(s) = \prod_{i=1}^{n}(s - \lambda_i)$. The $n$ eigenvectors $v_1, \dots, v_n$ form a linearly independent set. Choose them as a basis: $\{v_i\}_{i=1}^{n}$.

$$Av_1 = \lambda_1 v_1 + 0 v_2 + \cdots + 0 v_n$$
$$Av_2 = 0 v_1 + \lambda_2 v_2 + \cdots + 0 v_n$$
$$\vdots$$
$$Av_n = 0 v_1 + 0 v_2 + \cdots + \lambda_n v_n$$

Thus, with respect to this basis, $A$ has the representation

$$\Lambda = \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix}.$$
Alternatively, let $S = [v_1 \,|\, v_2 \,|\, \cdots \,|\, v_n]$. Then

$$AS = [Av_1 \,|\, Av_2 \,|\, \cdots \,|\, Av_n] = [\lambda_1 v_1 \,|\, \lambda_2 v_2 \,|\, \cdots \,|\, \lambda_n v_n],$$

$$AS = [v_1 \,|\, v_2 \,|\, \cdots \,|\, v_n] \begin{bmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_n \end{bmatrix} = S\Lambda.$$

Thus, $S^{-1}AS = \Lambda$ (change of basis). Hence, we have proven the following theorem.


Theorem 4.3.2 Suppose the $n \times n$ matrix $A$ has $n$ linearly independent eigenvectors. If these vectors are the columns of a matrix $S$, then

$$S^{-1}AS = \Lambda = \begin{bmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_n \end{bmatrix}.$$
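
Below is a minimal Python sketch (an illustration, not part of the text) of Theorem 4.3.2: it forms $S$ from the eigenvectors of a hypothetical $2 \times 2$ matrix with distinct eigenvalues and checks numerically that $S^{-1}AS = \Lambda$; the particular matrix and the use of NumPy are assumptions.

```python
import numpy as np

# Hypothetical 2x2 example: A has distinct eigenvalues 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, S = np.linalg.eig(A)   # S = [v_1 | v_2], eigenvectors as columns
Lam = np.diag(eigvals)          # Lambda: diagonal matrix of eigenvalues

# Theorem 4.3.2: S^{-1} A S = Lambda when the n eigenvectors are independent.
assert np.allclose(np.linalg.inv(S) @ A @ S, Lam)
```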