Fundamentals of Matrix Algebra


Linear Independence and Rank


Consider an n × m matrix A. A set of p columns extracted from the matrix A,

\[
\begin{bmatrix}
a_{1,i_1} & \cdots & a_{1,i_p} \\
\vdots & & \vdots \\
a_{n,i_1} & \cdots & a_{n,i_p}
\end{bmatrix}
\]

are said to be linearly independent if it is not possible to find p constants β_s, s = 1, ..., p, not all equal to zero, such that the following n equations are simultaneously satisfied:


\[
\begin{gathered}
\beta_1 a_{1,i_1} + \cdots + \beta_p a_{1,i_p} = 0 \\
\vdots \\
\beta_1 a_{n,i_1} + \cdots + \beta_p a_{n,i_p} = 0
\end{gathered}
\]

Analogously, a set of q rows extracted from the matrix A are said to be linearly independent if it is not possible to find q constants λ_s, s = 1, ..., q, not all equal to zero, such that the following m equations are simultaneously satisfied:


\[
\begin{gathered}
\lambda_1 a_{i_1,1} + \cdots + \lambda_q a_{i_q,1} = 0 \\
\vdots \\
\lambda_1 a_{i_1,m} + \cdots + \lambda_q a_{i_q,m} = 0
\end{gathered}
\]
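To make these definitions concrete, here is a minimal numerical sketch using NumPy. A chosen set of columns (or rows) is linearly independent exactly when the submatrix they form has rank equal to their number, so that the homogeneous system above admits only the trivial solution. The example matrix A, the index sets, and the helper names columns_independent and rows_independent are invented for this illustration; they are not part of the text.

```python
import numpy as np

def columns_independent(A, col_idx):
    """True if the columns of A indexed by col_idx are linearly independent,
    i.e. beta_1 a_{.,i_1} + ... + beta_p a_{.,i_p} = 0 only when all beta_s = 0."""
    sub = A[:, col_idx]                      # n x p submatrix of the selected columns
    return np.linalg.matrix_rank(sub) == len(col_idx)

def rows_independent(A, row_idx):
    """True if the rows of A indexed by row_idx are linearly independent."""
    sub = A[row_idx, :]                      # q x m submatrix of the selected rows
    return np.linalg.matrix_rank(sub) == len(row_idx)

# Hypothetical 3 x 4 matrix: its third column equals the sum of the first two,
# so columns (0, 1, 2) are dependent while columns (0, 1, 3) are independent.
A = np.array([[1., 0., 1., 2.],
              [0., 1., 1., 5.],
              [2., 3., 5., 1.]])

print(columns_independent(A, [0, 1, 2]))   # False
print(columns_independent(A, [0, 1, 3]))   # True
print(rows_independent(A, [0, 1, 2]))      # True: the three rows are independent
```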

It can be demonstrated that in any matrix the number p of linearly independent columns is the same as the number q of linearly independent rows. This number is equal, in turn, to the rank r of the matrix. Recall that an n × m matrix A is said to be of rank r if at least one of its (square) r-minors is different from zero while all (r + 1)-minors, if any, are zero. We can therefore give an alternative definition of the rank of a matrix: given an n × m matrix A, its rank, denoted rank(A), is the number r of linearly independent rows or columns, since the row rank is always equal to the column rank.
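Both characterizations of rank can be checked numerically. The sketch below (the example matrix and the helper rank_via_minors are hypothetical, introduced only for illustration) compares NumPy's matrix_rank, applied to A and to its transpose to confirm that row rank equals column rank, with a brute-force search for the largest order of a nonzero minor, as in the definition above.

```python
import numpy as np
from itertools import combinations

def rank_via_minors(A):
    """Rank as the largest r for which some r x r minor (determinant of an
    r x r submatrix) is nonzero -- the minor-based definition in the text."""
    n, m = A.shape
    for r in range(min(n, m), 0, -1):
        for rows in combinations(range(n), r):
            for cols in combinations(range(m), r):
                if abs(np.linalg.det(A[np.ix_(rows, cols)])) > 1e-10:
                    return r
    return 0

# Hypothetical 3 x 4 matrix of rank 2: its third row equals row 0 + row 1.
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 3.],
              [1., 3., 1., 4.]])

print(np.linalg.matrix_rank(A))      # 2  (number of independent rows/columns)
print(np.linalg.matrix_rank(A.T))    # 2  (row rank equals column rank)
print(rank_via_minors(A))            # 2  (some 2-minor is nonzero, all 3-minors vanish)
```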


Vector and Matrix Operations


Let’s now introduce the most common operations performed on vectors and matrices. An operation is a mapping that operates on scalars,
