an operator f, then consider the determinant of $M = I + \epsilon f$, where $I$ is the identity. This combination
is very close to the identity if $\epsilon$ is small enough, so its determinant is very close to one. How close?
The first order in $\epsilon$ is called the trace of $f$, or more formally

$$\operatorname{Tr}(f) = \frac{d}{d\epsilon}\det\bigl(I + \epsilon f\bigr)\bigg|_{\epsilon=0} \qquad (7.46)$$
Express this in components for a two dimensional case, and

$$(f) = \begin{pmatrix} a & b \\ c & d \end{pmatrix}
\quad\Longrightarrow\quad
\det\bigl(I + \epsilon f\bigr) = \det\begin{pmatrix} 1 + \epsilon a & \epsilon b \\ \epsilon c & 1 + \epsilon d \end{pmatrix}
= (1 + \epsilon a)(1 + \epsilon d) - \epsilon^2 bc \qquad (7.47)$$
The first-order coefficient of $\epsilon$ is $a + d$, the sum of the diagonal elements of the matrix. This is the
form of the result in any dimension, and the proof involves carefully looking at the method of Gauss
elimination for the determinant, remembering at every step that you're looking for only the first-order
term in $\epsilon$. See problem 7.53.
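A quick numerical check of Eq. (7.46) is easy to set up. The following is a minimal sketch of my own (it assumes Python with numpy, neither of which appears in the text): differentiate $\det(I + \epsilon f)$ at $\epsilon = 0$ by a central difference and compare the result to the sum of the diagonal elements.

```python
import numpy as np

# Sketch: verify that d/d(eps) det(I + eps*f) at eps = 0 equals the trace of f.
f = np.array([[1.0, 2.0],
              [3.0, 4.0]])
I = np.eye(2)

eps = 1e-6  # small step for a central-difference derivative
deriv = (np.linalg.det(I + eps * f) - np.linalg.det(I - eps * f)) / (2 * eps)

print(deriv)        # ~5.0
print(np.trace(f))  # 5.0, the sum of the diagonal elements a + d
```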
7.8 Matrices as Operators
There’s an important example of a vector space that I’ve avoided mentioning up to now. Example 5
in section 6.3 is the set of $n$-tuples of numbers: $(a_1, a_2, \ldots, a_n)$. I can turn this on its side, call it a
column matrix, and it forms a perfectly good vector space. The functions (operators) on this vector
space are the matrices themselves.
When you have a system of linear equations, you can translate this into the language of vectors.
$$ax + by = e \quad\text{and}\quad cx + dy = f
\quad\longrightarrow\quad
\begin{pmatrix} a & b \\ c & d \end{pmatrix}
\begin{pmatrix} x \\ y \end{pmatrix} =
\begin{pmatrix} e \\ f \end{pmatrix}$$
Solving for $x$ and $y$ is inverting a matrix.
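As a concrete illustration, here is a sketch of my own (the coefficients are made up, and Python/numpy is an assumption, not something the text uses) of turning a $2\times2$ system into a matrix inversion.

```python
import numpy as np

# Hypothetical system:  2x + 1y = 5   and   1x + 3y = 10
A   = np.array([[2.0, 1.0],
                [1.0, 3.0]])
rhs = np.array([5.0, 10.0])

# Invert the matrix explicitly, as the text describes...
print(np.linalg.inv(A) @ rhs)    # [1. 3.]  ->  x = 1, y = 3

# ...though numerically one usually lets the library solve the system directly.
print(np.linalg.solve(A, rhs))   # same answer
```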
There’s an aspect of this that may strike you as odd. This matrix is an operator on the vector
space of column matrices. What are the components of this operator? What? Isn’t the matrix a set of
components already? That depends on your choice of basis. Take an example
$$M = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}
\qquad\text{with basis}\qquad
\vec e_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad
\vec e_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$$
Compute the components as usual.
$$M\vec e_1 = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}
\begin{pmatrix} 1 \\ 0 \end{pmatrix} =
\begin{pmatrix} 1 \\ 3 \end{pmatrix} = 1\,\vec e_1 + 3\,\vec e_2$$
This says that the first column of the components of $M$ in this basis is $\begin{pmatrix} 1 \\ 3 \end{pmatrix}$. What else would you expect?
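The same recipe works for any basis, and a short sketch may make it concrete (my own, assuming Python/numpy; the function name is invented for illustration): put the basis vectors into the columns of a matrix $E$, apply $M$ to each, and solve for the expansion coefficients, which amounts to forming $E^{-1} M E$.

```python
import numpy as np

# Sketch: components of the operator M in the basis given by the columns of E.
# M applied to each basis vector, re-expanded in that same basis, satisfies
# E @ C = M @ E, so the component matrix is C = E^{-1} M E.
def components_in_basis(M, E):
    return np.linalg.solve(E, M @ E)

M = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# In the standard basis the components reproduce the matrix itself.
print(components_in_basis(M, np.eye(2)))   # [[1. 2.] [3. 4.]]
```

Only the columns of $E$ change when a different basis is chosen, as in the example that follows.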
Now select a different basis.

$$\vec e_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \qquad
\vec e_2 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$$
Again compute the components.