Principles of Mathematics in Operations Research

14.2 Laplace Transforms

If we relax the homogeneous equation $y'(t) = ay(t)$ to $y'(t) = ay(t) + f(t)$, then we have
\[
\eta(s) = \frac{y_0 + \phi(s)}{s - a}
\]
and
\[
y(t) = e^{at} y_0 + \int_0^t e^{a(t-u)} f(u)\,du,
\]
where $\phi(s)$ is the Laplace transform of $f(t)$.
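As a quick numerical check (a sketch, not part of the text), the formula above can be compared with the known closed-form solution of the scalar problem for a constant forcing term $f(t) = 1$; the values of $a$, $y_0$, and $t$ below are illustrative:

```python
import math

# Verify y(t) = e^{at} y0 + ∫_0^t e^{a(t-u)} f(u) du for the scalar ODE
# y'(t) = a y(t) + f(t), y(0) = y0, with f(t) = 1 (illustrative choice).
# For f ≡ 1 the exact solution is y(t) = e^{at} y0 + (e^{at} - 1)/a.

def laplace_formula_solution(a, y0, t, n=100000):
    """Evaluate e^{at} y0 + ∫_0^t e^{a(t-u)} · 1 du by the trapezoid rule."""
    h = t / n
    integral = 0.0
    for k in range(n + 1):
        u = k * h
        w = 0.5 if k in (0, n) else 1.0
        integral += w * math.exp(a * (t - u))
    integral *= h
    return math.exp(a * t) * y0 + integral

a, y0, t = 0.5, 2.0, 1.5
exact = math.exp(a * t) * y0 + (math.exp(a * t) - 1.0) / a
print(abs(laplace_formula_solution(a, y0, t) - exact))  # tiny quadrature error
```

The agreement confirms that the convolution term $\int_0^t e^{a(t-u)} f(u)\,du$ is exactly the particular solution contributed by the forcing term.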
Remark 14.2.7 In order to solve the matrix equation
\[
y'(t) = Ay(t) + f(t), \qquad y(0) = y_0,
\]
we take the Laplace transform, which gives
\[
(sI - A)\,\eta(s) = y_0 + \phi(s),
\]
where $\eta(s) = [\eta_1(s), \dots, \eta_n(s)]^T$ is the vector of Laplace transforms of the components of $y$, and $\phi(s)$ is that of $f$. If $s$ is not an eigenvalue of $A$, then the coefficient matrix is nonsingular. Thus, for sufficiently large $s$,
\[
\eta(s) = (sI - A)^{-1} y_0 + (sI - A)^{-1} \phi(s),
\]
where the matrix $(sI - A)^{-1}$ is called the resolvent matrix of $A$, and $\mathcal{L}(e^{tA}) = (sI - A)^{-1}$, as is seen by taking $f(t) = 0$.
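The identity $\mathcal{L}(e^{tA}) = (sI - A)^{-1}$ can be checked by hand for a $2 \times 2$ case. For $A = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$ one has $e^{tA} = \begin{bmatrix} \cosh t & \sinh t \\ \sinh t & \cosh t \end{bmatrix}$, whose entrywise Laplace transforms are $s/(s^2-1)$ and $1/(s^2-1)$; these should match the resolvent. A minimal sketch (the test value of $s$ is an arbitrary choice above the spectral radius):

```python
# Check L(e^{tA}) = (sI - A)^{-1} for A = [[0, 1], [1, 0]].
# Here e^{tA} = [[cosh t, sinh t], [sinh t, cosh t]], whose entrywise
# Laplace transforms are s/(s^2 - 1) and 1/(s^2 - 1) (valid for s > 1).

def resolvent_2x2(a, s):
    """(sI - A)^{-1} for a 2x2 matrix A, via the adjugate formula."""
    m = [[s - a[0][0], -a[0][1]], [-a[1][0], s - a[1][1]]]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [[m[1][1] / det, -m[0][1] / det],
            [-m[1][0] / det, m[0][0] / det]]

A = [[0.0, 1.0], [1.0, 0.0]]
s = 3.0  # any s larger than the spectral radius of A works
R = resolvent_2x2(A, s)
expected = [[s / (s**2 - 1), 1 / (s**2 - 1)],
            [1 / (s**2 - 1), s / (s**2 - 1)]]
print(R)         # [[0.375, 0.125], [0.125, 0.375]]
print(expected)  # [[0.375, 0.125], [0.125, 0.375]]
```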
Example 14.2.8 Let us take matrix exponentials as an example problem. The problem of finding $e^{tA}$ for an arbitrary square matrix $A$ of order $n$ can be solved by finding the Jordan form. For $n > 3$, one should use a computer. However, we will show how the Cayley-Hamilton Theorem leads to another method for finding $e^{tA}$ when $n = 2$. Let us take the following system of equations:
\[
y_1'(t) = y_2(t) + 1, \qquad y_1(0) = 3,
\]
\[
y_2'(t) = y_1(t) + t, \qquad y_2(0) = 1.
\]
Then,
\[
A = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}, \qquad
f(t) = \begin{bmatrix} 1 \\ t \end{bmatrix}, \qquad
y_0 = \begin{bmatrix} 3 \\ 1 \end{bmatrix},
\]
and $A$ is diagonalized by
\[
S = \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}, \qquad
S^{-1} = \frac{1}{2}\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}, \qquad
S^{-1} A S = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}.
\]
Hence,
\[
e^{tA} = S \begin{bmatrix} e^t & 0 \\ 0 & e^{-t} \end{bmatrix} S^{-1}
= \frac{1}{2}\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}
\begin{bmatrix} e^t & 0 \\ 0 & e^{-t} \end{bmatrix}
\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}
= \frac{1}{2}\begin{bmatrix} e^t + e^{-t} & e^t - e^{-t} \\ e^t - e^{-t} & e^t + e^{-t} \end{bmatrix}.
\]
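As a sanity check (not part of the text), the diagonalization result for this $A$, namely $e^{tA}$ with entries $\frac{1}{2}(e^t \pm e^{-t}) = \cosh t,\ \sinh t$, can be compared against a truncated power series $\sum_k (tA)^k/k!$; the truncation order and the value of $t$ are arbitrary choices:

```python
import math

# Compare e^{tA} from the diagonalization (for A = [[0, 1], [1, 0]],
# the entries are cosh t and sinh t) with a truncated power series.

def matmul(x, y):
    """Product of two 2x2 matrices stored as nested lists."""
    return [[sum(x[i][k] * y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def expm_diag(t):
    """(1/2) [[e^t + e^-t, e^t - e^-t], [e^t - e^-t, e^t + e^-t]]."""
    c, s = math.cosh(t), math.sinh(t)
    return [[c, s], [s, c]]

def expm_series(a, t, terms=30):
    """Truncated power series sum_{k<terms} (tA)^k / k!."""
    result = [[1.0, 0.0], [0.0, 1.0]]  # running sum, starts at I
    term = [[1.0, 0.0], [0.0, 1.0]]    # current term (tA)^k / k!
    ta = [[t * a[i][j] for j in range(2)] for i in range(2)]
    for k in range(1, terms):
        term = matmul(term, ta)
        term = [[term[i][j] / k for j in range(2)] for i in range(2)]
        result = [[result[i][j] + term[i][j] for j in range(2)]
                  for i in range(2)]
    return result

A = [[0.0, 1.0], [1.0, 0.0]]
t = 0.7
print(expm_diag(t))
print(expm_series(A, t))  # agrees to roundoff
```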