Once again, the property of observability is a black-and-white issue: a
system either is or is not observable. A system that is observable can provide
the necessary information about the plant variables as described above. However,
the observed plant variables may not be sufficient to reconstruct the entire
plant dynamics.
There is a "duality" result that connects these two concepts. Namely, a
linear system whose state model is of the form
\[
\begin{aligned}
\dot{x}(t) &= A(t)\,x(t) + B(t)\,u(t) \\
y(t) &= C(t)\,x(t)
\end{aligned}
\]
where $A$, $B$, and $C$ are matrices of appropriate sizes, is completely
controllable if and only if the "dual" system
\[
\begin{aligned}
\dot{x}(t) &= -A^{T}(t)\,x(t) + C^{T}(t)\,u(t) \\
y(t) &= B^{T}(t)\,x(t)
\end{aligned}
\]
is completely observable. This result is related to the fact that a matrix and its
transpose have the same rank.
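For constant $A$, $B$, and $C$, both properties reduce to rank tests: the system is controllable when the controllability matrix $[B \;\; AB \;\; \cdots \;\; A^{n-1}B]$ has full rank $n$, and observable when the corresponding observability matrix does. The following Python/NumPy sketch (our illustration; the matrices chosen are hypothetical values, not from the text) checks the duality numerically by comparing the controllability test of the original system with the observability test of the dual pair $(-A^{T}, B^{T})$.

import numpy as np

def ctrb(A, B):
    # Controllability matrix [B, AB, ..., A^(n-1) B]
    n = A.shape[0]
    return np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])

def obsv(A, C):
    # Observability matrix [C; CA; ...; C A^(n-1)]
    n = A.shape[0]
    return np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(n)])

# Hypothetical time-invariant system (illustrative values only)
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])
n = A.shape[0]

controllable = np.linalg.matrix_rank(ctrb(A, B)) == n

# Dual system: x'(t) = -A^T x(t) + C^T u(t),  y(t) = B^T x(t);
# its observability is tested on the pair (-A^T, B^T).
dual_observable = np.linalg.matrix_rank(obsv(-A.T, B.T)) == n

print(controllable, dual_observable)  # the two answers always agree

The agreement is exactly the matrix-rank fact mentioned above: the observability matrix of the dual pair is the transpose of $[B \;\; -AB \;\; A^{2}B \;\; \cdots]$, and neither the transpose nor the sign changes alter the rank.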


2.4 Stability


Stability analysis of a system to be controlled is the first task in control design.
In a general descriptive way, we can think of stability as the capacity of an object
to return to its original position, or to equilibrium, after having been displaced.
There are two situations for stability: (1) the plant itself is stable (before the
addition of a controller), and (2) the closed-loop control system is stable. All
controlled systems must be designed to be stable regardless of the stability or
instability of the plant. Controllers must be able to handle disturbances that
are not characterized by the model of the system, such as a gust of wind acting
on a car set to travel at a constant velocity, wind shear acting on a plane in
flight, and so on. This is illustrated in Figure 2.10.
The notion of stability can be viewed as a property of a system that is
continuously in motion about some equilibrium point. The stability of a
dynamical system described by a differential equation of the form
\[
\dot{x} = \frac{dx(t)}{dt} = f(x, t) \qquad (2.30)
\]
is referred to as stability about an equilibrium point. A point $a$ is
called an equilibrium point of Equation 2.30 if $f(a, t) = 0$ for all $t$.
By the change of variables $y = x - a$, the equilibrium point $a$ can be
transferred to the origin, so we may assume that $a = 0$. Thus, we will
always refer to stability at the point $0$.
When 0 is an equilibrium state of the system, the system will remain at 0
if started from there. In other words, if $x(t_0) = 0$, then $x(t) = 0$ for all $t \geq t_0$.
This is the intuitive idea of an equilibrium state.
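As a simple illustration (an example supplied here, not taken from the text), consider the scalar system
\[
\dot{x} = f(x) = -x + 1 .
\]
The point $a = 1$ is an equilibrium point since $f(1) = 0$. With the change of variables $y = x - 1$,
\[
\dot{y} = \dot{x} = -x + 1 = -(y + 1) + 1 = -y ,
\]
so the equilibrium is transferred to the origin: if $y(t_0) = 0$, then $y(t) = 0$ for all $t \geq t_0$.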