7.9 Generalized Reduced Gradient Method
By adding a nonnegative slack variable to each of the inequality constraints in
Eq. (7.80), the problem can be stated as
$$\text{Minimize } f(\mathbf{X}) \tag{7.83}$$

subject to

$$g_j(\mathbf{X}) + x_{n+j} = 0, \quad j = 1, 2, \ldots, m \tag{7.84}$$

$$h_k(\mathbf{X}) = 0, \quad k = 1, 2, \ldots, l \tag{7.85}$$

$$x_i^{(l)} \le x_i \le x_i^{(u)}, \quad i = 1, 2, \ldots, n \tag{7.86}$$

$$x_{n+j} \ge 0, \quad j = 1, 2, \ldots, m \tag{7.87}$$
with $n + m$ variables $(x_1, x_2, \ldots, x_n, x_{n+1}, \ldots, x_{n+m})$. The problem can be rewritten in a general form as
$$\text{Minimize } f(\mathbf{X}) \tag{7.88}$$

subject to

$$g_j(\mathbf{X}) = 0, \quad j = 1, 2, \ldots, m + l \tag{7.89}$$

$$x_i^{(l)} \le x_i \le x_i^{(u)}, \quad i = 1, 2, \ldots, n + m \tag{7.90}$$
where the lower and upper bounds on the slack variables $x_i$, $i = n + 1, n + 2, \ldots, n + m$, are taken as 0 and a large number (infinity), respectively.
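To make the reformulation concrete, the following is a minimal sketch (not from the text) of how a problem with inequality and equality constraints can be assembled into the general form of Eqs. (7.88)–(7.90). The helper name `to_general_form`, the example constraints, and the use of NumPy are illustrative assumptions, not part of the GRG method itself.

```python
# Sketch: append a nonnegative slack variable to each inequality constraint
# g_j(X) <= 0 so that all constraints become equalities, Eqs. (7.84)-(7.85),
# collected into a single vector function g(X) = 0 of size m + l, Eq. (7.89).
import numpy as np

def to_general_form(g_ineq, h_eq, lower, upper, m, l):
    """Return the combined equality constraints and extended variable bounds."""
    n = len(lower)

    def g_all(x_ext):                       # x_ext has n + m components
        x, slacks = x_ext[:n], x_ext[n:]
        eq_from_ineq = g_ineq(x) + slacks   # g_j(X) + x_{n+j} = 0, Eq. (7.84)
        return np.concatenate([eq_from_ineq, h_eq(x)])   # plus h_k(X) = 0

    # Slack bounds: 0 below and a large number (infinity) above, Eq. (7.90).
    lower_ext = np.concatenate([lower, np.zeros(m)])
    upper_ext = np.concatenate([upper, np.full(m, np.inf)])
    return g_all, lower_ext, upper_ext

# Illustrative example with n = 2, m = 1 inequality, l = 1 equality:
g, lb, ub = to_general_form(
    g_ineq=lambda x: np.array([x[0] + x[1] - 2.0]),   # x1 + x2 - 2 <= 0
    h_eq=lambda x: np.array([x[0] - x[1]]),           # x1 - x2 = 0
    lower=np.zeros(2), upper=np.full(2, 10.0), m=1, l=1)
print(g(np.array([1.0, 1.0, 0.0])), lb, ub)           # all constraints equal zero here
```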
The GRG method is based on the idea of elimination of variables using the equality constraints (see Section 2.4.1). Thus, theoretically, one variable can be eliminated from the set $x_i$ $(i = 1, 2, \ldots, n + m)$ for each of the $m + l$ equality constraints given by Eqs. (7.84) and (7.85). It is convenient to divide the $n + m$ design variables arbitrarily into two sets as
$$\mathbf{X} = \begin{Bmatrix} \mathbf{Y} \\ \mathbf{Z} \end{Bmatrix} \tag{7.91}$$

$$\mathbf{Y} = \begin{Bmatrix} y_1 \\ y_2 \\ \vdots \\ y_{n-l} \end{Bmatrix} = \text{design or independent variables} \tag{7.92}$$

$$\mathbf{Z} = \begin{Bmatrix} z_1 \\ z_2 \\ \vdots \\ z_{m+l} \end{Bmatrix} = \text{state or dependent variables} \tag{7.93}$$
where the design variables are completely independent and the state variables depend on the design variables, being used to satisfy the constraints $g_j(\mathbf{X}) = 0$, $j = 1, 2, \ldots, m + l$.
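The partition of Eqs. (7.91)–(7.93) can also be illustrated with a short sketch (not from the text). The function name `partition_variables` and the particular index choice are hypothetical; in practice the state variables are chosen so that the constraints can actually be solved for $\mathbf{Z}$ in terms of $\mathbf{Y}$.

```python
# Sketch: split the n + m variables of Eq. (7.91) into n - l independent
# (design) variables Y and m + l dependent (state) variables Z.
import numpy as np

def partition_variables(x_ext, state_idx):
    """Split the extended variable vector into (Y, Z) as in Eqs. (7.92)-(7.93)."""
    state_idx = np.asarray(state_idx)
    design_idx = np.setdiff1d(np.arange(x_ext.size), state_idx)  # indices of Y
    return x_ext[design_idx], x_ext[state_idx], design_idx, state_idx

# Illustrative example with n = 3 original variables, m = 1 slack, l = 1 equality:
# n + m = 4 variables in total, of which m + l = 2 are treated as state variables Z
# and n - l = 2 remain as design variables Y.
x_ext = np.array([1.0, 2.0, 0.5, 0.0])
Y, Z, y_idx, z_idx = partition_variables(x_ext, state_idx=[2, 3])
print(Y, Z)   # Y = [1. 2.], Z = [0.5 0. ]
```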