Engineering Optimization: Theory and Practice, Fourth Edition


104 Classical Optimization Techniques


4. x1 + x2 − 100 = 0, x1 + x2 + x3 − 150 = 0: The solution of these equations
yields

x1 = 50, x2 = 50, x3 = 50

This solution can be seen to satisfy all the constraint Eqs. (E7) to (E9). The
values of λ1, λ2, and λ3 corresponding to this solution can be obtained from
Eqs. (E15) as

λ1 = −20, λ2 = −20, λ3 = −100

Since these values of λj satisfy the requirements [Eqs. (E10) to (E12)], this
solution can be identified as the optimum solution. Thus

x1∗ = 50, x2∗ = 50, x3∗ = 50
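As a quick numerical check (a sketch only — the objective function and the full constraint set, Eqs. (E7) to (E15), are not reproduced in this excerpt), the two equality constraints of this case can be verified at the stated solution:

```python
# Verify that x1 = x2 = x3 = 50 satisfies the two equality constraints
# of this case: x1 + x2 - 100 = 0 and x1 + x2 + x3 - 150 = 0.
x1 = x2 = x3 = 50.0

residuals = [x1 + x2 - 100.0,
             x1 + x2 + x3 - 150.0]
print(residuals)  # -> [0.0, 0.0]
```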

2.6 Convex Programming Problem


The optimization problem stated in Eq. (2.58) is called a convex programming problem
if the objective function f(X) and the constraint functions gj(X) are convex. The
definition and properties of a convex function are given in Appendix A. Suppose that
f(X) and gj(X), j = 1, 2, ..., m, are convex functions. The Lagrange function of
Eq. (2.61) can be written as

L(X, Y, λ) = f(X) + Σ(j=1 to m) λj [gj(X) + yj^2]          (2.78)

If λj ≥ 0, then λj gj(X) is convex, and since λj yj = 0 from Eq. (2.64), L(X, Y, λ)
will be a convex function. As shown earlier, a necessary condition for f(X) to be a
relative minimum at X∗ is that L(X, Y, λ) have a stationary point at X∗. However, if
L(X, Y, λ) is a convex function, its derivative vanishes only at one point, which must
be an absolute minimum of the function f(X). Thus the Kuhn–Tucker conditions are
both necessary and sufficient for an absolute minimum of f(X) at X∗.
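The sufficiency result can be illustrated on a small convex problem (a made-up example, not one from the text): minimize f(x) = x1^2 + x2^2 subject to g(x) = 1 − x1 − x2 ≤ 0. A sketch that checks the Kuhn–Tucker conditions at the candidate point and then confirms by brute force that it is the absolute minimum:

```python
# Made-up convex problem: minimize f = x1^2 + x2^2 subject to
# g = 1 - x1 - x2 <= 0.  Both f and g are convex, so by the result
# above any Kuhn-Tucker point is the absolute minimum.
import itertools

def f(x1, x2):
    return x1**2 + x2**2

def g(x1, x2):
    return 1.0 - x1 - x2

# Candidate point: stationarity of L = f + lam*g gives 2*x1 - lam = 0
# and 2*x2 - lam = 0; with the constraint active, x1 = x2 = 0.5, lam = 1.
x1 = x2 = 0.5
lam = 1.0

stationarity    = abs(2*x1 - lam) < 1e-12 and abs(2*x2 - lam) < 1e-12
feasibility     = g(x1, x2) <= 1e-12
complementarity = abs(lam * g(x1, x2)) < 1e-12   # lam * g = 0
nonnegativity   = lam >= 0.0
print(stationarity, feasibility, complementarity, nonnegativity)
# -> True True True True

# Brute-force confirmation: no feasible grid point does better.
grid = [i / 100 for i in range(-200, 201)]
best = min(f(a, b) for a, b in itertools.product(grid, repeat=2)
           if g(a, b) <= 0)
print(abs(best - f(x1, x2)) < 1e-9)  # -> True
```

Because the problem is convex, the single Kuhn–Tucker point found analytically coincides with the global minimizer located by the grid search.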

Notes:

1. If the given optimization problem is known to be a convex programming problem,
there will be no relative minima or saddle points, and hence the extreme
point found by applying the Kuhn–Tucker conditions is guaranteed to be an
absolute minimum of f(X). However, it is often very difficult to ascertain
whether the objective and constraint functions involved in a practical engineering
problem are convex.
2. The derivation of the Kuhn–Tucker conditions was based on the development
given for equality constraints in Section 2.4. One of the requirements for these
conditions was that at least one of the Jacobians composed of the m constraints
and m of the n + m variables (x1, x2, ..., xn; y1, y2, ..., ym) be nonzero. This
requirement is implied in the derivation of the Kuhn–Tucker conditions.
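The Jacobian requirement in note 2 can be illustrated numerically (a hypothetical two-constraint system of my own, not the book's example): form the constraints with slack variables as in Eq. (2.78), pick m of the n + m variables, and check that the resulting m × m Jacobian has a nonzero determinant.

```python
# Hypothetical example: m = 2 constraints in the n + m = 4 variables
# (x1, x2, y1, y2), written with slack variables as in Eq. (2.78):
#   G1 = x1 + x2 - 100 + y1**2 = 0
#   G2 = x1 - x2       + y2**2 = 0
import numpy as np

def G(v):
    x1, x2, y1, y2 = v
    return np.array([x1 + x2 - 100.0 + y1**2,
                     x1 - x2 + y2**2])

def jacobian(v, cols, h=1e-6):
    """Central-difference Jacobian of G with respect to the chosen columns."""
    J = np.zeros((2, len(cols)))
    for k, c in enumerate(cols):
        vp = np.array(v, dtype=float); vp[c] += h
        vm = np.array(v, dtype=float); vm[c] -= h
        J[:, k] = (G(vp) - G(vm)) / (2.0 * h)
    return J

v = [50.0, 50.0, 0.0, 0.0]           # feasible point: G(v) = (0, 0)
J_x = jacobian(v, cols=[0, 1])       # 2 x 2 Jacobian w.r.t. (x1, x2)
J_y = jacobian(v, cols=[2, 3])       # 2 x 2 Jacobian w.r.t. (y1, y2)
print(round(np.linalg.det(J_x), 6))  # -> -2.0  (nonzero)
print(round(np.linalg.det(J_y), 6))  # -> 0.0   (singular at y = 0)
```

Here the Jacobian with respect to the slacks (y1, y2) is singular because both slacks vanish at the feasible point, but the one with respect to (x1, x2) is not — so the requirement that at least one such Jacobian be nonzero is met.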