

where $[\nabla^2 L]_{n \times n}$ denotes the Hessian matrix of the Lagrange function. The first set of
equations in (7.125) can be written separately as

$$[\nabla^2 L]_j \, \Delta X_j + [H]_j \, \Delta \lambda_j = -\nabla L_j \qquad (7.128)$$

Using Eq. (7.127) for $\Delta \lambda_j$ and Eq. (7.119) for $\nabla L_j$, Eq. (7.128) can be expressed as

$$[\nabla^2 L]_j \, \Delta X_j + [H]_j (\lambda_{j+1} - \lambda_j) = -\nabla f_j - [H]_j \lambda_j \qquad (7.129)$$

which can be simplified to obtain

$$[\nabla^2 L]_j \, \Delta X_j + [H]_j \, \lambda_{j+1} = -\nabla f_j \qquad (7.130)$$

Equation (7.130) and the second set of equations in (7.125) can now be combined as
$$\begin{bmatrix} [\nabla^2 L] & [H] \\ [H]^T & [0] \end{bmatrix}_j \begin{Bmatrix} \Delta X_j \\ \lambda_{j+1} \end{Bmatrix} = - \begin{Bmatrix} \nabla f_j \\ h_j \end{Bmatrix} \qquad (7.131)$$

Equations (7.131) can be solved to find the change in the design vector, $\Delta X_j$, and
the new values of the Lagrange multipliers, $\lambda_{j+1}$. The iterative process indicated by
Eq. (7.131) can be continued until convergence is achieved.
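
To make the iteration concrete, the following NumPy sketch assembles and solves the system of Eq. (7.131) repeatedly. The example problem (minimize $f = x_1^2 + x_2^2$ subject to $h = x_1 x_2 - 1 = 0$, whose solution is $X^* = (1, 1)^T$ with $\lambda^* = -2$), the starting point, and all function names are assumptions introduced here for illustration; they are not from the text.

```python
import numpy as np

def kkt_step(X, lam):
    """One iteration of Eq. (7.131): solve for (Delta X_j, lambda_{j+1}).
    Assumed example problem: f = x1^2 + x2^2, h = x1*x2 - 1 = 0."""
    grad_f = 2.0 * X                               # gradient of f at X_j
    H = np.array([[X[1]], [X[0]]])                 # [H] = [grad h], n x p = 2 x 1
    h = np.array([X[0] * X[1] - 1.0])              # constraint value h_j
    # Hessian of the Lagrange function: [Hess f] + lambda_j * [Hess h]
    hess_L = 2.0 * np.eye(2) + lam[0] * np.array([[0.0, 1.0],
                                                  [1.0, 0.0]])
    # Bordered matrix and right-hand side of Eq. (7.131)
    K = np.block([[hess_L, H], [H.T, np.zeros((1, 1))]])
    rhs = -np.concatenate([grad_f, h])
    sol = np.linalg.solve(K, rhs)
    return X + sol[:2], sol[2:]                    # X_{j+1}, lambda_{j+1}

X, lam = np.array([1.5, 0.75]), np.array([0.0])
for _ in range(20):
    X_new, lam = kkt_step(X, lam)
    if np.linalg.norm(X_new - X) < 1e-10:          # convergence test on Delta X_j
        break
    X = X_new
print(X_new, lam)   # approaches [1. 1.] and [-2.]
```

Note that $\lambda_j$ enters each step through the Hessian of the Lagrange function, so the multiplier estimate is refined together with the design vector.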
Now consider the following quadratic programming problem:

Find $\Delta X$ that minimizes the quadratic objective function

$$Q = \nabla f^T \, \Delta X + \tfrac{1}{2} \, \Delta X^T [\nabla^2 L] \, \Delta X$$

subject to the linear equality constraints (7.132)

$$h_k + \nabla h_k^T \, \Delta X = 0, \quad k = 1, 2, \ldots, p \qquad \text{or} \qquad h + [H]^T \, \Delta X = 0$$
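
As a concrete instance of Eq. (7.132) (an illustrative example introduced here, not from the text), take $f(X) = x_1^2 + x_2^2$ and $h(X) = x_1 + x_2 - 1 = 0$ at $X_j = (1, 0)^T$. Since $h$ is linear, $[\nabla^2 L] = 2[I]$ for any $\lambda_j$, and the quadratic programming problem reads

$$\text{Minimize } Q = 2\,\Delta x_1 + \Delta x_1^2 + \Delta x_2^2 \quad \text{subject to} \quad \Delta x_1 + \Delta x_2 = 0$$

Eliminating $\Delta x_2 = -\Delta x_1$ gives $Q = 2\,\Delta x_1 + 2\,\Delta x_1^2$, which is minimized at $\Delta X = (-\tfrac{1}{2}, \tfrac{1}{2})^T$, so that $X_{j+1} = (\tfrac{1}{2}, \tfrac{1}{2})^T$. This is the exact optimum in a single step, since $f$ is quadratic and $h$ is linear.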

The Lagrange function, $\tilde{L}$, corresponding to the problem of Eq. (7.132) is given by

$$\tilde{L} = \nabla f^T \, \Delta X + \tfrac{1}{2} \, \Delta X^T [\nabla^2 L] \, \Delta X + \sum_{k=1}^{p} \lambda_k \left( h_k + \nabla h_k^T \, \Delta X \right) \qquad (7.133)$$

where $\lambda_k$ is the Lagrange multiplier associated with the $k$th equality constraint.
The Kuhn–Tucker necessary conditions can be stated as

$$\nabla f + [\nabla^2 L] \, \Delta X + [H] \lambda = 0 \qquad (7.134)$$

$$h_k + \nabla h_k^T \, \Delta X = 0, \quad k = 1, 2, \ldots, p \qquad (7.135)$$

Equations (7.134) and (7.135) can be identified to be the same as Eq. (7.131) in matrix
form. This shows that the original problem of Eq. (7.117) can be solved iteratively
by solving the quadratic programming problem defined by Eq. (7.132); a numerical
cross-check of this equivalence is sketched below. In fact, when inequality constraints
are added to the original problem, the quadratic programming problem of Eq. (7.132) becomes

Find $\Delta X$ which minimizes $Q = \nabla f^T \, \Delta X + \tfrac{1}{2} \, \Delta X^T [\nabla^2 L] \, \Delta X$
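
As a practical cross-check of the equivalence above, SciPy's SLSQP solver, which implements a sequential quadratic programming method of this general family, can be applied to the same assumed example used after Eq. (7.131); the problem data are again illustrative, not from the text.

```python
from scipy.optimize import minimize

# Assumed example: minimize f = x1^2 + x2^2 subject to h = x1*x2 - 1 = 0.
res = minimize(
    lambda X: X[0]**2 + X[1]**2,           # objective f
    x0=[1.5, 0.75],                        # same starting point as the sketch above
    method="SLSQP",                        # sequential quadratic programming
    constraints=[{"type": "eq", "fun": lambda X: X[0] * X[1] - 1.0}],
)
print(res.x)   # approximately [1. 1.], matching the Newton/QP iteration
```

Inequality constraints of the kind just introduced are passed in the same way, with `"type": "ineq"`.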