
7.20.3 Mixed Equality–Inequality-Constrained Problems


Consider the following general optimization problem:

\[
\text{Minimize } f(\mathbf{X}) \tag{7.255}
\]

subject to

\[
g_j(\mathbf{X}) \le 0, \qquad j = 1, 2, \ldots, m \tag{7.256}
\]
\[
h_j(\mathbf{X}) = 0, \qquad j = 1, 2, \ldots, p \tag{7.257}
\]

This problem can be solved by combining the procedures of the two preceding sections.
The augmented Lagrangian function, in this case, is defined as

\[
A(\mathbf{X}, \boldsymbol{\lambda}, r_k) = f(\mathbf{X}) + \sum_{j=1}^{m} \lambda_j \alpha_j + \sum_{j=1}^{p} \lambda_{m+j} h_j(\mathbf{X}) + r_k \sum_{j=1}^{m} \alpha_j^2 + r_k \sum_{j=1}^{p} h_j^2(\mathbf{X}) \tag{7.258}
\]

where $\alpha_j$ is given by Eq. (7.253), that is, $\alpha_j = \max\{g_j(\mathbf{X}), -\lambda_j/(2 r_k)\}$. The solution of the problem stated in Eqs. (7.255) to (7.257) can be found by minimizing the function $A$, defined by Eq. (7.258), as in the case of equality-constrained problems, using the update formulas

\[
\lambda_j^{(k+1)} = \lambda_j^{(k)} + 2 r_k \max\left\{ g_j(\mathbf{X}),\; -\frac{\lambda_j^{(k)}}{2 r_k} \right\}, \qquad j = 1, 2, \ldots, m \tag{7.259}
\]
\[
\lambda_{m+j}^{(k+1)} = \lambda_{m+j}^{(k)} + 2 r_k h_j(\mathbf{X}), \qquad j = 1, 2, \ldots, p \tag{7.260}
\]

The ALM method has several advantages. As stated earlier, the value of $r_k$ need not be increased to infinity for convergence. The starting design vector, $\mathbf{X}^{(1)}$, need not be feasible. Finally, it is possible to achieve $g_j(\mathbf{X}) = 0$ and $h_j(\mathbf{X}) = 0$ precisely, and the nonzero values of the Lagrange multipliers ($\lambda_j \neq 0$) identify the active constraints automatically.
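The iteration implied by Eqs. (7.258) to (7.260), that is, minimize $A$ for fixed multipliers and $r_k$ and then update the multipliers, can be sketched in code. The following is a minimal sketch, not taken from the text: it assumes SciPy's BFGS routine as the unconstrained minimizer (the text does not prescribe one), and the names alm_minimize, gs, and hs are illustrative.

import numpy as np
from scipy.optimize import minimize

def alm_minimize(f, gs, hs, x0, r=1.0, n_iter=30):
    """Sketch of the ALM iteration for: minimize f(x) subject to g_j(x) <= 0, h_j(x) = 0."""
    lam_g = np.zeros(len(gs))   # multipliers lambda_1, ..., lambda_m (inequalities)
    lam_h = np.zeros(len(hs))   # multipliers lambda_{m+1}, ..., lambda_{m+p} (equalities)
    x = np.asarray(x0, dtype=float)

    for _ in range(n_iter):
        def A(x):
            # alpha_j = max(g_j(x), -lambda_j/(2 r_k)), as used in Eq. (7.258)
            alphas = np.array([max(g(x), -lj / (2.0 * r)) for g, lj in zip(gs, lam_g)])
            hvals = np.array([h(x) for h in hs])
            return (f(x)
                    + lam_g @ alphas + lam_h @ hvals                  # multiplier terms
                    + r * (alphas @ alphas) + r * (hvals @ hvals))    # penalty terms

        # minimize the augmented Lagrangian for the current multipliers and r_k
        x = minimize(A, x, method="BFGS").x

        # multiplier updates of Eqs. (7.259) and (7.260)
        lam_g = lam_g + 2.0 * r * np.array([max(g(x), -lj / (2.0 * r)) for g, lj in zip(gs, lam_g)])
        lam_h = lam_h + 2.0 * r * np.array([h(x) for h in hs])

    return x, lam_g, lam_h

Note that $r_k$ is held fixed in this sketch; it may also be increased moderately between iterations, but, as noted above, it need not grow without bound.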

Example 7.12

\[
\text{Minimize } f(\mathbf{X}) = 6 x_1^2 + 4 x_1 x_2 + 3 x_2^2 \tag{$E_1$}
\]

subject to

\[
h(\mathbf{X}) = x_1 + x_2 - 5 = 0 \tag{$E_2$}
\]

using the ALM method.

SOLUTION The augmented Lagrangian function can be constructed as

\[
A(\mathbf{X}, \lambda, r_k) = 6 x_1^2 + 4 x_1 x_2 + 3 x_2^2 + \lambda (x_1 + x_2 - 5) + r_k (x_1 + x_2 - 5)^2 \tag{$E_3$}
\]
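As a numerical check, not part of the text, the sketch given earlier can be applied to Eqs. (E1) and (E2). For this problem the exact optimum is $\mathbf{X}^* = (1, 4)$ with $f^* = 70$ and $\lambda^* = -28$, which the iteration approaches for a moderate value of $r_k$:

# Example 7.12 via the alm_minimize sketch above (hypothetical usage)
f = lambda x: 6 * x[0]**2 + 4 * x[0] * x[1] + 3 * x[1]**2   # objective of Eq. (E1)
h = lambda x: x[0] + x[1] - 5                                # equality constraint of Eq. (E2)

x_star, _, lam_star = alm_minimize(f, gs=[], hs=[h], x0=[0.0, 0.0], r=10.0)
print(x_star, lam_star)   # approaches [1.0, 4.0] and [-28.0]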