choose to express $dx_1$ in terms of $dx_2$, we would have obtained the requirement that $(\partial g/\partial x_1)|_{(x_1^*,\,x_2^*)}$ be nonzero to define $\lambda$. Thus the derivation of the necessary conditions by the method of Lagrange multipliers requires that at least one of the partial derivatives of $g(x_1, x_2)$ be nonzero at an extreme point.
The necessary conditions given by Eqs. (2.34) to (2.36) are more commonly generated by constructing a function $L$, known as the Lagrange function, as

\[
L(x_1, x_2, \lambda) = f(x_1, x_2) + \lambda\, g(x_1, x_2) \tag{2.37}
\]

By treating $L$ as a function of the three variables $x_1$, $x_2$, and $\lambda$, the necessary conditions for its extremum are given by

\[
\begin{aligned}
\frac{\partial L}{\partial x_1}(x_1, x_2, \lambda) &= \frac{\partial f}{\partial x_1}(x_1, x_2) + \lambda\,\frac{\partial g}{\partial x_1}(x_1, x_2) = 0 \\
\frac{\partial L}{\partial x_2}(x_1, x_2, \lambda) &= \frac{\partial f}{\partial x_2}(x_1, x_2) + \lambda\,\frac{\partial g}{\partial x_2}(x_1, x_2) = 0 \\
\frac{\partial L}{\partial \lambda}(x_1, x_2, \lambda) &= g(x_1, x_2) = 0
\end{aligned}
\tag{2.38}
\]

Equations (2.38) can be seen to be the same as Eqs. (2.34) to (2.36). The sufficiency
conditions are given later.
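As a concrete illustration of how Eqs. (2.37) and (2.38) are used, the short sketch below builds the Lagrange function symbolically and solves its stationarity conditions. It is a minimal sketch, not taken from the book: Python with SymPy is an assumed tool, and the objective and constraint are illustrative choices.

```python
# Minimal sketch (assumed tooling: Python + SymPy) of Eqs. (2.37)-(2.38):
# build L = f + lambda*g and solve dL/dx1 = dL/dx2 = dL/dlambda = 0.
import sympy as sp

x1, x2, lam = sp.symbols('x1 x2 lambda', real=True)

f = x1**2 + x2**2        # illustrative objective f(x1, x2)
g = x1 + x2 - 1          # illustrative constraint g(x1, x2) = 0

L = f + lam * g          # Lagrange function, Eq. (2.37)

# Necessary conditions of Eqs. (2.38): all first partials of L vanish
conditions = [sp.diff(L, v) for v in (x1, x2, lam)]
stationary_points = sp.solve(conditions, [x1, x2, lam], dict=True)
print(stationary_points)   # expect x1 = x2 = 1/2, lambda = -1
```

Note that differentiating $L$ with respect to $\lambda$ simply reproduces the constraint $g(x_1, x_2) = 0$, which is why the constraint does not have to be handled separately once $L$ is formed.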

Example 2.9  Find the solution of Example 2.7 using the Lagrange multiplier method:

Minimize $f(x, y) = k x^{-1} y^{-2}$

subject to

$g(x, y) = x^2 + y^2 - a^2 = 0$

SOLUTION  The Lagrange function is

\[
L(x, y, \lambda) = f(x, y) + \lambda\, g(x, y) = k x^{-1} y^{-2} + \lambda (x^2 + y^2 - a^2)
\]

The necessary conditions for the minimum of $f(x, y)$ [Eqs. (2.38)] give

\[
\begin{aligned}
\frac{\partial L}{\partial x} &= -k x^{-2} y^{-2} + 2 x \lambda = 0 && (E_1) \\
\frac{\partial L}{\partial y} &= -2 k x^{-1} y^{-3} + 2 y \lambda = 0 && (E_2) \\
\frac{\partial L}{\partial \lambda} &= x^2 + y^2 - a^2 = 0 && (E_3)
\end{aligned}
\]

Equations $(E_1)$ and $(E_2)$ yield

\[
2\lambda = \frac{k}{x^3 y^2} = \frac{2k}{x y^4}
\]

from which the relation $x^* = (1/\sqrt{2})\,y^*$ can be obtained. This relation, along with Eq. $(E_3)$, gives the optimum solution as

\[
x^* = \frac{a}{\sqrt{3}} \quad \text{and} \quad y^* = \sqrt{2}\,\frac{a}{\sqrt{3}}
\]
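This optimum can be checked symbolically in the same way as before. The sketch below (again assuming Python with SymPy, which the book itself does not use) solves $(E_1)$ to $(E_3)$ for positive $x$ and $y$ and should recover $x^* = a/\sqrt{3}$ and $y^* = \sqrt{2}\,a/\sqrt{3}$.

```python
# Minimal sketch (assumed tooling: Python + SymPy): verify Example 2.9
# by solving the conditions (E1)-(E3) symbolically.
import sympy as sp

x, y = sp.symbols('x y', positive=True)
lam = sp.symbols('lambda', real=True)
k, a = sp.symbols('k a', positive=True)

f = k / (x * y**2)                    # objective f(x, y) = k x^-1 y^-2
g = x**2 + y**2 - a**2                # constraint g(x, y) = 0
L = f + lam * g                       # Lagrange function

solutions = sp.solve([sp.diff(L, x), sp.diff(L, y), sp.diff(L, lam)],
                     [x, y, lam], dict=True)
for s in solutions:
    print(sp.simplify(s[x]), sp.simplify(s[y]))
# expected: x = a/sqrt(3) (i.e. sqrt(3)*a/3), y = sqrt(2)*a/sqrt(3) (i.e. sqrt(6)*a/3)
```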