7.9 Generalized Reduced Gradient Method

Since

$$[D] = \left[\frac{\partial g_1}{\partial z_1}\right] = \left[4x_3^3\right] = \left[4(1.02595)^3\right] = [4.319551]$$

$$g_1(\mathbf{X}) = \{-2.4684\}$$

$$[C] = \left[\frac{\partial g_1}{\partial y_1} \quad \frac{\partial g_1}{\partial y_2}\right] = \left[\,2(-0.576 + 0.024) \quad {-2}(-0.576 + 0.024) + 4(-0.024 - 1.02595)^3\,\right] = [-1.104 \quad -3.5258]$$

$$d\mathbf{Z} = \frac{1}{4.319551}\left[2.4684 - [-1.104 \quad -3.5258]\begin{Bmatrix} 2.024 \\ -2.024 \end{Bmatrix}\right] = \{-0.5633\}$$

we have $\mathbf{Z}_{\text{new}} = \mathbf{Z}_{\text{old}} + d\mathbf{Z} = \{2 - 0.5633\} = \{1.4367\}$. The current $\mathbf{X}_{\text{new}}$ becomes

$$\mathbf{X}_{\text{new}} = \begin{Bmatrix} \mathbf{Y}_{\text{old}} + d\mathbf{Y} \\ \mathbf{Z}_{\text{old}} + d\mathbf{Z} \end{Bmatrix} = \begin{Bmatrix} -0.576 \\ -0.024 \\ 1.4367 \end{Bmatrix}$$
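The tangent-step arithmetic above can be verified with a short script. The following is a minimal sketch, assuming NumPy; the array names D, C, g1, and dY simply mirror the quantities quoted in the example and are not part of the original text.

```python
import numpy as np

# Quantities quoted above: Y = (x1, x2) are the independent variables, Z = (x3) is dependent.
D = np.array([[4.319551]])          # [D] = [dg1/dz1] = [4*x3**3] at x3 = 1.02595
C = np.array([[-1.104, -3.5258]])   # [C] as evaluated in the example
g1 = np.array([-2.4684])            # constraint value g1(X)
dY = np.array([2.024, -2.024])      # change in the independent variables

# dZ = [D]^{-1} ( -g1(X) - [C] dY ), as in the expression above
dZ = np.linalg.solve(D, -g1 - C @ dY)
Z_new = 2.0 + dZ                    # Z_old = {2}
print(dZ, Z_new)                    # approximately [-0.5633] [1.4367]
```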




The constraint becomes

$$g_1 = (-0.576)\left[1 + (-0.024)^2\right] + (1.4367)^4 - 3 = 0.6842 \neq 0$$

Since this $\mathbf{X}_{\text{new}}$ is infeasible, we need to apply Newton's method
[Eq. (7.108)] at the current $\mathbf{X}_{\text{new}}$. In the present case, instead of repeating
Newton's iteration, we can find the value of $\mathbf{Z}_{\text{new}} = \{x_3\}_{\text{new}}$ by satisfying
the constraint as

$$g_1(\mathbf{X}) = (-0.576)\left[1 + (-0.024)^2\right] + x_3^4 - 3 = 0$$

or

$$x_3 = (2.4237)^{0.25} = 1.2477$$

This gives

$$\mathbf{X}_{\text{new}} = \begin{Bmatrix} -0.576 \\ -0.024 \\ 1.2477 \end{Bmatrix}$$

and

$$f(\mathbf{X}_{\text{new}}) = (-0.576 + 0.024)^2 + (-0.024 - 1.2477)^4 = 2.9201$$
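As a quick numerical check, the objective and constraint written out in this example, $f = (x_1 - x_2)^2 + (x_2 - x_3)^4$ and $g_1 = x_1(1 + x_2^2) + x_3^4 - 3$, can be evaluated at the points obtained above. The following is a minimal sketch; the helper function names are ours.

```python
# Objective and constraint as they appear in this example; helper names are ours.
def f(x1, x2, x3):
    return (x1 - x2) ** 2 + (x2 - x3) ** 4

def g1(x1, x2, x3):
    return x1 * (1 + x2 ** 2) + x3 ** 4 - 3

# At the tangent-step point the constraint is violated, so X_new is infeasible:
print(g1(-0.576, -0.024, 1.4367))   # approximately 0.684

# Objective at the x3 value quoted after satisfying the constraint:
print(f(-0.576, -0.024, 1.2477))    # approximately 2.9201
```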

Next we go to step 1.

Step 1: We do not have to change the set of independent and dependent variables and
hence we go to the next step.
