6.12 Marquardt Method

where $\lambda_i^*$ is found using any of the one-dimensional search methods described in Chapter 5.


Example 6.12 Minimize $f(x_1, x_2) = x_1 - x_2 + 2x_1^2 + 2x_1x_2 + x_2^2$ from the starting point $\mathbf{X}_1 = \begin{Bmatrix} 0 \\ 0 \end{Bmatrix}$ using the Marquardt method with $\alpha_1 = 10^4$, $c_1 = \tfrac{1}{4}$, $c_2 = 2$, and $\varepsilon = 10^{-2}$.


SOLUTION


Iteration 1 (i = 1)


Here $f_1 = f(\mathbf{X}_1) = 0.0$ and

$$\nabla f_1 = \begin{Bmatrix} \dfrac{\partial f}{\partial x_1} \\[4pt] \dfrac{\partial f}{\partial x_2} \end{Bmatrix}_{(0,0)} = \begin{Bmatrix} 1 + 4x_1 + 2x_2 \\ -1 + 2x_1 + 2x_2 \end{Bmatrix}_{(0,0)} = \begin{Bmatrix} 1 \\ -1 \end{Bmatrix}$$

Since $\|\nabla f_1\| = 1.4142 > \varepsilon$, we compute


$$[J_1] = \begin{bmatrix} \dfrac{\partial^2 f}{\partial x_1^2} & \dfrac{\partial^2 f}{\partial x_1 \partial x_2} \\[6pt] \dfrac{\partial^2 f}{\partial x_1 \partial x_2} & \dfrac{\partial^2 f}{\partial x_2^2} \end{bmatrix}_{(0,0)} = \begin{bmatrix} 4 & 2 \\ 2 & 2 \end{bmatrix}$$

$$\begin{aligned} \mathbf{X}_2 &= \mathbf{X}_1 - [[J_1] + \alpha_1 [I]]^{-1} \nabla f_1 \\ &= \begin{Bmatrix} 0 \\ 0 \end{Bmatrix} - \begin{bmatrix} 4 + 10^4 & 2 \\ 2 & 2 + 10^4 \end{bmatrix}^{-1} \begin{Bmatrix} 1 \\ -1 \end{Bmatrix} = \begin{Bmatrix} -0.9998 \\ 1.0000 \end{Bmatrix} \times 10^{-4} \end{aligned}$$
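This step can be checked numerically. The following is a minimal sketch in Python, assuming NumPy is available; the variable names (g1, J1, A) are illustrative and not part of the text:

    import numpy as np

    g1 = np.array([1.0, -1.0])                  # gradient of f at X1 = (0, 0)
    J1 = np.array([[4.0, 2.0], [2.0, 2.0]])     # Hessian [J1] at (0, 0)
    A = J1 + 1e4 * np.eye(2)                    # [J1] + alpha_1 [I]
    X2 = np.zeros(2) - np.linalg.solve(A, g1)   # X2 = X1 - A^{-1} grad(f1)
    print(X2)  # approx [-0.9998e-4, 1.0000e-4]

Solving the linear system with np.linalg.solve, rather than forming the matrix inverse explicitly, is the numerically preferable way to evaluate the update.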

As $f_2 = f(\mathbf{X}_2) = -1.9997 \times 10^{-4} < f_1$, we set $\alpha_2 = c_1 \alpha_1 = 2500$, $i = 2$, and proceed to the next iteration.


Iteration 2 (i = 2)


The gradient vector corresponding to $\mathbf{X}_2$ is given by

$$\nabla f_2 = \begin{Bmatrix} 0.9998 \\ -1.0000 \end{Bmatrix}, \qquad \|\nabla f_2\| = 1.4141 > \varepsilon,$$

and hence we compute


$$\begin{aligned} \mathbf{X}_3 &= \mathbf{X}_2 - [[J_2] + \alpha_2 [I]]^{-1} \nabla f_2 \\ &= \begin{Bmatrix} -0.9998 \times 10^{-4} \\ 1.0000 \times 10^{-4} \end{Bmatrix} - \begin{bmatrix} 2504 & 2 \\ 2 & 2502 \end{bmatrix}^{-1} \begin{Bmatrix} 0.9998 \\ -1.0000 \end{Bmatrix} = \begin{Bmatrix} -4.9958 \times 10^{-4} \\ 5.0000 \times 10^{-4} \end{Bmatrix} \end{aligned}$$

Since $f_3 = f(\mathbf{X}_3) = -0.9993 \times 10^{-3} < f_2$, we set $\alpha_3 = c_1 \alpha_2 = 625$, $i = 3$, and proceed to the next iteration. The iterative process is continued until the convergence criterion, $\|\nabla f_i\| < \varepsilon$, is satisfied.
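For reference, the complete iterative scheme used in this example can be sketched as a short Python program, again assuming NumPy. The names (marquardt, f, grad, hess) are illustrative; the update rule (multiply $\alpha$ by $c_1$ when the step reduces $f$, by $c_2$ otherwise) matches the roles of $c_1$ and $c_2$ in this example:

    import numpy as np

    def f(x):
        x1, x2 = x
        return x1 - x2 + 2*x1**2 + 2*x1*x2 + x2**2

    def grad(x):
        x1, x2 = x
        return np.array([1 + 4*x1 + 2*x2, -1 + 2*x1 + 2*x2])

    def hess(x):
        return np.array([[4.0, 2.0], [2.0, 2.0]])  # constant for this quadratic

    def marquardt(x, alpha=1e4, c1=0.25, c2=2.0, eps=1e-2, max_iter=100):
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < eps:        # convergence test ||grad f_i|| < eps
                break
            A = hess(x) + alpha * np.eye(len(x))
            x_new = x - np.linalg.solve(A, g)  # X_{i+1} = X_i - [[J_i] + alpha_i [I]]^{-1} grad f_i
            if f(x_new) < f(x):
                x, alpha = x_new, c1 * alpha   # success: shrink alpha toward a Newton step
            else:
                alpha = c2 * alpha             # failure: grow alpha toward steepest descent
        return x

    print(marquardt(np.array([0.0, 0.0])))  # approaches the minimum (-1.0, 1.5)

Setting $\nabla f = \mathbf{0}$ gives the exact minimum $\mathbf{X}^* = \{-1, 1.5\}^T$, which the iterates above approach as $\alpha_i \to 0$ and the step tends to a pure Newton step.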
