Engineering Optimization: Theory and Practice, Fourth Edition

Problems

6.35 Consider the minimization of the function


f = \frac{1}{x_1^2 + x_2^2 + 2}

Perform one iteration of Newton's method from the starting point X_1 = \{4, 0\}^T using Eq. (6.86). How much improvement is achieved with X_2?
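
For checking the hand calculation, here is a minimal Python sketch (not from the text) of one Newton step for this objective; it assumes Eq. (6.86) is the standard update X_{i+1} = X_i − [J_i]^{-1} ∇f_i, with the gradient and Hessian coded analytically:

```python
import numpy as np

def f(x):
    # Objective of Problem 6.35
    return 1.0 / (x[0]**2 + x[1]**2 + 2.0)

def grad_f(x):
    u = x[0]**2 + x[1]**2 + 2.0
    return np.array([-2.0 * x[0], -2.0 * x[1]]) / u**2

def hess_f(x):
    u = x[0]**2 + x[1]**2 + 2.0
    # d2f/dx_i dx_j = -2*delta_ij / u^2 + 8*x_i*x_j / u^3
    return -2.0 * np.eye(2) / u**2 + 8.0 * np.outer(x, x) / u**3

X1 = np.array([4.0, 0.0])
X2 = X1 - np.linalg.solve(hess_f(X1), grad_f(X1))   # one Newton step
print("X2 =", X2, " f(X1) =", f(X1), " f(X2) =", f(X2))
```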

6.36 Consider the problem:


Minimize f = 2(x_1 - x_1^2)^2 + (1 - x_1)^2

If a base simplex is defined by the vertices

X_1 = \{0, 0\}^T, \quad X_2 = \{1, 0\}^T, \quad X_3 = \{0, 1\}^T

find a sequence of four improved vectors using reflection, expansion, and/or contraction.
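
The hand-computed simplex moves can be checked against a rough Nelder–Mead-style sketch such as the one below; the coefficients α = 1, γ = 2, β = 0.5 and the helper simplex_step are illustrative assumptions, not the book's worked procedure:

```python
import numpy as np

def f(x):
    # Objective of Problem 6.36 (as printed)
    return 2.0 * (x[0] - x[0]**2)**2 + (1.0 - x[0])**2

def simplex_step(verts, f, alpha=1.0, gamma=2.0, beta=0.5):
    """One reflection/expansion/contraction step; returns (new simplex, new vertex)."""
    verts = sorted(verts, key=f)                    # order vertices best ... worst
    best, worst = verts[0], verts[-1]
    centroid = np.mean(verts[:-1], axis=0)          # centroid excluding the worst vertex
    xr = centroid + alpha * (centroid - worst)      # reflection of the worst vertex
    if f(xr) < f(best):
        xe = centroid + gamma * (xr - centroid)     # expansion
        new = xe if f(xe) < f(xr) else xr
    elif f(xr) < f(worst):
        new = xr                                    # accept the reflected point
    else:
        new = centroid + beta * (worst - centroid)  # contraction
    verts[-1] = new
    return verts, new

verts = [np.array(v, dtype=float) for v in ([0, 0], [1, 0], [0, 1])]
for k in range(4):
    verts, new = simplex_step(verts, f)
    print(f"improved vector {k + 1}: {new}, f = {f(new):.4f}")
```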

6.37 Consider the problem:


Minimize f = (x_1 + 2x_2 - 7)^2 + (2x_1 + x_2 - 5)^2

If a base simplex is defined by the vertices

X_1 = \{-2, -2\}^T, \quad X_2 = \{-3, 0\}^T, \quad X_3 = \{-1, -1\}^T

find a sequence of four improved vectors using reflection, expansion, and/or contraction.
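
The same hypothetical simplex_step sketch shown for Problem 6.36 can be reused here by swapping in this problem's objective and starting vertices:

```python
# Reuses simplex_step (and numpy) from the sketch under Problem 6.36.
def f(x):
    return (x[0] + 2.0 * x[1] - 7.0)**2 + (2.0 * x[0] + x[1] - 5.0)**2

verts = [np.array(v, dtype=float) for v in ([-2, -2], [-3, 0], [-1, -1])]
for k in range(4):
    verts, new = simplex_step(verts, f)
    print(f"improved vector {k + 1}: {new}, f = {f(new):.4f}")
```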

6.38 Consider the problem:


f = 100(x_2 - x_1^2)^2 + (1 - x_1)^2

Find the solution of the problem using grid search with a step size Δx_i = 0.1 in the range −3 ≤ x_i ≤ 3, i = 1, 2.
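
A brute-force grid evaluation such as the sketch below (variable names are illustrative only) can be used to check the result; it evaluates f at every point of the 61 × 61 grid and reports the best one:

```python
import numpy as np

def f(x1, x2):
    # Rosenbrock-type objective of Problem 6.38
    return 100.0 * (x2 - x1**2)**2 + (1.0 - x1)**2

grid = np.arange(-3.0, 3.0 + 1e-9, 0.1)            # step 0.1, end point 3.0 included
X1, X2 = np.meshgrid(grid, grid)
F = f(X1, X2)
i, j = np.unravel_index(np.argmin(F), F.shape)
print("best grid point:", (X1[i, j], X2[i, j]), "f =", F[i, j])
```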

6.39 Show that the property of quadratic convergence of conjugate directions is independent
of the order in which the one-dimensional minimizations are performed by considering
the minimization of


f = 6x_1^2 + 2x_2^2 - 6x_1 x_2 - x_1 - 2x_2

using the conjugate directions S_1 = \{1, 2\}^T and S_2 = \{1, 0\}^T and the starting point X_1 = \{0, 0\}^T.
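
One way to verify the order-independence numerically (an illustration, not the required proof) is to perform exact line minimizations along S_1 then S_2 and again in the reverse order; the sketch below uses the Hessian [A] and linear term B read off from f:

```python
import numpy as np

A = np.array([[12.0, -6.0], [-6.0, 4.0]])   # Hessian of f
b = np.array([-1.0, -2.0])                  # so grad f = A @ x + b

def line_min(x, s):
    # Exact minimizing step along direction s for the quadratic
    lam = -(s @ (A @ x + b)) / (s @ A @ s)
    return x + lam * s

S1, S2, X1 = np.array([1.0, 2.0]), np.array([1.0, 0.0]), np.zeros(2)
for order in [(S1, S2), (S2, S1)]:
    x = X1.copy()
    for s in order:
        x = line_min(x, s)
    print("end point:", x)                   # both orders reach the same minimum
```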

6.40 Show that the optimal step length λ_i^* that minimizes f(X) along the search direction S_i = −∇f_i is given by Eq. (6.75).
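
As a hint, assuming the quadratic model f(X) = \tfrac{1}{2} X^T [A] X + B^T X + c used in the chapter, setting the derivative of f(X_i + \lambda S_i) with respect to \lambda to zero gives (the final expression should correspond to Eq. (6.75)):

\frac{d}{d\lambda} f(X_i + \lambda S_i)
  = S_i^T \nabla f(X_i + \lambda S_i)
  = S_i^T \nabla f_i + \lambda\, S_i^T [A] S_i = 0
\quad\Rightarrow\quad
\lambda_i^* = -\frac{S_i^T \nabla f_i}{S_i^T [A] S_i}
  = \frac{\nabla f_i^T \nabla f_i}{\nabla f_i^T [A] \nabla f_i}
  \quad \text{when } S_i = -\nabla f_i .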


6.41 Show that β_2 in Eq. (6.76) is given by Eq. (6.77).


6.42 Minimize f = 2x_1^2 + x_2^2 from the starting point (1, 2) using the univariate method (two iterations only).
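
A quick numerical check of the hand iterations is sketched below; here one "iteration" is taken to mean a full cycle of exact line searches along x_1 and then x_2 (an assumption about the counting):

```python
import numpy as np

A = np.array([[4.0, 0.0], [0.0, 2.0]])       # Hessian of f = 2*x1^2 + x2^2
grad = lambda x: A @ x                       # gradient of the quadratic

x = np.array([1.0, 2.0])
for it in range(2):                          # two iterations (cycles)
    for e in np.eye(2):                      # search along x1, then along x2
        lam = -(e @ grad(x)) / (e @ A @ e)   # exact step length along coordinate e
        x = x + lam * e
    print(f"after iteration {it + 1}: x = {x}, f = {2*x[0]**2 + x[1]**2:.4f}")
```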


6.43 Minimize f = 2x_1^2 + x_2^2 by using the steepest descent method with the starting point (1, 2) (two iterations only).
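
The corresponding steepest-descent check (a sketch, using the exact quadratic step length λ* = ∇f^T ∇f / (∇f^T [A] ∇f)):

```python
import numpy as np

A = np.array([[4.0, 0.0], [0.0, 2.0]])       # Hessian of f = 2*x1^2 + x2^2

x = np.array([1.0, 2.0])
for it in range(2):
    g = A @ x                                # gradient of the quadratic at x
    lam = (g @ g) / (g @ A @ g)              # exact minimizing step along -g
    x = x - lam * g
    print(f"X{it + 2} = {x}")
```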


6.44 Minimize f = x_1^2 + 3x_2^2 + 6x_3^2 by Newton's method using the starting point (2, −1, 1).
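
Since f is quadratic, a single Newton step reaches the minimum at the origin; a short sketch for verification:

```python
import numpy as np

J = np.diag([2.0, 6.0, 12.0])                # Hessian of f = x1^2 + 3*x2^2 + 6*x3^2
grad = lambda x: J @ x                       # gradient of the quadratic

X1 = np.array([2.0, -1.0, 1.0])
X2 = X1 - np.linalg.solve(J, grad(X1))       # one Newton step
print("X2 =", X2)                            # expect the zero vector
```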
