Computer Aided Engineering Design



$$g'(x_i) \approx \frac{g(x_i) - g(x_{i-1})}{x_i - x_{i-1}}$$
substitution of which into the Newton-Raphson formula gives


$$x_{i+1} = x_i - \frac{g(x_i)\,(x_i - x_{i-1})}{g(x_i) - g(x_{i-1})} \qquad (12.8)$$

which is the iterative relation for the secant method. Note that two initial estimates are required to
initiate the procedure. The above relation is very similar to that of the regula falsi approach; the
difference, however, is that the values $x_{i-1}$ and $x_i$ used to compute the new guess $x_{i+1}$ may
not necessarily bracket the root, and thus the secant method may diverge. Secondly, being derived from
the Newton-Raphson method, the secant method is expected to converge much faster than the regula
falsi approach.
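A minimal sketch of Eq. (12.8) in Python follows; the function name secant, the tolerance and the iteration cap are illustrative choices, not from the text.

def secant(g, x0, x1, tol=1e-10, max_iter=100):
    """Find a root of g using the secant iteration of Eq. (12.8).

    Two initial estimates x0 and x1 are required; unlike regula falsi,
    they need not bracket the root, so the iteration may diverge.
    """
    for _ in range(max_iter):
        g0, g1 = g(x0), g(x1)
        if g1 - g0 == 0.0:  # flat secant line; cannot form the next iterate
            raise ZeroDivisionError("g(x_i) - g(x_{i-1}) vanished")
        # Eq. (12.8): x_{i+1} = x_i - g(x_i)(x_i - x_{i-1}) / (g(x_i) - g(x_{i-1}))
        x2 = x1 - g1 * (x1 - x0) / (g1 - g0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    raise RuntimeError("secant method did not converge")

# Example: root of g(x) = x^2 - 2, starting from x0 = 1, x1 = 2
root = secant(lambda x: x * x - 2.0, 1.0, 2.0)   # ~1.41421356 (sqrt(2))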


12.3 Multivariable Optimization


In real-life situations, there is usually more than one variable, that is, a set $\{x_1, x_2, x_3, \ldots, x_n\}$,
which determines the state of an engineering system. An overview is first given of classical methods in
multivariable optimization, involving some definitions, mathematical models, theorems and solutions
to simple problems by way of Lagrange multipliers. Subsequent sections deal with linear/nonlinear
unconstrained/constrained methods, with emphasis on frequently used programming algorithms. A
detailed treatise on the subject, however, can only be found in a text dedicated to optimization.


12.3.1 Classical Multivariable Optimization

First, the focus is on determining an optimal value of a function dependent on several variables when
the latter are not constrained, that is, the variables are not required to adhere to certain conditions. Let
$f$ be a function of $n$ variables of the form $f(x_1, x_2, \ldots, x_n) \equiv f(\mathbf{X})$. A point $\mathbf{X}_0$ is an extremum if, for
all $\mathbf{X}$ in the neighborhood of $\mathbf{X}_0$, $f(\mathbf{X}) \le f(\mathbf{X}_0)$ (relative maximum at $\mathbf{X}_0$) or $f(\mathbf{X}) \ge f(\mathbf{X}_0)$ (relative
minimum at $\mathbf{X}_0$). We can focus on minimization problems, noting that maximization of $f(\mathbf{X})$ can be
converted into the minimization of $-f(\mathbf{X})$ or, provided $f(\mathbf{X}) > 0$, of $1/f(\mathbf{X})$.
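As a quick sketch of this conversion using SciPy's general-purpose minimizer (the sample function and starting point are arbitrary choices, not from the text):

from scipy.optimize import minimize

# Maximize f(x) = -(x - 2)^2 + 5 by minimizing -f(x)
f = lambda x: -(x[0] - 2.0) ** 2 + 5.0
res = minimize(lambda x: -f(x), x0=[0.0])
print(res.x)  # ~[2.0], the maximizer of f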


A function $f(\mathbf{X})$ has an extreme point at $\mathbf{X}_0$ only if

$$\frac{\partial f(\mathbf{X})}{\partial \mathbf{X}}\bigg|_{\mathbf{X}_0} = \mathbf{0}$$

at that point, where $\partial f(\mathbf{X})/\partial \mathbf{X}$ is a vector denoting $[(\partial/\partial x_1) f(\mathbf{X}),\ (\partial/\partial x_2) f(\mathbf{X}), \ldots, (\partial/\partial x_n) f(\mathbf{X})]$.
This gives the first order necessary conditions for optimality of a multivariate function. Expanding $f(\mathbf{X})$ in the neighborhood
of $\mathbf{X}_0$ using the Taylor series up to the first-order term, we have


$$f(\mathbf{X}_0 + \Delta\mathbf{X}) = f(\mathbf{X}_0) + \frac{\partial f(\mathbf{X})}{\partial \mathbf{X}}\bigg|_{\mathbf{X}_0}\, \Delta\mathbf{X} \qquad (12.9)^1$$
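As an illustrative numerical check of Eq. (12.9) (a sketch; the quadratic f, the point X0 and the step ΔX below are arbitrary choices, not from the text):

import numpy as np

def f(X):  # sample function f(x1, x2) = x1^2 + 3 x2^2
    return X[0] ** 2 + 3.0 * X[1] ** 2

def grad_f(X):  # its gradient [2 x1, 6 x2]
    return np.array([2.0 * X[0], 6.0 * X[1]])

X0 = np.array([1.0, -1.0])
dX = np.array([1e-3, 2e-3])  # small perturbation ΔX

exact = f(X0 + dX)
first_order = f(X0) + grad_f(X0) @ dX  # right-hand side of Eq. (12.9)
print(exact, first_order)  # agree to O(||ΔX||^2)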

For $f(\mathbf{X}_0 + \Delta\mathbf{X}) \ge f(\mathbf{X}_0)$ for a relative minimum at $\mathbf{X}_0$,

$$\frac{\partial f(\mathbf{X})}{\partial \mathbf{X}}\bigg|_{\mathbf{X}_0}\, \Delta\mathbf{X} \ge 0 \qquad (12.10)$$

for any $\Delta\mathbf{X}$, which will be satisfied if and only if

$$\frac{\partial f(\mathbf{X})}{\partial \mathbf{X}}\bigg|_{\mathbf{X}_0} = \mathbf{0}$$

Note that the same can be argued for a relative maximum at $\mathbf{X}_0$.
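As a brief illustration of these first order conditions (an example added here, not from the text), consider $f(x_1, x_2) = (x_1 - 1)^2 + (x_2 + 2)^2$. Setting

$$\frac{\partial f(\mathbf{X})}{\partial \mathbf{X}} = [2(x_1 - 1),\ 2(x_2 + 2)] = \mathbf{0}$$

gives $\mathbf{X}_0 = (1, -2)$, at which $f(\mathbf{X}_0) = 0$ is the relative (here also global) minimum.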
^1 $\partial f(\mathbf{X})/\partial \mathbf{X}\,|_{\mathbf{X}_0}$ implies that the partial derivatives of $f(\mathbf{X})$ w.r.t. $\mathbf{X}$ are evaluated at $\mathbf{X}_0$.
