
Given a multivariate function $f(x_1, \dots, x_n)$, consider the matrix
formed by its second-order partial derivatives. This matrix is called the
Hessian matrix, and its determinant, denoted by $H$, is called the Hessian
determinant (see Chapter 5 for the definition of matrices and determinants):


$$
H = \begin{vmatrix}
\dfrac{\partial^2 f}{\partial x_1^2} & \cdots & \dfrac{\partial^2 f}{\partial x_1\,\partial x_n} \\
\vdots & \ddots & \vdots \\
\dfrac{\partial^2 f}{\partial x_n\,\partial x_1} & \cdots & \dfrac{\partial^2 f}{\partial x_n^2}
\end{vmatrix}
$$
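As a concrete illustration, the Hessian matrix and its determinant can be computed symbolically. The sketch below is a minimal example assuming Python with the sympy library; the sample function $f(x_1, x_2) = x_1^2 + 3x_1x_2 + 2x_2^2$ is an illustrative choice, not taken from the text.

```python
# A minimal sketch, assuming sympy is available; the sample function is
# chosen for illustration only.
from sympy import symbols, hessian

x1, x2 = symbols('x1 x2')
f = x1**2 + 3*x1*x2 + 2*x2**2  # assumed example function

H_matrix = hessian(f, (x1, x2))  # matrix of second-order partial derivatives
H = H_matrix.det()               # the Hessian determinant, denoted H above

print(H_matrix)  # Matrix([[2, 3], [3, 4]])
print(H)         # -1
```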

A point $(a_1, \dots, a_n)$ is called a relative (local) maximum or a relative
(local) minimum of the function $f$ if the relationship

$$
f(a_1 + h_1, \dots, a_n + h_n) \le f(a_1, \dots, a_n), \qquad \lVert h \rVert \le d
$$

or, respectively,

$$
f(a_1 + h_1, \dots, a_n + h_n) \ge f(a_1, \dots, a_n), \qquad \lVert h \rVert \le d
$$

holds for every $h = (h_1, \dots, h_n)$ with $\lVert h \rVert \le d$, for some real positive number $d > 0$.
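To make the definition concrete, the following sketch checks numerically that the assumed example function $f(x_1, x_2) = -(x_1^2 + x_2^2)$ satisfies the maximum condition at $a = (0, 0)$ for sampled perturbations with $\lVert h \rVert \le d$; the function, the point, and the radius are all illustrative assumptions.

```python
# A minimal numerical check of the definition; the function and the point
# a = (0, 0) are assumed examples, not taken from the text.
import numpy as np

f = lambda x: -(x[0]**2 + x[1]**2)   # has a relative maximum at the origin
a = np.zeros(2)
d = 0.1                              # radius d > 0 from the definition

rng = np.random.default_rng(0)
h = rng.uniform(-d, d, size=(1000, 2))
h = h[np.linalg.norm(h, axis=1) <= d]    # keep perturbations with ||h|| <= d

# f(a + h) <= f(a) for every sampled h, as the maximum condition requires
print(all(f(a + hi) <= f(a) for hi in h))  # True
```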
A necessary, but not sufficient, condition for a point $(x_1, \dots, x_n)$ to be
a relative maximum or minimum is that all first-order partial derivatives
evaluated at that point vanish, that is, that the following relationship
holds:

$$
\operatorname{grad}[f(x_1, \dots, x_n)] = \left( \frac{\partial f}{\partial x_1}, \dots, \frac{\partial f}{\partial x_n} \right) = (0, \dots, 0)
$$

A point where the gradient vanishes is called a critical point.
A critical point can be a maximum, a minimum or a saddle point.
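In practice, critical points can be located by solving the first-order conditions. The sketch below is a minimal example assuming sympy, with an illustrative function of two variables: it sets every partial derivative to zero and solves the resulting system.

```python
# A minimal sketch, assuming sympy; the function is an illustrative choice.
from sympy import symbols, diff, solve

x1, x2 = symbols('x1 x2')
f = x1**2 + x2**2 - 2*x1                # assumed example function

grad = [diff(f, v) for v in (x1, x2)]   # first-order partial derivatives
critical_points = solve(grad, (x1, x2), dict=True)

print(critical_points)  # [{x1: 1, x2: 0}], the single critical point
```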
For functions of one variable, the following sufficient conditions hold:

■ If the first derivative evaluated at a point $a$ vanishes and the second
derivative evaluated at $a$ is positive, then the point $a$ is a (relative)
minimum.
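This one-variable test can be scripted directly. The sketch below is a minimal example assuming sympy and an illustrative function $f(x) = (x - 2)^2 + 1$: it finds the points where $f'$ vanishes and classifies each by the sign of $f''$.

```python
# A minimal sketch of the one-variable second-derivative test, assuming
# sympy; the function f is an illustrative choice.
from sympy import symbols, diff, solve

x = symbols('x')
f = (x - 2)**2 + 1                  # assumed example function

f1 = diff(f, x)                     # first derivative
f2 = diff(f, x, 2)                  # second derivative

for a in solve(f1, x):              # critical points: f'(a) = 0
    if f2.subs(x, a) > 0:
        print(f"x = {a}: relative minimum")   # here f''(2) = 2 > 0
    elif f2.subs(x, a) < 0:
        print(f"x = {a}: relative maximum")
```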