
■ If the first derivative evaluated at a point a vanishes and the second
derivative evaluated at a is negative, then the point a is a (relative)
maximum.
■ If the first derivative evaluated at a point a vanishes and the second
derivative evaluated at a also vanishes, then the second-order test is
inconclusive: the point a may be a relative maximum, a relative minimum,
or an inflection point (compare x^4, -x^4, and x^3 at x = 0).
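
For example, for f(x) = x^3 - 3x the first derivative 3x^2 - 3 vanishes at
x = ±1; the second derivative 6x is negative at x = -1, a relative maximum,
and positive at x = +1, a relative minimum.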

In the case of a function f(x,y) of two variables x,y, the following
conditions hold:

■ If ∇f = 0 at a given point a and if the Hessian determinant evaluated at
a is positive, then the function f has a relative maximum at a if fxx < 0
or fyy < 0 and a relative minimum if fxx > 0 or fyy > 0. Note that if the
Hessian determinant is positive, the two second derivatives fxx and fyy
must have the same sign.
■ If ∇f = 0 at a given point a and if the Hessian determinant evaluated at
a is negative, then the function f has a saddle point at a.
■ If ∇f = 0 at a given point a and if the Hessian determinant evaluated at
a vanishes, then the point a is degenerate and no conclusion can be
drawn in this case.
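
For example, for f(x,y) = x^2 - y^2 the gradient vanishes at the origin,
where fxx = 2, fyy = -2, fxy = 0, so the Hessian determinant
fxx·fyy - (fxy)^2 = -4 is negative and the origin is a saddle point; for
f(x,y) = x^2 + y^2 the determinant at the origin is +4 with fxx = 2 > 0, so
the origin is a relative minimum.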

The above conditions can be expressed in a more compact way if we
consider the eigenvalues (see Chapter 5) of the Hessian matrix. If both
eigenvalues are positive at a critical point a, the function has a local
minimum at a; if both are negative the function has a local maximum; if
they have opposite signs, the function has a saddle point; and if at least
one of them is 0, the critical point is degenerate. Recall that the product
of the eigenvalues is equal to the Hessian determinant.
This analysis carries over to the three-dimensional case. In this
case there will be three eigenvalues, all of which are positive at a local
minimum and negative at a local maximum. A critical point of a function
of three variables is degenerate if at least one of the eigenvalues of the
Hessian matrix is 0, and is a saddle point if at least one eigenvalue
is positive, at least one is negative, and none is 0.
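
The eigenvalue criterion is easy to apply numerically. The following short
Python sketch (the example function and the helper name
classify_critical_point are our own illustrative choices, not from the text)
evaluates the constant Hessian of f(x,y,z) = x^2 + y^2 - z^2 at its critical
point (0,0,0) and classifies the point from the signs of the eigenvalues.

import numpy as np

def classify_critical_point(hessian, tol=1e-10):
    # Eigenvalues of the symmetric Hessian matrix (eigvalsh assumes symmetry).
    eigenvalues = np.linalg.eigvalsh(hessian)
    if np.any(np.abs(eigenvalues) < tol):
        return "degenerate: at least one eigenvalue is (numerically) zero"
    if np.all(eigenvalues > 0):
        return "local minimum: all eigenvalues positive"
    if np.all(eigenvalues < 0):
        return "local maximum: all eigenvalues negative"
    return "saddle point: eigenvalues of both signs"

# Hessian of f(x,y,z) = x**2 + y**2 - z**2; it is constant, diag(2, 2, -2),
# and (0,0,0) is the only critical point.
H = np.diag([2.0, 2.0, -2.0])
print(classify_critical_point(H))   # saddle point: eigenvalues of both signs

The helper only inspects the signs of the eigenvalues, so it applies equally
to the two- and three-dimensional cases discussed above.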
In higher dimensions, the situation is more complex and goes beyond
the scope of our introduction to optimization.

LAGRANGE MULTIPLIERS


Consider a multivariate function f(x1,...,xn) of n real-valued variables.
In the previous section we saw that, if the n variables are unconstrained,
a local optimum of f can be found by solving the n equations:

∂f/∂xi = 0, i = 1,...,n
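
As a minimal sketch of this unconstrained first-order condition (the
quadratic objective below is purely illustrative and not from the text), the
stationary point can be found symbolically with SymPy by setting each
partial derivative to zero and solving the resulting system.

import sympy as sp

x1, x2 = sp.symbols("x1 x2", real=True)
# Illustrative objective: a strictly convex quadratic, so the unique
# stationary point is the global minimum.
f = x1**2 + 2*x2**2 - 2*x1 - 8*x2 + 3

# The n first-order conditions: each partial derivative set to zero.
equations = [sp.diff(f, v) for v in (x1, x2)]
solution = sp.solve(equations, [x1, x2], dict=True)
print(solution)   # [{x1: 1, x2: 2}]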