In the multidimensional case, there are many good algorithms, yet there
is no perfect algorithm. Some methods work better than others for certain
types of problems. Generally, personal preference is the main determinant
in selecting a multidimensional optimization technique (provided one has
the computer hardware necessary for the chosen technique).
Multidimensional techniques can be classified according to five broad
categories.
First are the hill-climbing simplex methods. These are perhaps the least
efficient of all, in that the computational burden can get a little heavy.
However, they are often easy to implement and do not require the calculation
of partial first derivatives. Unfortunately, they tend to be slow, and their
storage requirements are on the order of n^2.
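As a minimal sketch (assuming Python and SciPy, which the text does not
prescribe), a hill-climbing simplex search on a hypothetical two-parameter
test function looks like this; note that no derivatives are supplied:

import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Hypothetical two-parameter test function, standing in for
    # whatever criterion is actually being optimized.
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

# Nelder-Mead is SciPy's downhill simplex method; it needs only
# function evaluations, no partial first derivatives.
result = minimize(objective, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
print(result.x)  # converges near (1, -2)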
The second family are the direction set methods, also known as the
line minimization methods or conjugate direction methods. Most notable
among these are the various methods of Powell. These are more efficient,
in terms of speed, than the hill-climbing simplex methods (not to be con-
fused with the simplex algorithm for linear functions mentioned earlier),
do not require the calculation of partial first derivatives, yet the storage
requirements are still on the order of n^2.
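A sketch of the same search using Powell's direction set method (again via
SciPy, an assumed library choice) is nearly identical, and likewise requires
no derivatives:

from scipy.optimize import minimize

# Hypothetical two-parameter objective; Powell's conjugate direction
# method performs successive line minimizations and needs no gradients.
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
result = minimize(f, x0=[0.0, 0.0], method="Powell")
print(result.x)  # converges near (1, -2)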
The third family is the conjugate gradient methods. Notable among
these are the Fletcher-Reeves method and the closely related Polak-Ribiere
method. These tend to be among the most efficient of all methods in terms
of speed and storage (requiring storage on the order of only a few times n),
yet they do require the calculation of partial first derivatives.
The fourth family of multidimensional optimization techniques are the
quasi-Newton, or variable metric, methods. These include the Davidon-
Fletcher-Powell (DFP) and the Broyden-Fletcher-Goldfarb-Shanno (BFGS)
algorithms. Like the conjugate gradient methods, these require the calcula-
tion of partial first derivatives and tend to converge rapidly to an extremum,
yet they require greater storage, on the order of n^2. The tradeoff relative
to the conjugate gradient methods is that these methods have been around
longer, are in more widespread use, and are more thoroughly documented.
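A corresponding sketch for the quasi-Newton family (SciPy's BFGS, an
assumed choice), using the same hypothetical objective and gradient:

import numpy as np
from scipy.optimize import minimize

def f(x):
    return (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 2.0) ** 2

def grad(x):
    return np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 2.0)])

# BFGS maintains an n-by-n approximation to the inverse Hessian,
# hence the order-n^2 storage noted above.
result = minimize(f, x0=np.zeros(2), method="BFGS", jac=grad)
print(result.x)  # converges near (1, -2)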
The fifth family is the natural simulation family of multidimensional
optimization techniques. These are by far the most fascinating, as they seek
extrema by simulating processes found in nature, where nature herself is
thought to seek extrema. Among these techniques are the genetic algorithm
method, which seeks extrema through a survival-of-the-fittest process, and
simulated annealing, a technique which simulates crystallization, a process
whereby a system finds its minimum energy state. These techniques tend to
be the most robust of all methods, nearly immune to local extrema, and can
solve problems of gigantic complexity. However, they are not necessarily
the quickest, and, in most cases, will not be. These techniques are still so
new that very little is known about them yet.
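As an illustrative toy (not a production implementation, and not drawn from
the text), a bare-bones simulated annealing loop minimizing a hypothetical
one-dimensional multimodal function:

import math
import random

def f(x):
    # Hypothetical multimodal function with many local minima.
    return x * x + 10.0 * math.sin(3.0 * x)

x = random.uniform(-5.0, 5.0)   # random starting state
best_x, best_f = x, f(x)
temp = 5.0                      # initial "temperature"
while temp > 1e-3:
    candidate = x + random.gauss(0.0, temp)   # random perturbation
    delta = f(candidate) - f(x)
    # Always accept improvements; accept uphill moves with a Boltzmann
    # probability, which is what lets the search escape local extrema
    # as the system "cools" toward its minimum energy state.
    if delta < 0.0 or random.random() < math.exp(-delta / temp):
        x = candidate
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    temp *= 0.999               # slow cooling schedule
print(best_x, best_f)

The slow cooling schedule is what trades speed for robustness: the search
wanders widely at high temperature and only settles as the temperature
falls, mirroring the point above that these methods are robust but rarely
the quickest.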
