done by Grassberger for polymer chains consisting of two different types of beads
[49]. This method is closely related to simulated annealing [50], which is a special
version of the Metropolis algorithm. In simulated annealing, the merit function is
viewed as an energy. Applying the standard Metropolis ensemble, we would generate configurations distributed according to the Boltzmann factor exp[−E/(k_B T)].
In simulated annealing, we slowly cool the system down. The idea behind this is
that at higher temperatures, the system can easily hop over the barriers, thereby
probing a large part of the phase space. As the system is cooled, it samples only the lower energy domains and finally ends up in one of the deepest valleys. It is obviously efficient to run several samples of the system in parallel, so that in fact a population of systems is cooled down and we finally take the one that has reached the lowest energy. Alternatively, one could heat the system up again and then anneal it once more, and so on. It is always advisable in these simulations to combine the
stochastic algorithm with a deterministic minimum finder such as the conjugate
gradient method, in order to efficiently find the deepest point of an attraction basin
from each point visited.
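A minimal sketch of such an annealing loop might look as follows; this is not code from the text, and the merit function energy, the geometric cooling schedule and all parameter values are illustrative choices only:

```python
import numpy as np

def simulated_anneal(energy, x0, T0=1.0, cooling=0.999, steps=100000,
                     step_size=0.1, seed=None):
    """Metropolis walk at a slowly decreasing temperature (illustrative)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    E = energy(x)
    best_x, best_E = x.copy(), E
    T = T0
    for _ in range(steps):
        x_new = x + step_size * rng.standard_normal(x.shape)  # trial move
        E_new = energy(x_new)
        # Metropolis acceptance with the Boltzmann factor exp[-(E_new - E)/T]
        if E_new < E or rng.random() < np.exp(-(E_new - E) / T):
            x, E = x_new, E_new
            if E < best_E:              # keep the lowest energy seen so far
                best_x, best_E = x.copy(), E
        T *= cooling                    # slowly cool the system down
    return best_x, best_E
```

In the spirit of the text, one would run several independent copies of this loop (the "population" mentioned above) and keep the lowest-energy result, and each visited point could additionally be polished with a deterministic minimiser such as conjugate gradients (e.g. scipy.optimize.minimize with method='CG').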
The idea behind these methods is called ‘configurational search’ for obvious reasons. Another concept in this field is ‘energy sculpting’. This trick tries to overcome
the difficulty that a method focusing on the low-energy parts of the phase space will
automatically avoid the (sometimes) high (free) energy barriers separating the different energy minima. This is done by replacing the energy (i.e. the merit function) by a modified one. An extreme version of this is the basin hopping method by Doye and Wales, which was used to find optimal conformations of Lennard–Jones clusters
[51]. In this method, the energy of a point is replaced by the energy minimum of the
attraction basin the point is in. This implies that all points in an attraction basin have
the same (modified) energy. The energy steps up or down when moving from one
attraction basin to another – the barriers between the basins are entirely removed.
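A minimal sketch of this transformed-landscape walk is given below, assuming a differentiable energy function so that a conjugate-gradient quench (scipy.optimize.minimize with method='CG') can map any point to the minimum of its attraction basin; the step size and fictitious temperature are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize

def basin_hop(energy, x0, steps=1000, step_size=0.5, T=1.0, seed=None):
    """Monte Carlo walk on the modified energy: each point is assigned the
    energy of the minimum of the attraction basin it lies in."""
    rng = np.random.default_rng(seed)
    res = minimize(energy, np.asarray(x0, dtype=float), method='CG')  # quench
    x, E = res.x, res.fun
    best_x, best_E = x.copy(), E
    for _ in range(steps):
        x_trial = x + step_size * rng.standard_normal(x.shape)
        res = minimize(energy, x_trial, method='CG')  # energy of the new basin
        # Metropolis step on the stepwise-constant landscape; the barriers
        # between basins no longer enter the acceptance criterion.
        if res.fun < E or rng.random() < np.exp(-(res.fun - E) / T):
            x, E = res.x, res.fun
            if E < best_E:
                best_x, best_E = x.copy(), E
    return best_x, best_E
```

SciPy provides a ready-made implementation of this scheme, scipy.optimize.basinhopping.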
Finally, another method deserves mention: the genetic algorithm (GA) [52, 53].
This algorithm is inspired by the ideas of evolution theory and employs them to
find optimal solutions to combinatorial or continuous problems. These algorithms
are based on encoding any point in phase space into a linear chain. This can be a
binary chain. For example, let us suppose that our merit function f depends on N variables x_j:

f = f(x_1, ..., x_N).    (10.63)
We restrict the range of acceptable values x_j to some finite interval. Within this interval, we allow for a number (256 or 512, say) of different equidistant values. These values can be coded as a bit-string. It is also possible to run the algorithm with the string of reals x_1, ..., x_N. The algorithm now manipulates a population of such strings. We start with a pool of M individuals. Then we do the