does not affect the sum. In fact, it is possible to generate artificial number sequences
for which no attempt is made to achieve (pseudo-)randomness, but which fill a high-
dimensional volume so homogeneously that they are well suited for integration.
The resulting method is called ‘quasi-Monte Carlo’ [9].
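As an illustration (not taken from the book, and with an integrand and sequence chosen purely for demonstration), the following Python fragment compares a hand-rolled Halton sequence, one common low-discrepancy construction, with pseudo-random points for a six-dimensional integral:

import numpy as np

def halton(n, dims):
    """First n points of a Halton low-discrepancy sequence in [0,1)^dims."""
    primes = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
    pts = np.empty((n, dims))
    for d in range(dims):
        base = primes[d]
        for i in range(n):
            f, x, k = 1.0, 0.0, i + 1
            while k > 0:                 # radical inverse of i+1 in this base
                f /= base
                x += f * (k % base)
                k //= base
            pts[i, d] = x
    return pts

# Estimate the integral of prod_j cos(x_j) over [0,1)^6; exact value is sin(1)^6
dims, n = 6, 4096
f = lambda p: np.prod(np.cos(p), axis=1)
quasi  = f(halton(n, dims)).mean()
pseudo = f(np.random.default_rng(0).random((n, dims))).mean()
print(f"exact {np.sin(1.0)**6:.6f}  quasi-MC {quasi:.6f}  pseudo-MC {pseudo:.6f}")

Because the Halton points cover the unit cube more evenly than pseudo-random points do, the quasi-Monte Carlo estimate typically lies noticeably closer to the exact value for the same number of integration points.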
10.3 Importance sampling through Markov chains
We now explain the importance sampling method for classical many-particle sys-
tems in the canonical or (NVT) ensemble; extensions to other ensembles will be
discussed later in this chapter. When calculating averages in the (NVT) ensemble,
the configurations should be weighted according to the Boltzmann factor
ρ(X) ∝ exp[−βE(X)],   β = 1/(k_B T). (10.8)
This suggests that we apply MC integration with importance sampling, as the phase
space over which we must integrate is high-dimensional. The method that first
comes to mind for generating phase space points (configurations) with a Boltzmann
distribution is the von Neumann method discussed in Appendix B3. In our case this
would mean generating random configurations and accepting them with probabil-
ity exp[−βE(X)], where the energy scale is assumed to be such that the energy is
always positive. However, as the number of configurations with a particular energy
increases exponentially with energy, most of the randomly constructed states have
very high energy. Hence, at finite temperature, they will be accepted with a vanish-
ingly small probability, and we spend most of our time generating configurations
that are then rejected, which is obviously very inefficient.
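To see the problem quantitatively, here is a minimal sketch of this rejection scheme; the harmonic single-particle energy and the parameter values are purely illustrative choices, not the book's example:

import numpy as np

rng = np.random.default_rng(0)

def energy(x):
    """Toy positive energy: independent harmonic wells, E(X) = sum_i x_i^2."""
    return np.sum(x**2)

beta, L, N = 2.0, 5.0, 100      # inverse temperature, box size, particle number
n_trials, accepted = 10_000, 0
for _ in range(n_trials):
    x = rng.uniform(0.0, L, size=N)          # random configuration
    if rng.random() < np.exp(-beta * energy(x)):  # accept with Boltzmann weight
        accepted += 1
print(f"acceptance rate: {accepted / n_trials:.2e}")

For this toy system, a typical random configuration has an energy of order N L^2, so its Boltzmann factor underflows to zero and essentially no configuration is ever accepted.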
Another method would be to construct statistically independent configurations
with a bias towards lower energies in accordance with the Boltzmann weight.
However, recipes for such a construction are difficult and will certainly be very
time-consuming. Related to this method is the Metropolis method [10], developed
in 1953, which abandons the idea of constructing statistically independent config-
urations – rather, the configurations are constructed through a so-called Markov
chain, in which each new configuration is generated with a probability distribution
depending on the previous configuration.
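As a preview of what such a chain looks like in practice, here is a minimal sketch of a Metropolis-style Markov chain; the single-particle displacement move, the step size, and the harmonic test energy are illustrative assumptions rather than the algorithm as developed later in the chapter:

import numpy as np

rng = np.random.default_rng(1)

def metropolis_chain(energy, x0, beta, n_steps, step_size=0.1):
    """Markov chain whose stationary distribution is proportional to exp(-beta*E)."""
    x = np.array(x0, dtype=float)
    e = energy(x)
    chain = []
    for _ in range(n_steps):
        # Propose a new configuration by displacing one randomly chosen particle
        trial = x.copy()
        i = rng.integers(len(x))
        trial[i] += rng.uniform(-step_size, step_size)
        e_trial = energy(trial)
        # Metropolis rule: accept with probability min(1, exp(-beta * (E' - E)))
        if rng.random() < np.exp(-beta * (e_trial - e)):
            x, e = trial, e_trial
        chain.append(x.copy())   # a rejected move repeats the old configuration
    return chain

# Example: 10 particles in harmonic wells, E(X) = sum_i x_i^2
chain = metropolis_chain(lambda x: np.sum(x**2), x0=np.zeros(10),
                         beta=2.0, n_steps=10_000)

Note that each new configuration depends only on the current one, through the proposed displacement and the acceptance test; this is precisely the Markov property discussed next.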
Before considering Markov chains, we consider truly random, or uncorrelated,
chains. For an uncorrelated chain, the probability of occurrence of a particular
sequence of N objects X_1, ..., X_N factorizes:
P_N(X_1, X_2, ..., X_N) = P_1(X_1) P_1(X_2) · · · P_1(X_N) (10.9)
where P_1(X) is the independent probability of occurrence for the object X (this
probability is assumed to be equal for each step). Truly random number sequences
are examples of uncorrelated chains (see Appendix B). A Markov chain is different