Biological Physics: Energy, Information, Life



Track 2


6.1′


  1. Communications engineers are also interested in the compressibility of streams of data. They refer to the quantity I as the "information content" per message. This definition has the unintuitive feature that random messages carry the most information! This book will instead use the word "disorder" for I; the word "information" will only be used in its everyday sense. (A short compression experiment illustrating this point appears after these remarks.)

  2. Here is another, more elegant, proof that uniform probability gives maximum disorder. We'll repeat the previous derivation, this time using the method of Lagrange multipliers. This trick proves indispensable in more complicated situations. (For more about this method, see for example Shankar, 1995.) We introduce a new parameter α (the "Lagrange multiplier") and add a new term to I. The new term is α times the constraint we wish to enforce (that all the P_i add up to 1). Finally we extremize the modified I over all the P_i independently, and over α:


$$
0 = \frac{d}{dP_j}\left[-\frac{I}{NK} - \alpha\Big(1 - \sum_{i=1}^{M} P_i\Big)\right]
\qquad\text{and}\qquad
0 = \frac{d}{d\alpha}\left[-\frac{I}{NK} - \alpha\Big(1 - \sum_{i=1}^{M} P_i\Big)\right],
$$

or, substituting Shannon's formula for I,

$$
0 = \frac{d}{dP_j}\left[\sum_{i=1}^{M} P_i \ln P_i - \alpha\Big(1 - \sum_{i=1}^{M} P_i\Big)\right]
\qquad\text{and}\qquad
1 = \sum_{i=1}^{M} P_i .
$$

Minimizing as before, 0 = ln P_j + 1 + α; once again we conclude that all the P_j are equal. (A short symbolic check of this calculation appears after these remarks.)
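To supplement remark 2, here is a minimal computer-algebra sketch of the same Lagrange-multiplier argument. It is an illustration added to this note, not part of the original derivation; it assumes Python with the sympy package and picks the arbitrary value M = 3. The functional F below is the modified quantity just extremized, Σ_i P_i ln P_i − α(1 − Σ_i P_i).

```python
# Sketch: reproduce the Lagrange-multiplier argument symbolically
# (M = 3 chosen arbitrarily for illustration).
import sympy as sp

M = 3
P = sp.symbols(f"P1:{M + 1}", positive=True)   # the symbols P1, P2, P3
alpha = sp.Symbol("alpha", real=True)

# The modified functional: sum_i P_i ln P_i  -  alpha * (1 - sum_i P_i).
F = sum(p * sp.log(p) for p in P) - alpha * (1 - sum(P))

# Stationarity with respect to one of the P_j: the condition ln P_j + 1 + alpha = 0.
dF = sp.diff(F, P[0])
print(dF)                                       # log(P1) + 1 + alpha

# Every P_j must therefore take the same value, exp(-1 - alpha) ...
P_star = sp.solve(sp.Eq(dF, 0), P[0])[0]
print(P_star)                                   # exp(-alpha - 1)

# ... and the constraint sum_i P_i = 1 then fixes alpha, hence each P_j.
alpha_star = sp.solve(sp.Eq(M * P_star, 1), alpha)[0]
print(alpha_star, sp.simplify(P_star.subs(alpha, alpha_star)))   # log(3) - 1, 1/3
```

The last line reproduces the conclusion of the remark: the constrained extremum is the uniform distribution, P_j = 1/M.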
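Remark 1's observation, that a stream of random symbols carries the most engineering "information" and is therefore the least compressible, can be seen in a quick numerical experiment. The sketch below is an added illustration, not from the original text; it uses Python's general-purpose zlib compressor as a crude stand-in for an ideal encoder.

```python
# Sketch: a random byte stream barely compresses, while a highly ordered one
# shrinks dramatically -- the "information content" per symbol is largest
# for the random stream.
import os
import zlib

N = 100_000                        # length of each test stream, in bytes
streams = {
    "random":  os.urandom(N),      # independent, (nearly) uniformly random bytes
    "ordered": b"A" * N,           # a maximally repetitive, low-disorder stream
}

for name, data in streams.items():
    compressed = zlib.compress(data, level=9)
    print(f"{name:8s} {N} bytes -> {len(compressed)} bytes after compression")
# Typical output: the random stream stays close to its original size, while the
# ordered stream compresses to well under 1% of it.
```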


6.2.1′



  1. Why do we need the Statistical Postulate? Most people would agree at first that a single helium atom, miles away from anything else, shielded from external radiation, is not a statistical system. For instance, the isolated atom has definite energy levels. Or does it? If we put the atom into an excited state, it decays at a randomly chosen time. One way to understand this phenomenon is to say that even an isolated atom interacts with ever-present weak, random quantum fluctuations of the vacuum. No physical system can ever be totally disconnected from the rest of the world. We don't usually think of this effect as making the atom a statistical system, simply because the energy levels of the atom are so widely spaced compared to the energies of vacuum fluctuations. Similarly, a billion helium atoms, each separated from its neighbor by a meter, will also have widely spaced energy levels. But if those billion atoms condense into a droplet of liquid helium, then the energy levels get split, typically into sublevels a billion times closer in energy than the original one-atom levels. Suddenly the system becomes much more susceptible to its environment.
     With macroscopic samples, in the range of N_mole atoms, this environmental susceptibility becomes even more extreme. If we suspend a gram of liquid helium in a thermally insulating flask, we may well manage to keep it "thermally isolated" in the sense that it won't vaporize for a long time. But we can never isolate it from random environmental influences sufficient to change its substate. Thus, determining the detailed evolution of the microstate from first principles is hopeless. This emergent property is a key difference between bulk matter and single atoms. We therefore need
     a new principle to get some predictive power for bulk samples of matter. We propose to use the Statistical Postulate.
