Biological Physics: Energy, Information, Life

Key formulas


The disorder of a long, random message of $N$ letters drawn from an alphabet of $M$ letters is
$I = N \log_2 M$ bits (Equation 6.1).
For a very long message whose letter frequencies $P_j$ are known in advance, the disorder is
reduced to $I = -K N \sum_{j=1}^{M} P_j \ln P_j$ (Shannon’s formula, Equation 6.3). This quantity
is never negative. For example, if $P_1 = 1$ and all the other $P_j = 0$, then the information
per letter is zero: the message is predictable.
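
As a quick numerical illustration (not part of the text), here is a minimal Python sketch that
evaluates Equation 6.1 and Shannon’s formula (Equation 6.3) for a hypothetical four-letter
alphabet; the message length, alphabet size, and letter frequencies are made-up values, and
$K = 1/\ln 2$ so the answers come out in bits.

```python
import math

M = 4                    # alphabet size (hypothetical)
N = 1000                 # message length (hypothetical)
K = 1.0 / math.log(2.0)  # converts natural logs to bits

# Equation 6.1: disorder of a completely random message
I_random = N * math.log2(M)

# Equation 6.3: reduced disorder when the letter frequencies are known
P = [0.7, 0.1, 0.1, 0.1]  # hypothetical letter frequencies (sum to 1)
I_known = -K * N * sum(p * math.log(p) for p in P if p > 0.0)

print(f"random letters    : {I_random:.0f} bits")  # 2000 bits
print(f"known frequencies : {I_known:.0f} bits")   # about 1357 bits
print(f"P1 = 1, rest zero : "
      f"{-K * N * sum(p * math.log(p) for p in [1.0, 0.0, 0.0, 0.0] if p > 0.0):.0f} bits")
```
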
Suppose there are $\Omega(E)$ states available to a physical system with energy $E$. Once the
system has come to equilibrium, its entropy is defined as $S(E) = k_B \ln \Omega(E)$
(Equation 6.5).
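
A minimal Python sketch of Equation 6.5, applied to a toy system invented here for illustration:
$N$ two-state units of which exactly $n$ point “up,” so that $\Omega$ is just a binomial
coefficient.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(omega):
    """Equation 6.5: S = k_B ln(Omega)."""
    return k_B * math.log(omega)

# Toy system (invented for illustration): N two-state units, exactly n of them "up",
# so the number of allowed microstates is a binomial coefficient.
N, n = 100, 50
omega = math.comb(N, n)
print(f"Omega = {omega:.3e} microstates")
print(f"S     = {entropy(omega):.3e} J/K")
```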


  • Temperature: The temperature is defined as $T = \left( \frac{dS(E)}{dE} \right)^{-1}$
    (Equation 6.9). If this system is allowed to come to equilibrium in isolation from other
    systems, and then is brought into thermal contact with another system, then $T$ describes
    the “availability of energy” which the first system could give the second. If two systems
    have the same $T$, there will be no net exchange of energy (the “Zeroth Law” of
    thermodynamics).
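
To see Equation 6.9 at work numerically, the sketch below differentiates the energy dependence of
the ideal-gas entropy quoted below (Equation 6.6), $S = N k_B \ln(E^{3/2}V)$ plus constants, and
checks that the resulting temperature agrees with the familiar relation $E = \frac{3}{2} N k_B T$;
the particle number and energy are illustrative values.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N   = 6.022e23       # number of molecules (illustrative: roughly one mole)
V   = 1.0            # volume, m^3 (drops out of dS/dE anyway)

def S(E):
    """E- and V-dependent part of the ideal-gas entropy (Equation 6.6)."""
    return N * k_B * math.log(E**1.5 * V)

# Equation 6.9: T = (dS/dE)^(-1), estimated by a centered finite difference
E, dE = 3740.0, 1.0                      # energy in joules (illustrative)
T_numeric  = 1.0 / ((S(E + dE) - S(E - dE)) / (2.0 * dE))
T_expected = 2.0 * E / (3.0 * N * k_B)   # the familiar E = (3/2) N k_B T

print(f"T from Equation 6.9  : {T_numeric:.1f} K")
print(f"T from E = (3/2)NkBT : {T_expected:.1f} K")
```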


  • Pressure: Pressure in a closed subsystem can be defined as $p = T \frac{dS}{dV}$
    (Equation 6.15). $p$ can be thought of as the “unavailability of volume” from the
    subsystem, just as $T$ is the “availability of energy.”
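
Applying Equation 6.15 to the volume-dependent part of the same illustrative ideal-gas entropy,
$N k_B \ln V$, recovers the ideal gas law $p = N k_B T / V$; the sketch below checks this
numerically with placeholder values.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N   = 6.022e23       # number of molecules (illustrative)
T   = 300.0          # temperature, K (illustrative)
V   = 0.0248         # volume, m^3 (illustrative)

def S_of_V(vol):
    """V-dependent part of the ideal-gas entropy (from Equation 6.6)."""
    return N * k_B * math.log(vol)

# Equation 6.15: p = T dS/dV, estimated by a centered finite difference
dV = 1.0e-6
p_numeric  = T * (S_of_V(V + dV) - S_of_V(V - dV)) / (2.0 * dV)
p_expected = N * k_B * T / V   # ideal gas law

print(f"p from Equation 6.15 : {p_numeric:.0f} Pa")
print(f"p = N kB T / V       : {p_expected:.0f} Pa")
```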

  • Sackur–Tetrode: The entropy of a box of ideal gas of volume $V$, containing $N$ molecules
    with total energy $E$, is $S = N k_B \ln\!\left[ E^{3/2} V \right]$ (Equation 6.6), plus
    terms independent of $E$ and $V$.
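
Combining Equation 6.6 with the definitions of temperature (Equation 6.9) and pressure
(Equation 6.15) gives a quick consistency check; the short derivation below sketches that
calculation.

```latex
% Consistency check: Equation 6.6 combined with Equations 6.9 and 6.15.
\begin{align*}
  S &= N k_B \ln\!\bigl(E^{3/2} V\bigr) + \text{const},\\[2pt]
  \frac{\partial S}{\partial E} &= \frac{3 N k_B}{2E}
    \;\Longrightarrow\;
    T = \Bigl(\frac{\partial S}{\partial E}\Bigr)^{-1} = \frac{2E}{3 N k_B},
    \quad\text{that is, } E = \tfrac{3}{2} N k_B T,\\[2pt]
  \frac{\partial S}{\partial V} &= \frac{N k_B}{V}
    \;\Longrightarrow\;
    p = T\,\frac{\partial S}{\partial V} = \frac{N k_B T}{V},
    \quad\text{that is, } pV = N k_B T.
\end{align*}
```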


  • Statistical Postulate: When a big enough, isolated system, subject to some macroscopic
    constraints, is left alone long enough, it evolves to an equilibrium. This is not one particular
    microstate, but rather a probability distribution. The distribution chosen is the one with
    the greatest disorder (entropy), that is, the one acknowledging the greatest ignorance of the
    detailed microstate subject to any given constraints (Idea 6.4).
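
The “greatest disorder” criterion can be made concrete with a small numerical comparison (an
illustration added here, not taken from the text): among probability distributions over the same
four allowed states, the uniform one maximizes $-\sum_j p_j \ln p_j$.

```python
import math

def disorder(P):
    """Disorder per state, -sum(p ln p) (cf. Equation 6.3 with K = N = 1)."""
    return -sum(p * math.log(p) for p in P if p > 0.0)

# Three hypothetical distributions over the same four allowed states
candidates = {
    "uniform": [0.25, 0.25, 0.25, 0.25],
    "skewed" : [0.70, 0.10, 0.10, 0.10],
    "certain": [1.00, 0.00, 0.00, 0.00],
}
for name, P in candidates.items():
    print(f"{name:8s} disorder = {disorder(P):.3f}")
# The uniform distribution wins: it acknowledges the greatest ignorance (Idea 6.4).
```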

  • Second Law: Any sudden relaxation of internal constraints (for example, opening an internal
    door) will lead to a new distribution, corresponding to the maximum disorder among a bigger
    class of possibilities. Hence the new equilibrium state will have entropy at least as great as
    the old one (Idea 6.11).
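
One concrete instance of this statement, worked out here using Equation 6.6 (the scenario is an
added illustration): abruptly opening a partition so that an ideal gas doubles its available
volume at fixed $E$ and $N$ raises the entropy.

```latex
% Free expansion from V to 2V at fixed E and N, using Equation 6.6:
\Delta S \;=\; N k_B \ln\!\bigl(E^{3/2}\cdot 2V\bigr) \;-\; N k_B \ln\!\bigl(E^{3/2} V\bigr)
         \;=\; N k_B \ln 2 \;>\; 0.
```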

  • Efficiency: Free energy transduction is least efficient when it proceeds by the uncontrolled
    release of a big constraint. It’s most efficient when it proceeds by the incremental, controlled
    release of many small constraints (Idea 6.20).

  • Two-state: Suppose a subsystem has only two allowed states (“isomers”), differing in energy
    by $\Delta E$. Then the probabilities to be in each of the two states are (Equation 6.25)

    $$P_1 = \frac{1}{1 + e^{-\Delta E / k_B T}}, \qquad P_2 = \frac{1}{1 + e^{\Delta E / k_B T}}.$$

    Suppose there is an energy barrier $E^{\ddagger}$ between the two states. The probability per
    time $k_+$ that the subsystem will hop to the lower state, if we know it’s initially in the
    upper state, is proportional to $e^{-E^{\ddagger} / k_B T}$; the probability per time $k_-$
    that it will hop to the higher state, if we know it’s initially in the lower state, is
    proportional to $e^{-(\Delta E + E^{\ddagger}) / k_B T}$. For a complex but effectively
    two-state system, analogous formulæ hold with $\Delta F$ or $\Delta G$ in place of $\Delta E$
    (Equation 6.35).
    If we prepare a collection of molecules in the two isomeric forms with populations $N_{1,2}$
    divided in any way other than the equilibrium distribution $N_{1,2}^{\mathrm{eq}}$, then the
    approach to equilibrium is exponential: $N_{1,2}(t) = N_{1,2}^{\mathrm{eq}} \pm A\, e^{-(k_+ + k_-) t}$
    (Equation 6.30). Here $A$ is a constant set by the initial conditions.
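
To tie these two-state formulas together, here is a short Python sketch with made-up numbers: it
evaluates Equation 6.25, builds hopping rates of the form quoted above (the attempt frequency
`nu` is an arbitrary assumed prefactor, taken equal for both directions), verifies that
$k_+/k_-$ matches $P_1/P_2$, and then follows the exponential relaxation of Equation 6.30 from an
all-state-1 starting population.

```python
import math

k_B_T = 4.1e-21          # thermal energy at room temperature, J (about 4.1 pN nm)
dE    = 2.0 * k_B_T      # energy difference between the two isomers (illustrative)
E_dag = 5.0 * k_B_T      # barrier height above the upper state (illustrative)
nu    = 1.0e9            # attempt frequency, 1/s (an arbitrary assumed prefactor)

# Equation 6.25: equilibrium occupation probabilities
P1 = 1.0 / (1.0 + math.exp(-dE / k_B_T))   # lower-energy state
P2 = 1.0 / (1.0 + math.exp(+dE / k_B_T))   # higher-energy state

# Hopping rates of the form quoted above (same assumed prefactor for both)
k_plus  = nu * math.exp(-E_dag / k_B_T)            # upper -> lower
k_minus = nu * math.exp(-(dE + E_dag) / k_B_T)     # lower -> upper

print(f"P1 = {P1:.4f}, P2 = {P2:.4f}, P1 + P2 = {P1 + P2:.4f}")
print(f"k+/k- = {k_plus / k_minus:.3f}  vs  P1/P2 = {P1 / P2:.3f}")

# Equation 6.30: exponential relaxation, starting with every molecule in state 1
N_tot = 1000.0
N1_eq = N_tot * P1
A     = N_tot - N1_eq
for t in (0.0, 1.0e-7, 3.0e-7, 1.0e-6):     # seconds, comparable to 1/(k+ + k-)
    N1 = N1_eq + A * math.exp(-(k_plus + k_minus) * t)
    print(f"t = {t:.1e} s : N1 = {N1:.1f}")
```

The output shows detailed balance at work: the ratio of the two rates equals the equilibrium
population ratio $e^{\Delta E / k_B T}$.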