
Equation 6.7 appears to involve logarithms of dimensional quantities, which aren’t defined (see Section 1.4.1). Actually, formulas like this one are abbreviations: the term $\ln E_A$ can be thought of as short for $\ln\big(E_A/(1\,\mathrm{J})\big)$. The choice of unit is immaterial; different choices just change the value of the constant in Equation 6.7, which wasn’t specified anyway.
We can now ask, “What is the most likely value of $E_A$?” At first it may seem that the Statistical Postulate says that all values are equally probable. But wait. The Postulate says that, just before we shut the door between the subsystems, all microstates of the joint system are equally probable. But there are many microstates of the joint system with any given value of $E_A$, and the number depends on $E_A$ itself. In fact, exponentiating the entropy gives this number (see Equation 6.5). So drawing a microstate of the joint system at random, we are most likely to come up with one whose $E_A$ corresponds to the maximum of the total entropy. To find this maximum, set the derivative of Equation 6.7 to zero:


$$0 = \frac{dS_{\mathrm{tot}}}{dE_A} = \frac{3}{2}\,k_B\left(\frac{N_A}{E_A} - \frac{N_B}{E_B}\right). \qquad (6.8)$$

In other words, the systems are most likely to divide their thermal energy in such a way that each has the same average energy per molecule: $E_A/N_A = E_B/N_B$.
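A minimal numerical sketch makes this concrete. Assuming the ideal-gas form $S = \frac{3}{2}Nk_B\ln E$ (up to an additive constant, as in Equation 6.5), with $k_B$ set to 1 and arbitrary illustrative values for $N_A$, $N_B$, and $E_{\mathrm{tot}}$, maximizing $S_{\mathrm{tot}}(E_A)$ on a grid lands on the division with equal energy per molecule:

import numpy as np

# Ideal-gas entropy up to an additive constant, with kB = 1:
# S(N, E) = (3/2) N ln E   (see Equation 6.5)
def entropy(N, E):
    return 1.5 * N * np.log(E)

N_A, N_B = 1000, 3000          # illustrative molecule counts
E_tot = 8.0                    # total energy, arbitrary units

# Scan every division of the energy and locate the maximum of
# S_tot(E_A) = S_A(E_A) + S_B(E_tot - E_A).
E_A = np.linspace(0.01, E_tot - 0.01, 100_000)
S_tot = entropy(N_A, E_A) + entropy(N_B, E_tot - E_A)
E_A_star = E_A[np.argmax(S_tot)]

print(E_A_star)                    # ~2.0, within grid resolution
print(E_tot * N_A / (N_A + N_B))   # 2.0: the equal E/N division of Eq. 6.8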
This is a very familiar conclusion. Section 3.2.1 argued that in an ideal gas, the average energy per molecule is $\frac{3}{2}k_B T$ (Idea 3.21 on page 74). So we have just concluded that two boxes of gas in thermal equilibrium are most likely to divide their energy in a way that equalizes their temperature. Successfully recovering this well-known fact of everyday life gives us some confidence that the Statistical Postulate is on the right track.
How likely is “most likely?” To simplify the math, suppose $N_A = N_B$, so that equal temperature corresponds to $E_A = E_B = \frac{1}{2}E_{\mathrm{tot}}$. Figure 6.2 shows the entropy maximum, and the probability distribution $P(E_A)$ to find “A” with a given energy after we shut the door. The graph makes it clear that even for just a few thousand molecules on each side, the system is quite likely to be found very close to its equal-temperature point, because the peak in the probability distribution function is very narrow. That is, the observed statistical fluctuations about the most probable energy distribution will be small (see Section 4.4.3 on page 118). For a macroscopic system, where $N_A \approx N_B \approx 10^{23}$, the two subsystems will be overwhelmingly likely to share their energy in a way corresponding to nearly exactly equal temperatures.
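“Very narrow” can be made quantitative with a small sketch in the same spirit as before (again $k_B = 1$, $N_A = N_B = N$ a few thousand, and an arbitrary $E_{\mathrm{tot}}$): compute $P(E_A) \propto e^{S_{\mathrm{tot}}(E_A)/k_B}$ on a grid and measure the relative width of the peak, which comes out proportional to $1/\sqrt{N}$:

import numpy as np

N = 3000                       # molecules on each side (N_A = N_B = N)
E_tot = 2.0                    # total energy, arbitrary units; kB = 1

# P(E_A) is proportional to exp(S_tot/kB). Subtract the peak value
# of S_tot before exponentiating to avoid numerical overflow.
E_A = np.linspace(0.001, E_tot - 0.001, 200_001)
S_tot = 1.5 * N * (np.log(E_A) + np.log(E_tot - E_A))
P = np.exp(S_tot - S_tot.max())
P /= P.sum()                   # normalize on the grid

mean = (E_A * P).sum()
std = np.sqrt(((E_A - mean) ** 2 * P).sum())
print(mean / E_tot)            # 0.5: the equal-temperature point
print(std / E_tot)             # ~0.005, i.e. 1/sqrt(12 N): a very narrow peak

Rerunning with larger $N$ shrinks the relative width as $1/\sqrt{N}$, which is why for $N \approx 10^{23}$ the fluctuations are utterly negligible.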


6.3.2 Temperature is a statistical property of a system in equilibrium


The fact that two systems in thermal contact come to the same temperature is not limited to ideal gases! Indeed, the early thermodynamicists found this property of heat to be so significant that they named it the Zeroth Law of thermodynamics. Suppose we put any two macroscopic objects into thermal contact. Their entropy functions won’t be the simple one we found for an ideal gas. We do know, however, that the total entropy $S_{\mathrm{tot}}$ will have a big, sharp maximum at one value of $E_A$, since it’s the sum of a very rapidly increasing function of $E_A$ (namely $S_A(E_A)$) plus a very rapidly decreasing function¹ (namely $S_B(E_{\mathrm{tot}} - E_A)$), as shown in Figure 6.2. The maximum occurs when $dS_{\mathrm{tot}}/dE_A = 0$.
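A sketch with two made-up concave-down entropy functions (chosen only for illustration; any increasing, concave-down pair would do) shows the same structure: the maximum of $S_{\mathrm{tot}}$ falls where the two derivatives $dS_A/dE_A$ and $dS_B/dE_B$ agree, which is exactly the quantity the next paragraph elevates to a definition of temperature.

import numpy as np

# Two hypothetical entropy functions, both increasing and concave-down
# (kB = 1); neither has the ideal-gas form.
def S_A(E):
    return 50.0 * np.sqrt(E)   # dS_A/dE = 25/sqrt(E)

def S_B(E):
    return 80.0 * np.log(E)    # dS_B/dE = 80/E

E_tot = 10.0
E_A = np.linspace(0.01, E_tot - 0.01, 100_000)
S_tot = S_A(E_A) + S_B(E_tot - E_A)
E_star = E_A[np.argmax(S_tot)]

# At the maximum, dS_tot/dE_A = 0, so the two derivatives agree:
print(25.0 / np.sqrt(E_star))    # dS_A/dE_A at the maximum
print(80.0 / (E_tot - E_star))   # dS_B/dE_B there: same value (~12.9)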
The previous paragraph suggests that we define temperature abstractly as the quantity that comes to equal values when two subsystems exchanging energy come to equilibrium. To implement


¹This argument also assumes that both of these functions are concave-down, or $d^2S/dE^2 < 0$. This is certainly true in our ideal gas example, and according to Equation 6.9 below it expresses the fact that putting more energy into a (normal) system raises its temperature.
