Biological Physics: Energy, Information, Life



Biological question: If energy is always conserved, how can some devices be more efficient than
others?
Physical idea: Order controls when energy can do useful work, and it's not conserved.


6.1 How to measure disorder


Chapter 1 was a little vague about the precise meaning of “disorder.” We need to refine our ideas
before they become sharp tools.
Flip a coin a thousand times. You get a random sequence httthtththhhthh.... We will say
that this sequence contains lots of disorder, in the following sense: It's impossible to summarize a
random sequence. If you want to store it on your computer, you need 1000 bits of hard disk space.
You can't compress it; every bit is independent of every other.
Now let's consider the weather, rain/shine. You can take a thousand days of weather and
write it as a bit stream rsssrssssrrrsrr.... But this stream is less disordered than the coin-flip
sequence. That's because today's weather is more likely to be like yesterday's than different. We
could change our coding and let "0" = same as yesterday, "1" = different from yesterday. Then our
bit stream is 10011000100110..., and it's not perfectly unpredictable: It has more 0's than 1's. We
could compress it by instead reporting the length of each run of similar weather.
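To make the compression idea concrete, here is a minimal Python sketch. The persistence model (each day repeats the previous one with probability 0.8) and the function names are made up for illustration; they are not part of the text's argument, only a way to see why the recoded stream is compressible.

import random

def same_different_code(stream):
    """Recode a stream as '0' = same as yesterday, '1' = different from yesterday."""
    return "".join("0" if today == yesterday else "1"
                   for yesterday, today in zip(stream, stream[1:]))

def run_lengths(stream):
    """Lengths of consecutive runs of identical symbols."""
    runs = [1]
    for yesterday, today in zip(stream, stream[1:]):
        if today == yesterday:
            runs[-1] += 1
        else:
            runs.append(1)
    return runs

# Made-up persistence model: each day repeats the previous one with probability 0.8.
random.seed(0)
weather = ["r"]
for _ in range(999):
    stays_same = random.random() < 0.8
    weather.append(weather[-1] if stays_same else ("s" if weather[-1] == "r" else "r"))
weather = "".join(weather)

coded = same_different_code(weather)
print("fraction of 1's:", coded.count("1") / len(coded))  # well below 1/2
print("number of runs :", len(run_lengths(weather)))      # far fewer than 1000 days

Because the recoded stream has far more 0's than 1's, and far fewer runs than days, it can be stored in many fewer than 1000 bits; a fair-coin stream cannot.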
Here is another point of view: You could make money betting even odds on the weather every
day, because you have some a priori knowledge about this sequence. You won’t make money
betting even odds on a coin flip, because you have no such prior knowledge. The extra knowledge
you have about the weather means that any actual string of weather reports is less disordered than
a corresponding string of coin flips. Again: The disorder in a sequence reflects its predictability.
High predictability is low disorder.
We still need to propose a quantitative measure of disorder. In particular we'd like our measure
to have the property that the amount of disorder in two uncorrelated streams is just the sum of that
in each stream separately. It’s crucial to have the word “uncorrelated” in the preceding sentence.
If you flip a penny a thousand times, and flip a dime a thousand times, those are two uncorrelated
streams. If you watch the news and read the newspaper, those are two correlated streams; one can
be used to partially predict the other, so the total disorder is less than the sum of those for the two
streams.
Suppose we have a very long stream of events (for example, coin flips), and each event is drawn
randomly, independently, and with equal probability from a list of M possibilities (e.g. M = 2 for
a coin, or 6 for rolling a die). We divide our long stream into "messages" consisting of N events.
We are going to explore the proposal that a good measure for the amount of disorder per message
is I ≡ N log₂ M, or equivalently KN ln M, where K = 1/ln 2.
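As a quick numerical check of the proposed formula, the sketch below (the helper name disorder_bits is invented here, not part of the text) evaluates I ≡ N log₂ M both directly and via KN ln M:

import math

def disorder_bits(N, M):
    """I = N * log2(M): disorder, in bits, of a message of N independent,
    equally likely events drawn from M possibilities."""
    return N * math.log2(M)

K = 1 / math.log(2)                      # K = 1/ln 2, so K*ln(M) = log2(M)
print(disorder_bits(1000, 2))            # 1000.0 bits for 1000 coin flips
print(disorder_bits(1000, 6))            # about 2585 bits for 1000 die rolls
print(K * 1000 * math.log(6))            # same value, computed as K*N*ln(M)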
It’s tempting to glaze over at the sight of that logarithm, regarding it as just a button on your
calculator. But there's a simple and much better way to see what the formula means: Taking the
case M = 2 (coin flip) shows that, in this special case, I is just the number of tosses. More generally
we can regard I as the number of binary digits, or bits, needed to transmit the message. That is,
I is the number of digits needed to express the message as a big binary number.
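The "big binary number" reading can be illustrated directly: pack a message of N die rolls into a single base-M integer and count its binary digits. The roll values below are made up for the example.

import math

# A hypothetical message of N = 10 rolls of an M = 6 sided die (faces 0..5).
rolls = [3, 0, 5, 2, 1, 4, 4, 0, 2, 5]
N, M = len(rolls), 6

big_number = 0
for r in rolls:
    big_number = big_number * M + r      # read the whole message as one base-6 number

print(big_number.bit_length())           # binary digits actually used (here 25)
print(math.ceil(N * math.log2(M)))       # bits the formula budgets: ceil of 25.8... = 26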
Our proposal has the trivial property that 2N coin tosses give a message with twice as much
disorder as N tosses. What's more, suppose we toss a coin and roll a die N times. Then M =
2 × 6 = 12 and I = KN ln 12 = KN(ln 2 + ln 6), by the property of logarithms. That makes sense:
We could have reorganized each message as N coin flips followed by N rolls, and we wanted our
measure of disorder to be additive for uncorrelated streams.
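A one-line numerical check of that additivity, again just a sketch using the formula above:

import math

N = 1000
combined = N * math.log2(12)                      # combined coin-and-die events, M = 2*6
separate = N * math.log2(2) + N * math.log2(6)    # the two uncorrelated streams counted separately
print(combined, separate)                          # equal, since ln 12 = ln 2 + ln 6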
