New Scientist - USA (2019-12-07)



Driver of disorder

Entropy's inexorable push towards chaos seems to give structure to the universe, says Michael Brooks. So why can nobody agree on what it is?


ALL the King's horses and all the King's
men couldn’t put Humpty together
again. Everyone knows the sorry tale
of Humpty Dumpty, but have you ever noticed
that the rhyme makes no mention of an egg?
In fact, the ill-fated protagonist only assumed
egg-man form when he met Alice in Lewis
Carroll’s Through the Looking Glass, after
which broken eggs became indelibly associated
with irreversible damage. So perhaps Carroll
deserves to shoulder a share of the blame for
scrambling our ideas about entropy.
Entropy is typically thought of as a measure
of disorder or randomness, and it is bound up
with thermodynamics – the branch of physics
that deals with heat and mechanical work.
Its propensity to increase forever has granted
it exalted status as the pithiest answer to some
deep questions, from what life is to how the
universe evolved and why time moves ever
forward like an arrow. And yet just like
Humpty, entropy gets messy as soon as
you crack its surface.
For a start, there is no single definition.
But even if we understand it broadly as a
measurement or quantity, our current
conception of entropy doesn’t work to describe
the things it purports to, not least the universe.
“It’s all very confusing,” says Anthony Aguirre
at the University of California, Santa Cruz.
Now, Aguirre and others are going back to
the drawing board in search of a universally
valid version of entropy anchored in our most
fundamental theory: quantum mechanics.
They hope to put our understanding of the

universe’s mystifying directionality on firmer
footing – or nudge it off a wall.
We might even be in for something akin to
the Copernican revolution, when we realised
that Earth orbits the sun, rather than the other
way around. “That changed the way we view
the universe,” says Wojciech Zurek at the Los
Alamos National Laboratory in New Mexico.
“From then on, one could make connections
between phenomena that previously seemed
unconnected. It’s the same with the new way
of looking at thermodynamics.”
It all started in Carroll’s day, during the
industrial revolution, when Victorian
engineers were desperately trying to figure
out why their coal-powered steam engines
were so inefficient. Entropy was essentially
a mathematical way to quantify heat that
wasn’t available for doing useful mechanical
work, such as driving a piston. In the 1860s,
Rudolf Clausius defined it in terms of the heat
energy you put into a system divided by the
temperature at which you add it.
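In the usual textbook notation (an editorial gloss; the symbols are not spelled out in the article), Clausius's definition can be written as

\[ \Delta S \;=\; \int \frac{\delta Q_{\mathrm{rev}}}{T}, \]

where \( \delta Q_{\mathrm{rev}} \) is the heat added reversibly and \( T \) is the absolute temperature at which it is added.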
Ludwig Boltzmann soon made it a bit more
precise. He knew that the mechanical work
done by a hot gas like steam came from the
motion of the molecules, but he also
recognised that it was impossible to calculate
how every individual atom or molecule in a
given system moves. So he suggested working
with probabilities. Thus Boltzmann defined
entropy in terms of the number of different
possible ways in which molecules in a closed
system could be arranged. The more possible
arrangements, the greater the entropy.
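To make that counting concrete, here is a minimal sketch (an illustrative toy, not taken from the article) that splits 20 gas molecules between the two halves of a box and applies Boltzmann's formula S = k_B ln W, where W is the number of possible arrangements. The entropy comes out highest for the even split, the arrangement that can happen in the most ways.

from math import comb, log

k_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def toy_entropy(n_total, n_left):
    # W counts the distinct ways of choosing which molecules
    # sit in the left half of the box: one "arrangement" each
    W = comb(n_total, n_left)
    # Boltzmann's formula: more arrangements, more entropy
    return k_B * log(W)

# Entropy peaks for the evenly spread, most-probable arrangement
for n_left in (0, 5, 10, 15, 20):
    print(n_left, toy_entropy(20, n_left))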
