New Scientist - USA (2019-12-07)


“Fundamentally, nothing is in equilibrium,”
he says. “The universe is certainly not.
In fact, almost every process in the universe
we care about relies on the universe being
out of equilibrium.”
Deffner reckons this undermines the
argument that the arrow of time comes from
entropy increasing. The two are equivalent, he
suggests: maybe we only see time flow because
things move inexorably towards equilibrium,
which is a process that increases entropy. “The
increase of entropy is just a mathematically
convenient reformulation of the universally
observed arrow of time,” Deffner says.
The prevalence of this sort of circular
reasoning is one reason that Aguirre is excited
about observational entropy, which doesn’t
make assumptions about equilibrium. “There
hasn’t been a quantum version of Boltzmann
entropy until we did this work, but we now
have a description of what the entropy of
the universe looks like. It goes up, too, so
that’s a good step towards thinking about
issues such as the arrow of time.”
Zurek points to practical benefits, too – not
least that quantum entropy will help us better
understand and exploit the properties of
quantum machines such as nanoscale
sensors and quantum computers. “This is an
emerging field that is of great importance to
nanotechnology and quantum information
processing,” he says. And if information really is
a resource to be treated like heat or mechanical
work, the insight might even give rise to an
array of technologies as revolutionary as those
that seeded the first industrial revolution.
“Maybe quantum [entropy] can do for us what
steam did for the Victorians,” says Deffner. ❚

University of Oxford. “The arrow of time is
one of them, but both the origin of life and
the expansion of the universe have also been
mentioned in the literature.”
The connection to life might seem odd. But
scientists have long puzzled over whether the
cellular mechanisms inside living organisms
can be seen as exploiting entropy. In recent
years, it has even been proposed that life might
have its origins in increasing entropy. The idea
is that the tendency of atoms to structure
themselves in a way that increases entropy
inevitably produces complex structures,
including living things. It is a speculative idea,
but a clearer picture of entropy’s true nature
may help put it to the test.
An equally thorny issue is the arrow of time.
The fact that time moves forwards, not
backwards, is reflected in the irreversibility
of certain actions – you can’t unscramble
an egg or unspill a cup of coffee. We often think
of this as the cast-iron rule, enshrined in the
second law, that entropy always has to increase.
The reasoning seems simple: there are more
ways to arrange identical molecules in a
scrambled egg than in the neat, ordered
situation where the yolk sits within the
albumen. But such a conclusion involves
questionable assumptions, says Safranek:
“In certain situations, it’s not clear which
state should be considered more ordered.”
Deffner agrees. People often assume that
disordered systems have more states, but that
isn’t necessarily true, he says. “You can easily
construct examples where the number of
possible states increases – increasing the
Boltzmann entropy – but the states come
in a very ordered and structured manner.”
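The counting behind the egg argument can be made concrete in a toy model, here a sketch with made-up numbers rather than anything from the researchers’ papers: 100 identical molecules that can each sit in one of two regions. Boltzmann entropy, in units of Boltzmann’s constant, is the logarithm of the number of arrangements consistent with what you see.

```python
from math import comb, log

# Toy model of the egg argument: 100 identical molecules, each of
# which can sit in one of two regions ("yolk side" or "white side").
N = 100

def microstates(n):
    """Number of arrangements with n of the N molecules in region one."""
    return comb(N, n)

# Boltzmann entropy in units of Boltzmann's constant: S = ln W
ordered = microstates(0)     # all molecules on one side: 1 arrangement
scrambled = microstates(50)  # evenly mixed: the most arrangements

print(ordered, log(ordered))      # 1 microstate, S = 0
print(scrambled, log(scrambled))  # ~1.0e29 microstates, S ≈ 66.8
```

The mixed state wins by 29 orders of magnitude even for 100 molecules, which is why eggs scramble and never unscramble. Deffner’s caveat still applies: nothing in this count says the many microstates must look disordered.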
What’s more, Aguirre says that many
attempts to apply entropy on a cosmological
scale are questionable because entropy as
currently defined applies only to systems at
or near equilibrium, where they have settled
into an unchanging configuration.

Michael Brooks is a consultant for New Scientist.
His latest book is The Quantum Astrologer’s Handbook

observer chooses to perform a sequence of
measurements. “It’s not something that has a
fixed, objective value prior to those
measurements,” says Safranek.
This, he explains, is because in quantum
mechanics, the properties of any object or
system are undefined until they are measured.
What’s more, the Heisenberg uncertainty
principle says that measuring one property
changes other, unmeasured properties – so the
order in which you make measurements will
affect the observational entropy in a system.
This is a serious re-casting of how we think
about entropy, but it still connects with the
classical concept, where the outcomes of
measurements are linked to probability
and possible configurations of the system.
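That link to probabilities and configurations can be sketched in a stripped-down classical toy, with made-up numbers; the actual proposal is quantum, built from density matrices and measurement projectors, but the same formula applies: each measurement outcome lumps together some number of microstates, and observational entropy sums the outcome probabilities weighted by how much each outcome leaves unresolved.

```python
from math import log

def observational_entropy(p_micro, macrostates):
    """S_obs = sum_i p_i * (ln V_i - ln p_i): each measurement outcome i
    lumps together V_i microstates and occurs with probability p_i."""
    s = 0.0
    for group in macrostates:
        V = len(group)                      # microstates per outcome
        p = sum(p_micro[j] for j in group)  # probability of that outcome
        if p > 0:
            s += p * (log(V) - log(p))
    return s

p = [0.7, 0.1, 0.1, 0.1]  # made-up probabilities of four microstates

# A perfect measurement resolves every microstate; a blurry one
# lumps them into pairs. The blurrier observer sees more entropy.
fine = observational_entropy(p, [[0], [1], [2], [3]])
coarse = observational_entropy(p, [[0, 1], [2, 3]])
print(fine, coarse)  # ≈ 0.940 and ≈ 1.194 (capped by ln 4 ≈ 1.386)
```

With a perfect measurement this reduces to the familiar Gibbs-Shannon entropy; coarser measurements push it up towards the maximum, which is the sense in which the entropy depends on what the observer chooses to measure.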
It is early days for these ideas, and there is
much to work out. Nonetheless, the physicists
behind them hope that redefining entropy in
quantum terms can put our understanding of it
on firmer ground. You might wonder what there
is to gain. After all, no one is saying that the tried
and trusted second law of thermodynamics
no longer applies. But Aguirre is enthusiastic
about what this redefinition could mean.
“I believe that it will have significant payoff,”
he says, and he isn’t alone.
“The main hope is that quantum
thermodynamics might shed some new light
on the old problems,” says Vlatko Vedral at the


Conceived to improve steam engines, entropy is thought to explain why time moves forward

Images: Maarten Wouters/Getty Images; Neil Harvey/Getty Images