New Scientist - USA (2019-12-07)

inside the system increases it elsewhere.
The second law emerged unscathed again,
but entropy changed. Bennett’s insight
revealed that it isn’t just about heat, or the
numbers of ways molecules can be arranged,
or work. Deep down, entropy seems to be
about information. This has some intriguing
implications for how information might
ultimately find use as a fuel (see “Running
on facts”, left). It has also raised new questions
about how information relates to the second
law and the big-picture processes of the
universe – questions that have forced
physicists to revisit their understanding
of entropy yet again.
A revision is long overdue, according to
Zurek. He has always been suspicious about
Boltzmann’s framing. The consideration of all
possible states was, Zurek says, “an inspired
ruse”: although it has been useful, there is no
real-world justification for it. When dealing
with finite systems such as an engine or a
chemical reaction, he reckons it makes no
sense to frame things in terms of the infinite
possible ways you can arrange molecules.
For Zurek, this is nothing short of a “fudge”
that has lulled us into a false sense that we
understand the behaviour of physical systems.
He suspects the reason Boltzmann’s statistical
tricks worked was because what we call entropy
is secretly something to do with quantum
physics. The quantum world is probabilistic,
with properties definable only in the statistical
terms that Boltzmann stumbled on. Hence
the idea that there might be something in
this most fundamental theory that gives rise
to Boltzmann’s version of entropy.
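Boltzmann's "inspired ruse" of counting arrangements can be made concrete with a toy calculation (my illustration, not one from the article): split N gas molecules between two halves of a box and count the arrangements W for each macrostate, with entropy S = k_B ln W.

```python
import math

# Toy illustration of Boltzmann's counting, S = k_B * ln(W):
# N molecules split between two halves of a box. The number of
# arrangements W with n molecules on the left is the binomial
# coefficient C(N, n).

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(N, n):
    """Entropy of the macrostate 'n of N molecules on the left half'."""
    W = math.comb(N, n)        # number of microstates (arrangements)
    return K_B * math.log(W)

N = 100
# The evenly mixed macrostate has vastly more arrangements than an
# ordered one, which is why the second law favours it so overwhelmingly.
for n in (0, 10, 50):
    print(n, boltzmann_entropy(N, n))
```

The ordered state (n = 0) has exactly one arrangement and zero entropy; the mixed state (n = 50) has about 10^29 arrangements, which is the statistical weight Zurek finds suspicious when extended to infinitely many possibilities.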

Quantum roots
And so Zurek has set out to reframe our
current, information-based conception
in terms of quantum physics. His scheme
centres on quantum entanglement, where
physically distinct systems have shared
properties that mean a measurement on
one can affect the outcome of a subsequent
measurement on the other.
Last year, he showed that it is possible
to derive thermodynamics by considering
quantum systems that are entangled with
their environment. Essentially, that means
a system’s entanglement determines the
amount and the nature of the available
information about its state, which gives
a measure of its entropy. It is a significant
step: rooting information and entropy
in quantum mechanics not only gives
new depth to our understanding of

how physical systems behave and interact,
but also promises to reinstate entropy as
a real measurable quantity.
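The link between entanglement and a subsystem's entropy can be seen in a standard textbook calculation (a sketch of the general idea, not Zurek's derivation itself): for a maximally entangled Bell pair, tracing out one particle leaves a maximally mixed state whose von Neumann entropy, S = -Tr(ρ ln ρ), is ln 2, exactly one bit.

```python
import numpy as np

# For an entangled Bell pair, the information available about one
# particle alone is fixed by the entanglement: its reduced state is
# maximally mixed, carrying one full bit of entropy.

bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
rho = np.outer(bell, bell.conj())                   # density matrix of the pair

# Partial trace over the second qubit -> reduced state of the first
rho4 = rho.reshape(2, 2, 2, 2)
rho_A = np.einsum('ijkj->ik', rho4)

# Von Neumann entropy S = -Tr(rho ln rho) from the eigenvalues
eigvals = np.linalg.eigvalsh(rho_A)
S = -sum(p * np.log(p) for p in eigvals if p > 1e-12)
print(S, np.log(2))  # equal: the subsystem's entropy is set by entanglement
```

Here more entanglement with the environment means less information available about the subsystem's state, and hence more entropy, which is the quantity Zurek's scheme promises to make measurable.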
Zurek is not the only one daring to ask hard
questions of the answer to almost everything.
Aguirre, together with his UC Santa Cruz
colleagues Dominik Safranek and Joshua
Deutsch, is also working on a new version,
again with information at its core. They call it
“observational entropy”, since it is designed to
take account of the amount of information
that can be gained when you perform a series
of measurements on a quantum system.
Intriguingly, the observational entropy of a
system will change depending on the way an
observer chooses to measure it.
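Šafránek, Deutsch and Aguirre's definition can be sketched numerically (a hedged reconstruction based on their published formula, not code from the article): observational entropy is S_obs = Σ_i p_i (ln V_i − ln p_i), where each measurement outcome i projects onto a subspace of dimension V_i and occurs with probability p_i. A coarser measurement extracts less information, so it assigns the same state more entropy.

```python
import numpy as np

# Sketch of observational entropy, S_obs = sum_i p_i * (ln V_i - ln p_i):
# p_i is the probability of measurement outcome i, V_i the dimension
# ("volume") of the subspace that outcome projects onto.

def observational_entropy(rho, projectors):
    S = 0.0
    for P in projectors:
        p = np.trace(P @ rho).real   # probability of this outcome
        V = np.trace(P).real         # subspace dimension for this outcome
        if p > 1e-12:
            S += p * (np.log(V) - np.log(p))
    return S

rho = np.diag([0.7, 0.1, 0.1, 0.1])  # an example 4-level quantum state

# Fine-grained measurement: four rank-1 projectors, one per level
fine = [np.diag([1.0 if j == i else 0.0 for j in range(4)])
        for i in range(4)]
# Coarse measurement: lump levels {0,1} together and {2,3} together
coarse = [np.diag([1.0, 1.0, 0.0, 0.0]),
          np.diag([0.0, 0.0, 1.0, 1.0])]

print(observational_entropy(rho, fine))    # resolves every level
print(observational_entropy(rho, coarse))  # larger: lost detail adds entropy
```

The same state gets two different entropies depending on how it is probed, which is exactly the sense in which entropy has no fixed, objective value prior to being measured.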

“Entropy is not something that has a fixed, objective value prior to being measured”

RUNNING ON FACTS


An engine driven by information
is, quite frankly, hard to imagine.
And yet consider this: there is no
way to process information without
physical systems using energy.
This includes erasing information:
wiping a hard drive has an energy
cost. Turn this observation on
its head, and information starts
to look like a potential way to
fuel machines.
Imagine you have a device that
holds information in binary, 1s and
0s, and that it is blank (meaning it is
all zeros). This is an ordered state,
much like the cold environment
of a heat engine – something
that converts thermal energy into
mechanical energy. “You can in
principle build a device that would
convert this state to the mixture-of-
1s-and-0s state,” says Christopher
Jarzynski, a chemist at the
University of Maryland. Such a
device would use, say, heat energy
to change this state, and the
conversion represents an
acquisition of information. The
physical act of the zeros changing
into ones and zeros could be
harnessed to do something
mechanical, like lift a mass against
gravity or charge a battery. “You
can extract work from a thermal
reservoir by the very act of writing
information onto a blank memory
slate,” says Jarzynski.
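Jarzynski's point can be put in numbers with Landauer's bound (my back-of-envelope estimate, not a figure from the article): just as erasing a bit costs at least k_B T ln 2 of energy, randomising one blank bit in contact with a reservoir at temperature T can extract at most k_B T ln 2 of work.

```python
import math

# Back-of-envelope for an information engine, using Landauer's bound:
# writing random 1s and 0s onto blank memory in contact with a thermal
# reservoir at temperature T extracts at most k_B * T * ln(2) per bit.

K_B = 1.380649e-23  # Boltzmann constant, J/K

def max_work_from_blank_memory(n_bits, temperature_kelvin):
    """Upper limit on work extracted by randomising n_bits of blank memory."""
    return n_bits * K_B * temperature_kelvin * math.log(2)

# Filling a blank 1-terabyte drive (8e12 bits) at room temperature:
print(max_work_from_blank_memory(8e12, 300))  # ~2e-8 J: a minuscule budget
```

The tiny yield is why the promising applications are microscopic, such as molecular machines and quantum-scale sensors, rather than macroscopic engines.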
That could prove useful in
scenarios where there is no other
practical way to fuel a process.
Experiments have already backed
up that principle. Now the challenge
is to explore the possibilities,
creating machines fuelled by
information. That might be
something akin to the biological
machines that process genetic
information, or quantum-scale
sensors that use their information
intake to power motors or other
mechanisms.
