from smart fridges and fitness watches
to smart meters and factory monitoring
equipment, all pumping out data. This so-called
internet of things spanned 9.5 billion devices
in 2019 and is due to reach 28 billion by 2025.
In 2017, Huawei researcher Anders Andrae
predicted that the tsunami of data crunching
resulting from all this would consume a fifth
of the world’s electricity by 2025. That figure
understandably made headlines. His latest
predictions are less apocalyptic, but still
suggest that more than 10 per cent of the
world’s electricity could be devoted to
information processing by 2030. In terms
of raw power, that would be more than is
currently used by the whole of the EU. Such
growth is likely to represent a significant future
source of carbon emissions, and keeping
data centres from overheating will require
unsustainably vast quantities of cooling water.
Not everyone believes such a dramatic
energy crunch is coming.

When the chips are down


All modern computers encode
information the same way. They
all use transistors that act as binary
switches to control the flow of
electrical current. Keeping these
components operating in a way
that allows us to reliably distinguish
between their on and off states takes
a certain amount of electrical power,
which leads to waste heat. This
restricts how closely we can
pack transistors on computer
chips without serious overheating
problems and limits energy-
efficiency gains (see main story).
One way to solve the problem is to
change the physical properties of the
hardware involved. At present, all
modern computing is built on silicon.
Within this material, the minimum
power required for transistors to
work effectively is relatively high.
Using new materials like germanium,
carbon nanotubes, graphene or
materials with a switchable electric
polarisation called ferroelectrics could
help bring this down. There are also
proposals to use a quantum effect
known as quantum tunnelling to
ease power requirements.

Even more exotic proposals
involve changing how we use
electrons in computer processors.
Rather than relying on the charge
of these particles to encode
information, we could instead use
other characteristics, like a quantum
property they possess known as
spin. Alternatively, we could bypass
all these problems by replacing
electronics with optical circuits
that run on light.
None of these approaches are
ready to replace silicon transistors
yet, says Eli Yablonovitch at the
University of California, Berkeley.
It is possible that more radical
alternatives are needed. Quantum
computing, which uses the strange
properties of quantum physics to
manipulate information in entirely
new ways, could drastically speed
up cryptography, for example, while
analogue computers that encode
information continuously rather
than as binary bits could prove
useful for applications like artificial
intelligence. At the moment, though,
neither looks set to replace general-purpose
computers.

“Andrae’s models are pretty simplistic,
frankly,” says Eric Masanet at
Northwestern University in Illinois. He
says they extrapolate from older studies of
computing’s energy use, an approach that has
historically led to overestimates. In a recent
study, Masanet and his colleagues found that
the energy use of data centres increased by
only 6 per cent between 2010 and 2018 despite
a 550 per cent increase in their workload.
That was thanks to improvements in
hardware as well as energy management,
but Masanet thinks further efficiency gains
will be required. “We need to start paying more
attention to the potential for rapid growth
in energy use and we need to start doing what
we can to avert that,” he says.
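
Those figures imply a steep underlying efficiency gain. As a rough back-of-envelope illustration (the arithmetic here is ours, not a calculation from Masanet’s study): a 550 per cent rise means the workload grew by a factor of 6.5, while energy use grew by a factor of only 1.06, so the energy consumed per unit of computation fell to

\[
\frac{1.06}{6.5} \approx 0.16
\]

of its 2010 level, a roughly sixfold, or 84 per cent, improvement in efficiency.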

Turning down the heat
That’s precisely the motivation behind a
host of alternative technologies that aim to
exploit new materials or innovative means
of manipulating data (see “When the chips
are down”, above). But experts agree it will be
many years before any provide sufficient
improvement to head off the problems
faced by computing.
A more radical approach may now be
emerging. A growing number of researchers
are revisiting Landauer’s calculations and
using new tools to dramatically expand our
understanding of how thermodynamics
and computing interact. For Jim Crutchfield
at the University of California, Davis,
computing’s situation has parallels with the
industrial revolution. In the 18th and 19th
centuries, engines and pumps were built to
convert heat into mechanical energy long
before physicists formalised the principles
of thermodynamics. When those principles
were better understood, the gains in efficiency
and power were astonishing.
“We’re in the information age and we’re
kind of in the same conceptual situation,” says
Crutchfield. “The general claim is that if we
understand these trade-offs, we’re going to be
able to design computers that are three or four
orders of magnitude more energy-efficient.”
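
To put that claim in context, it helps to compare today’s hardware with the theoretical floor set by Landauer’s limit. The figures below are standard physics values, offered as a rough illustration rather than as numbers from Crutchfield’s own work. The minimum energy needed to erase a single bit at temperature T is kT ln 2, which at room temperature works out to

\[
E_{\min} = k_B T \ln 2 \approx (1.38 \times 10^{-23}\ \mathrm{J/K}) \times (300\ \mathrm{K}) \times 0.693 \approx 2.9 \times 10^{-21}\ \mathrm{J}.
\]

A logic operation in a modern silicon chip, by contrast, typically dissipates somewhere in the region of 10⁻¹⁵ joules once wiring and reliability margins are included, five or six orders of magnitude above this floor. That gap is the headroom Crutchfield is pointing to.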
The opportunity arises from what Landauer’s
work left out. For all its profundity, says David
Wolpert at the Santa Fe Institute in New
Mexico, it could only establish a maximum
efficiency, not set out a road map to getting
there. In part, that was due to the limited
mathematical tools available at the time. They
could only describe systems in equilibrium,
where no energy enters or leaves, which is a
massive oversimplification. “Computational
systems are, if nothing else, extraordinarily
non-equilibrium,” says Wolpert. They are in