PC Gamer (US Edition), August 1, 2020

Intel loves a good codename. Who remembers Dragontail Peak? Or Lizard Head Pass? Or even 2008’s White Salmon? Great days. All of those refer to motherboards, but Pohoiki Beach is different—it’s a new way of building computers that’s based on the human brain. Neuromorphic computing—literally ‘nerve shaped’—uses insights from neuroscience to create chip architectures.


By simulating the way human brains work in silicon,
calculations can be carried out faster while using less energy.
The training of neural networks can be carried out more efficiently too; in some cases, a single viewing of an object is enough for the net to recognize it from then on.
Mike Davies, director of Intel’s Neuromorphic Computing Lab, puts it more concretely. “Neuromorphic computing entails nothing less than a bottom-up rethinking of computer architecture,” he says. “The goal is to create chips that function less like a classical computer and more like a human brain. Neuromorphic chips model how the brain’s neurons communicate and learn, using spikes and plastic synapses that can be modulated based on the timing of events. These chips are designed to self-organize and make decisions in response to learned patterns and associations.”
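What Davies describes maps onto two textbook ingredients: a leaky integrate-and-fire (LIF) neuron, which accumulates input and emits a spike when it crosses a threshold, and spike-timing-dependent plasticity (STDP), which strengthens a synapse when the input spike precedes the output spike and weakens it otherwise. Here is a minimal Python sketch of both ideas; the constants, the single-synapse setup, and the timing convention are illustrative assumptions, not how Intel’s Loihi silicon actually implements them.

```python
import math

class LIFNeuron:
    """Leaky integrate-and-fire neuron (illustrative model, not Loihi's)."""

    def __init__(self, threshold=1.0, leak=0.9):
        self.threshold = threshold  # fire a spike when potential crosses this
        self.leak = leak            # fraction of potential kept each timestep
        self.potential = 0.0

    def step(self, weighted_input):
        """Integrate one timestep of input; return True if the neuron spikes."""
        self.potential = self.potential * self.leak + weighted_input
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return True
        return False

def stdp(weight, dt, lr=0.05, tau=20.0):
    """Pair-based STDP: dt > 0 means pre fired before post (strengthen),
    dt < 0 means post fired before pre (weaken). Constants are assumptions."""
    if dt > 0:
        weight += lr * math.exp(-dt / tau)
    elif dt < 0:
        weight -= lr * math.exp(dt / tau)
    return max(0.0, min(1.0, weight))  # clamp to a plausible range

# Drive one neuron through a single synapse with a regular input spike train.
neuron = LIFNeuron()
weight = 0.5
post_spike_times = []
for t in range(20):
    pre = (t % 2 == 0)              # presynaptic spike every other timestep
    if neuron.step(weight if pre else 0.0):
        post_spike_times.append(t)
        # The causal pre spike arrived just before this post spike, so
        # potentiate (dt treated as a small positive interval).
        weight = stdp(weight, dt=1.0)
```

Because every output spike here is caused by a preceding input spike, the synapse is repeatedly potentiated and the neuron fires progressively sooner after each reset, which is the “learning from the timing of events” Davies refers to, in miniature.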
Which all sounds a bit Cyberdyne, but we’re sure this will be fine. The goal is that one day neuromorphic chips may be able to learn as fast and efficiently as the brain, which still far outperforms today’s most powerful computers. According to Intel, neuromorphic computing could lead to advancements in robotics, smart city infrastructure, and other applications that require continuous learning and adaptation to evolving data.
“The inspiration for neuromorphic computing goes back to the earliest days of computing itself,” says Davies. “If you look at the early papers by John von Neumann or Alan Turing, they actually talk about neurons and synapses, because back in the ’40s they hadn’t invented the terminology of conventional computing. The brain was the one example they had.”
And while classical computing has been solving problems
for 80 years, mother nature has been at it for billions, and has
got quite good at making brains. “If you look at the human
brain,” says Davies, “it operates at 20W. Everything we do—
simultaneously processing data streams, coming up with new
ideas and insights—all that is being done at just 20W of
power.” For context, a Raspberry Pi 4B pulls 7.6W under load, but an i7-7700K can draw 77W while doing a bit of light gaming. Now, the i7’s Kaby Lake cores probably do a bit more work per second than the Cortex-A72s powering the Pi, but it goes to show how power efficient the human brain is. It also runs on glucose, and doesn’t need a fan and heatsink arrangement bolted to the top.
The 770 processors in Pohoiki Springs—which is the
second generation of the technology after Pohoiki Beach—are

Power down

In 2018, one percent of all electricity consumption worldwide went on data centers. It’s not just the computing needs that take all that juice, but the cooling too. Part of the industry’s response to these large energy bills is a move toward ARM processors. Neuromorphic chips could cut that cost further. “Reducing power is a really big deal, and we’re finding very big gains in the energy required,” says Davies.

NERVES OF STEEL


How Intel is taking inspiration from our neurons for next-gen chips


A single board from within the Pohoiki Springs box. Intel hasn’t given details about how the multiple chips, or their boards, are connected together.


TECH REPORT


Image credit: Intel
