Maximum PC - USA (2022-06)




TRADE CHAT


Jeremy Laird




The problem with the PC is power...


AS THE FANS INSIDE my Intel eight-core laptop spooled up to their customary mini-hurricane mode, it reminded me that the PC has a major problem with power efficiency. I had a couple of Chrome windows open running 10 tabs each. That’s pretty much all it takes for my laptop to freak out now.


The ever-increasing power
consumption of the latest CPUs
and GPUs cannot continue.

Admittedly, this laptop is powered by Intel’s 11th
gen rather than 12th gen CPU architecture. But
11th gen chips are still built on Intel’s latest 10nm
production process. And if there’s one thing that
12th gen Alder Lake sucks at, it’s power efficiency.
Give the 12900K desktop model a hammering and
it guzzles as much as 300W while hitting 100°C.
Not that Intel is alone in producing silicon power
hogs. Nvidia’s GeForce RTX 3090 can breach 350W,
while the new 3090 Ti can hit over 500W. AMD’s
current graphics boards are little better. The
Radeon RX 6900 XT will happily suck down 325W or so.
Next-gen GPUs due later this year are expected
to be even worse. The Nvidia GeForce RTX 4090
is rumored to be a 600W board. A mooted 4090
Ti variant could possibly come in above 800W.
Now imagine a scenario with a maxed out 3090
Ti running in the same rig as an Intel 12900K
CPU also under full load. The total system
power consumption would be getting close to the
recommended 1,440W limit for continuous power
from a standard US wall socket.
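That 1,440W figure isn’t arbitrary: a standard US outlet sits on a 120V, 15A circuit, and continuous loads are conventionally limited to 80 percent of the breaker rating. A minimal sketch of the back-of-envelope math, using the full-load figures quoted above and an assumed 150W for the rest of the system (board, RAM, drives, fans, PSU losses):

```python
# Back-of-envelope power budget for a maxed-out 3090 Ti + 12900K rig,
# assuming a standard US 120V / 15A circuit and the common 80%
# continuous-load derating rule.
VOLTS = 120
AMPS = 15
CONTINUOUS_FACTOR = 0.8  # continuous loads limited to 80% of breaker rating

outlet_limit_w = VOLTS * AMPS * CONTINUOUS_FACTOR  # = 1,440W

gpu_w = 500   # RTX 3090 Ti under heavy load (figure quoted above)
cpu_w = 300   # Core i9-12900K given a hammering (figure quoted above)
rest_w = 150  # assumed allowance for the rest of the system

system_w = gpu_w + cpu_w + rest_w

print(f"Outlet continuous limit: {outlet_limit_w:.0f}W")
print(f"Estimated system draw:   {system_w}W")
print(f"Remaining headroom:      {outlet_limit_w - system_w:.0f}W")
```

On those assumptions, the rig alone eats 950W of a 1,440W budget before you plug in a monitor, speakers, or anything else on the same circuit.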
It’s the sort of power consumption that gets
you thinking about environmental issues, not to
mention cost issues. I mean, have you seen oil
and gas prices lately? It’s a parlous state of affairs
given chip manufacturing processes continue to
advance. That next-gen Nvidia 40 series GPU is
expected to be produced on TSMC’s latest 4nm
node, for instance. It’s hard to understand how
things have gotten so bad.

It doesn’t have to be this way—
and, yes, I’m going to mention
Apple silicon. The latest M1 MacBook
Air makes every x86-powered
PC laptop look ridiculous. Its
single-core performance is as
good as anything this side of Intel
Alder Lake. With four performance
cores and four efficiency cores, the
multi-threaded performance isn’t
bad either. It does this in a slim
chassis with no active cooling that
barely gets warm or throttles.
It uses about 30W totally strung
out, but that’s the whole laptop, not
just the M1 chip. As for the higher-
performing M1 chips, they are just
as impressive. The M1 Max fitted
to a 16-inch MacBook Pro hasn’t
quite got the multi-threaded grunt
of, say, an Intel Core i9-12900HK
mobile CPU. But it’s competitive in
most benchmarks. And if there’s a
12900HK-powered laptop that has
even half the battery life of the

MacBook, I haven’t seen it. Most
have a third or a quarter of the
battery life, or even worse.
At the core of the problem on the
CPU side of the equation is the x86
ISA. There’s no avoiding the fact
that it’s ancient and wasn’t originally
conceived with power efficiency in
mind, unlike the ARM instruction
set used by Apple silicon, which was
all about efficiency. Of course, until
recently, that focus on efficiency
has seemingly prevented an ARM
chip from competing on pure
performance. Yet clock-for-clock, in
single-threaded workloads, the
cores in the M1 are more powerful
even than Alder Lake’s. If Apple
starts cranking up the clocks and
chucking a load of cores in a future
Mac Pro, it should be interesting.
Anyway, that’s not to say Apple
is going to take over the world
and replace our beloved PCs with
Macs. But the PC does have a
power problem that desperately
needs solving. The ever-increasing
power consumption of the latest
CPUs and GPUs cannot continue.
If x86 can’t up its game, maybe
it’s finally time that long-mooted
transition from x86 to ARM for the
PC actually happens.

Nvidia’s new RTX 3090 Ti inhales a ludicrous 500 watts

Six raw 4K panels for breakfast, laced with extract of x86... Jeremy Laird eats and breathes PC technology.