Josh Norem Explores the Dangers of Tap Water
I knew tap water was bad for water cooling, but I didn’t know how bad.
If you just built a water-cooling rig and filled it with tap water instead of using that “fancy-colored water,” you might want to rethink your position. While the debate still rages over distilled water vs. Prestone vs. Water Wetter, one thing everybody seems to agree on is that tap water is just plain bad for water-cooling setups.
First things first: Tap water is full of impurities. Though tap water is filtered for human consumption, it still contains trace elements and minerals that won’t harm you but will react with the metal in your rig’s water blocks, causing corrosion. To demonstrate this, I placed pennies and thumbscrews in a couple of beakers. Then I filled one beaker with tap water and the other with distilled water, which is almost 100 percent pure.
The results speak for themselves: The pennies and thumbscrews in the tap water suffered extreme corrosion, while those in the distilled water look practically unaffected. Imagine what all that sediment running through your blocks, radiator, and pump would do to your rig!

Yuck. The impurities in tap water are launching an all-out assault on the copper plating of several pennies!

We don’t see any corrosion in our beaker of distilled water, though we can see a little bit of biological growth taking place.
Distilled water isn’t perfect, however. While there’s no visible corrosion, you can see minute signs of biological growth. I asked Tim Hunting from Koolance what, aside from distilled water, should be used in a water-cooling setup, and this was his response:
“Performance-wise, straight water will provide the best temperatures. The more glycol, alcohol, or whatever else a user might be mixing in, the lower the coolant’s heat capacity or thermal conductivity will be. Despite this, some amount of anti-corrosion and anti-biological agents is needed if impurities make it through the distillation process or are introduced by the cooling components themselves.”
So, what’s the solution? Ask any vendor and they’ll recommend their very own premixed solution, which is typically distilled water mixed with anti-algae agents, corrosion inhibitors, and some form of biocide. If you’re looking to create your own DIY blend, we recommend playing it safe by using no more than 20 percent additives to maintain high flow rates and good temperatures.
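To put rough numbers on Hunting’s point, here’s a minimal sketch of how additives cut into a coolant’s heat capacity. It assumes a simple mass-weighted average of the specific heats of water and ethylene glycol (roughly 4.18 and 2.43 J/g·K at room temperature); real blends deviate from this linear estimate, so treat it as a back-of-the-envelope illustration, not a coolant datasheet.

# Rough estimate of coolant heat capacity for water/glycol blends.
# Assumes a simple mass-weighted average of specific heats -- a
# back-of-the-envelope approximation, not measured mixture data.

CP_WATER = 4.18   # specific heat of water, J/(g*K)
CP_GLYCOL = 2.43  # approx. specific heat of ethylene glycol, J/(g*K)

def mix_heat_capacity(glycol_fraction):
    """Estimated specific heat of the blend by mass fraction of glycol."""
    return glycol_fraction * CP_GLYCOL + (1 - glycol_fraction) * CP_WATER

for pct in (0, 20, 50):
    cp = mix_heat_capacity(pct / 100)
    print(f"{pct}% glycol: ~{cp:.2f} J/(g*K), {cp / CP_WATER:.0%} of straight water")

By this estimate, a 20 percent blend still retains about 92 percent of straight water’s heat capacity, while a 50-50 mix drops to roughly 79 percent, which is why we draw the line at 20 percent.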
Gordon Mah Ung Enables SLI on Intel’s 955X
Why can’t Intel and nVidia just be friends?
Proof: With older drivers, you can enable SLI on 955X boards, even though it won’t run games.
If you’ve read this month’s Head2Head (page 16), you know how frustrated I am over the situation regarding dual PCI-E graphics cards. If nVidia used Intel’s Xeon CPUs and chipsets as a development platform for SLI, why the hell won’t two nVidia graphics cards work in tandem on an Intel 955X board with two x16 slots, such as Asus’ P5WD2 motherboard?
That raises the question: Is this a technical problem or a marketing problem? Intel declined to comment on the SLI/955X situation, and nVidia said it’s “still working” with Intel on the matter. Could I make SLI work on the P5WD2 without the vendors’ blessing?
Mebbe. I installed a pair of nVidia GeForce 6800 cards in the P5WD2, and with the current 77.72 ForceWare drivers, both cards worked! But sadly, the option to enable SLI wasn’t available. I wondered if pre-955X drivers might be the key. After digging up the 66.72 drivers, I rebooted the system and crossed my fingers. And voilà! The older drivers prompted me to enable SLI. The only problem was that anytime I fired up a 3D game, it locked up. Bah!
It’s clear to me that the problem is more marketing than technical. I don’t think nVidia wants to give SLI to Intel, at least not for free. And with a long history of antagonism between the two companies, the stalemate isn’t likely to end any time soon.
So here’s how it stands: Instead of picking a motherboard and having any graphics option open to you, you now have to pick your graphics cards and tailor your motherboard around them. Even worse, you can’t switch graphics vendors should the performance lead shift from nVidia to ATI or vice versa. The situation, in a word, sucks.