MacLife - UK (2020-02)

THE SHIFT

MATT BOLTON is taking the opportunity of the Apple card’s accidental controversy to rant about the machines taking over

A FEW MONTHS ago, I was worried about the Apple Card having unpredictable downsides, and it didn’t take long for that to turn out to be the case. It appears that when women apply for it, they receive lower credit than men, even if they have higher credit ratings. Goldman Sachs provides the credit service for the card, and its official response said a number of factors are looked at, but not gender, race and similar. Perhaps more telling is that reportedly employees blamed it on the algorithm.

As more companies entrust our lives to algorithms, they’re often careful to frame them as “neutral” — because how can code be biased? It’s just math. That goes double for algorithms built by machine learning, because all that happens is that you show them data, they learn what you want, then they help you achieve it. Neutral numbers go in, neutral math happens, fair decisions come out... surely?
But the data the algorithm was based on wasn’t neutral. Data is a record of what happened; what happened isn’t always pretty. Here’s an infamous example of how real-world data creates non-neutral outcomes. A big tech firm created an algorithm for looking over applicants’ CVs and deciding who to interview. It began rejecting fully qualified women outright. That’s because the data it was trained with was the list of previous applicants versus successful hires when the decisions were made by people... and those people had prioritized men. When they created the new algorithm, they wanted it to find who had the right stuff, so it was set to look for what previous candidates had in common. It turns out the common factor was being men, so the algorithm started summarily rejecting women.
Goldman Sachs’ algorithm is surely not “designed” to penalize women, and I don’t doubt when it says the system isn’t designed to take gender into account, but that doesn’t mean it isn’t basing decisions on gender (or race, or other factors) — it just doesn’t know it is. Perhaps it was trained on records of credit decisions made by humans, and maybe those humans gave women less credit. The power of machine learning means it can look at cross-referenced data and identify that some people should receive less credit. It has no idea that what it’s really doing is identifying women.

Through unthinking systems like these we’ve allowed old biases to reappear, but with the air of respectability via programming, like money laundering for prejudice. So this isn’t really a column about Apple, or even that much about Goldman Sachs. It’s a “public service announcement”: be wary of trusting decisions made by math you can’t question — there will be more of it in the future.




>>> Matt is the editor of Future’s flagship technology magazine T3 and has been charting changes at Apple since his student days.
He’s skeptical of tech industry hyperbole, but still gets warm and fuzzy on hearing “one more thing”.

Image rights: Apple.


Being able to apply for credit or sign up for plans in apps is convenient, but the lack of real people does have downsides.

To be honest, with the amount of money Apple is sitting on, maybe it would be easier if it just became a bank itself?
