Time - USA (2020-02-03)



the past now, we will blunder into a future where we have literally coded them in.

Part of the problem is our blind faith in AI. When tech entrepreneur David Heinemeier Hansson complained to Apple that his wife was given a credit-card limit 20 times lower than his, despite having a higher credit score, he was informed by workers at the company that it was not discrimination, it was “just the algorithm.” Having accepted that we humans are hopelessly flawed and biased, we are turning to artificial intelligence to save us. But algorithms are only as good as the data we feed them, and when it comes to women, that data is practically nonexistent. Worse, algorithms amplify our biases back to us. One University of Washington study found that when an algorithm was trained on an image data set in which pictures of cooking were 33% more likely to feature women than men, the algorithm increased the disparity to 68%. That is, pictures of men were labeled as female simply because they were in front of a stove.
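One mechanical way such amplification arises: a model that maximizes accuracy on a skewed dataset learns to predict the majority label whenever the context hints at it, turning a modest skew into a near-total one. A toy sketch of that dynamic (the activity, labels and counts here are invented for illustration, not the study's actual data):

```python
from collections import Counter

# Toy training set: (activity, gender) pairs with a modest skew --
# cooking images feature women 2:1 (hypothetical numbers).
train = [("cooking", "woman")] * 20 + [("cooking", "man")] * 10

# A maximum-likelihood "classifier" simply memorizes the majority
# gender seen for each activity.
majority = {}
for activity in {a for a, _ in train}:
    genders = [g for a, g in train if a == activity]
    majority[activity] = Counter(genders).most_common(1)[0][0]

# At prediction time it outputs "woman" for *every* cooking image:
# a 67/33 split in the data becomes a 100/0 split in predictions.
preds = [majority[a] for a, _ in train]
print(Counter(preds))  # Counter({'woman': 30})
```

Real image-labeling models are far more complex, but the incentive is the same: agreeing with the majority is the accuracy-maximizing move, so the skew in the data is a floor, not a ceiling, for the skew in the output.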
Labeling a man as female may not feel like an egregious example of algorithmic bias, but imagine an amplified bias like that let loose in hiring. This has already happened to Amazon, which had to abandon an AI recruiting tool after it favored men over women in its hiring suggestions. And
that’s just one algorithm that we know about: as of 2016, 72% of CVs in the U.S. never reached human eyes, and robots trained on the posture, facial expressions and vocal tone of “top-performing employees” have been introduced into interview processes. Are these top-performing employees gender- and ethnically diverse, and if not, has the algorithm accounted for this? We often don’t know, because most algorithms are protected as proprietary software, but the evidence isn’t promising.
Even more concerning is the introduction of AI into medical diagnostics, where the data gap and a male-biased curriculum already leave women 50% more likely to be misdiagnosed when they have a heart attack. And yet there is little evidence of developers’ accounting for this bias. A recent paper detailed an algorithm intended to predict heart attacks five years before they happen: it was trained on heavily male-dominated studies, even though we know there are major sex differences in cardiovascular risk factors such as diabetes and smoking. So will this AI predict heart attacks in women? It’s impossible to say, because the paper doesn’t include enough sex-disaggregated data.
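Sex-disaggregated reporting is not technically difficult: it just means computing the same evaluation metric separately for each group rather than only on the pooled data. A minimal sketch, with invented records and numbers purely for illustration:

```python
# Toy evaluation records: (sex, prediction_was_correct) pairs.
# All values are hypothetical, chosen to make the gap obvious.
results = [
    ("male", True), ("male", True), ("male", True), ("male", False),
    ("female", True), ("female", False), ("female", False), ("female", False),
]

def accuracy_by_group(records):
    """Compute accuracy separately per group, so a disparity that a
    single pooled number would hide becomes visible."""
    groups = {}
    for sex, correct in records:
        groups.setdefault(sex, []).append(correct)
    return {sex: sum(v) / len(v) for sex, v in groups.items()}

print(accuracy_by_group(results))
# {'male': 0.75, 'female': 0.25} -- the pooled accuracy of 0.5 hides the gap
```

The pooled accuracy here is 50%, which sounds merely mediocre; the disaggregated view shows the model works three times as well for one group as the other.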
There are solutions to these problems if we choose to acknowledge them. A 2016 paper on “word embeddings” (learning techniques that are essential for search algorithms) explained a new methodology that reduced gender stereotyping (e.g., “he is to doctor as she is to nurse”) by over two-thirds, while leaving gender-appropriate word associations (e.g., “he is to prostate cancer as she is to ovarian cancer”) intact. The authors of the University of Washington image-labeling study devised a new algorithm that decreased bias amplification by 47.5%. But these examples are very much the exception.
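The core geometric idea behind that kind of embedding debiasing can be sketched in a few lines: estimate a “gender direction” from differences between gendered word vectors (e.g., he − she), then project that direction out of words that should be gender-neutral. The 2-D vectors and numbers below are invented for illustration and are not the paper’s actual data or full method:

```python
import numpy as np

def neutralize(v, g):
    """Remove the component of word vector v along gender direction g.

    After this step v has zero projection onto g, so the word carries
    no information along the estimated gender axis.
    """
    g = g / np.linalg.norm(g)    # unit gender direction
    return v - np.dot(v, g) * g  # subtract the projection onto it

# Toy 2-D embeddings (hypothetical values, for illustration only).
he = np.array([1.0, 0.2])
she = np.array([-1.0, 0.2])
doctor = np.array([0.6, 0.8])  # a word we want to be gender-neutral

gender_direction = he - she    # crude one-pair estimate of the axis
doctor_debiased = neutralize(doctor, gender_direction)

# The debiased vector is now orthogonal to the gender direction.
print(np.dot(doctor_debiased, gender_direction))  # 0.0
```

Definitionally gendered pairs (“he”/“she”, “prostate”/“ovarian”) are left untouched, which is how such a method can strip stereotyped associations while keeping legitimately sex-specific ones.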
If we want to design a just future, we must acknowledge and mitigate this fundamental bias that frames women as atypical. Women are not a confounding factor to be eliminated from research like so many rogue data points. In this new world where data is king (and I use that term advisedly), it’s time for us to finally start counting women as the entirely average humans that they are.

Criado Perez is the author of Invisible Women: Data Bias in a World Designed for Men

INVENTING THE FUTURE
Xóchitl Guadalupe
Cruz López
LOCATION: Chiapas, Mexico
INVENTION: Warm Bath

When she was 8, Xóchitl Guadalupe Cruz López lived in a home that was often without hot water. The same was true for many other residents of San Cristóbal de las Casas. “People here have to take baths with cold water. They have a lot of respiratory diseases,” she told TIME through an interpreter. “I wanted to do something.” So Xóchitl created Warm Bath, a solar-powered water heater made of easy-to-get recycled objects, including water bottles, plastic connectors and rubber hose. It costs about $30 to assemble.
Xóchitl made Warm Bath with the National Autonomous University of Mexico’s adopt-a-talent science program, PAUTA. In 2018, she was the first child to receive the university’s Institute of Nuclear Sciences’ Recognition for Women award. Now 11, Xóchitl plans to apply for a patent this year.
—Constance Gibbs

publishes weekly magazines for U.S. elementary and middle school students

ILLUSTRATION BY ANNA PARINI FOR TIME
