Time - USA (2020-02-03)

VIEWPOINT

CLOSING THE GENDER DATA GAP

We need to change our algorithms and Big Data to include women
BY CAROLINE CRIADO PEREZ

Did you hear the one about how aid workers rebuilt homes after a flood—and forgot to include kitchens? How about the entrepreneur whose product was dismissed by funders as too “niche”—but whose femtech company, Chiaro, is now on track for more than $100 million in 2020? Or the female sexual-dysfunction drug that was tested for its interaction with alcohol on 23 men... and only two women? Not finding any of these funny? Maybe that’s because they’re not jokes.

From cars that are 71% less safe for women than men (because they’ve been designed using a 50th-percentile male dummy), to voice-recognition technology that is 70% less likely to accurately understand women than men (because many algorithms are trained on 70% male data sets), to medication that doesn’t work when a woman is on her period (because women weren’t included in the clinical trials), we are living in a world that has been designed for men because for the most part, we haven’t been collecting data on women. This is the gender data gap. And if we want to design a world that works for the woman of the future as well as it works for the man of the present, we’re going to have to close it.
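
For a sense of what a 70% male training set looks like in practice, here is a minimal sketch in Python. The clip metadata and speaker IDs are entirely invented; the point is only the shape of the audit that would catch the skew before a model is ever trained on it.

```python
from collections import Counter

# Invented corpus manifest: one (speaker_id, sex) record per training clip.
# A real pipeline would read this from the data set's metadata.
training_clips = [
    ("spk001", "male"), ("spk002", "male"), ("spk003", "female"),
    ("spk004", "male"), ("spk005", "male"), ("spk006", "female"),
    ("spk007", "male"), ("spk008", "male"), ("spk009", "female"),
    ("spk010", "male"),
]

counts = Counter(sex for _, sex in training_clips)
total = sum(counts.values())
for sex, n in sorted(counts.items()):
    print(f"{sex}: {n}/{total} ({n / total:.0%})")
# female: 3/10 (30%)
# male: 7/10 (70%)  <- the 70% male skew the article describes
```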

Closing this data gap is both easy and hard. It’s easy because it has a very simple solution: collect sex-disaggregated data. But it’s hard because the gender data gap is not the product of a conspiracy by a group of misogynistic data scientists. It is simply the result of an everyday bias that affects pretty much all of us: when we say human, 9 times out of 10, we mean men.
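
A toy example shows why disaggregation is the whole game. The accuracy numbers below are invented for illustration: reported as a single aggregate, the system looks acceptable; split by sex, the gap the aggregate was hiding becomes visible.

```python
# Invented test results: (sex, correctly_recognized) for 100 utterances.
results = ([("male", True)] * 65 + [("male", False)] * 5
           + [("female", True)] * 18 + [("female", False)] * 12)

overall = sum(ok for _, ok in results) / len(results)
print(f"overall accuracy: {overall:.0%}")  # 83% -- looks fine

# The same results, sex-disaggregated.
for sex in ("male", "female"):
    group = [ok for s, ok in results if s == sex]
    print(f"{sex} accuracy: {sum(group) / len(group):.0%}")
# male 93%, female 60% -- the aggregate hid a 33-point gap
```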

Even when we try to fix gender disparities, we still often end up using men as the default—a tendency I have christened the Henry Higgins effect, after My Fair Lady’s leading man who memorably complains, “Why can’t a woman be more like a man?” The Henry Higgins effect was visible when an executive whose voice-recognition system failed to recognize women’s voices suggested that women should undergo hours of training to fix “the many issues with women’s voices,” rather than, you know, fixing the many issues with his voice-recognition software that doesn’t recognize the voices of half the human population.

But it’s also visible in more well-meaning attempts to address gender biases. Many workplace initiatives aimed at closing gender-pay and -promotion gaps focus on fixing the women, assuming that they, rather than systems that underpromote them, are the problem. Women need confidence training. They need to be taught to negotiate for pay raises. Well, actually, the evidence suggests that women are asking for pay raises as often as men—they’re just less likely to get them. Perhaps the issue here is not the women, but a system that doesn’t account for gender bias?

There are reasons beyond fairness to fix systems that are arguably primed to overpromote men: homogeneity is bad for business. Even with the best will in the world, a group of white middle-class men from America are going to have gaps in their knowledge, and they aren’t necessarily going to know what those gaps are. Which is how you end up with a “comprehensive” health tracker app that can’t track your period. And I don’t believe Apple hates periods; I believe that Apple forgot periods exist.

The gender data gap and its default male origins have been disadvantaging women for millennia, but in a world where we increasingly outsource our decision-making to algorithms trained on data with a great big hole in it, this problem is set to get a lot more serious very quickly. And if we don’t choose to correct the mistakes of

ILLUSTRATION BY AISTE STANCIKAITE FOR TIME
