Rotman Management – April 2019


Karen Christensen: You have said that the future of AI is
being built “by a relatively few like-minded people within
small, insulated groups.” Why is that such a big problem?
Amy Webb: In any field, if you have a homogeneous group of
people making decisions that are intended to benefit everyone,
you end up with a narrow interpretation of both the future itself
and the best way to move forward. When we're talking about a
transformational technology like AI, the systems involved will
be making decisions on behalf of everyone, so it follows that a
lot of people are going to be left out of those decisions — and
the way these systems behave and make choices will exclude them.


What is an artificial narrow intelligence system (ANI)?
This is what AI’s various ‘tribes’ are building. ANIs are capable
of performing a single task at the same level as, or better than,
we humans can. Commercial ANI solutions — and by extension, the
tribe — are already making decisions for us in our email inboxes,
when we search for things on the Internet, when we take photos
with our phones, when we drive our cars and when we apply for
credit cards or loans.
These tribes are also building artificial general intelligence (AGI)
systems, which will perform broader cognitive tasks because they
are machines that are designed to think like we do. The question
is, Who exactly is the ‘we’ that these systems are being modelled
on? Whose values, ideals and worldviews are being taught?

In recent years, Google launched unconscious bias training
for its employees; yet at the same time, it was rewarding
bad behaviour among its leadership ranks. Talk a bit about
this paradox.
A single training program cannot solve the bias problem — just
as an MBA program offering a mandated ethics class doesn’t
stop unethical behaviour. When information gets siloed in that
way and is not more deeply integrated throughout a company,
people tend to dismiss what they learn as something to ‘tick off’
on a checklist to meet requirements.
I do think the American members of the Big Nine — Amazon,
Apple, Google, Facebook, Microsoft and IBM — recognize
that there are problems with diversity and inclusivity in
their ranks, which is why, in recent years, many have launched
unconscious bias training. Hopefully the goal of these programs
is not just to deal with employee behaviour, but also to
recognize that biases are creeping into the AI systems that they
are building.
It’s great to have these training programs, but wherever they
exist, we should expect to see substantive change throughout
the organization — not just changes in personnel, but also

Proceed with Caution

An NYU professor warns that Silicon Valley might be building
AI to make decisions without inclusive human values.

An interview with Amy Webb by Karen Christensen