Popular Mechanics - USA (2022-05 & 2022-06)


PHOTOGRAPHY: WARNER BROS; ILLUSTRATION: KEIRSTEN ESSENPREIS

Naturally, this opens a whole can of worms, explains Irina Raicu, the director of the internet ethics programme at Santa Clara University's Markkula Center for Applied Ethics. 'If you try to create a very good chatbot for someone who died ... you could put words into people's mouths that they never said,' she notes.

Taking a person's tweets and Facebook posts, then creating an index – a sort of catalogue of the data that helps a computer search for the right answers to a query – does not always lead to organic or honest responses.

'If this becomes accepted, I think this could have a chilling effect on human communications,' Raicu says. 'If I'm worried that anything I'm going to say could be used in a weird avatar of myself, I'll have to second-guess everything.' Using sarcasm on the internet, for instance? You might not want to anymore, for fear that your comments could be taken in earnest and built into a chatbot dialogue, potentially harming your reputation post-mortem.

This isn't the first time an intelligent chatbot has been created as a way to bring back the dead. In 2015, technologist Eugenia Kuyda's friend, Roman, died in a sudden and tragic car accident in Moscow. She gathered text message conversations between Roman and many of his friends and assembled a chatbot that could serve as a sort of analogue for him. In 2017, she used that experience to launch Replika, an AI chatbot service that allows anyone to make their own virtual friend. Regardless of any positive effects, it raises an issue: while these chatbots may be beneficial to the person who is grieving, they may also be exploiting the dead, Raicu says.

In the case of the Microsoft patent, Raicu says that an individual has a constitutional right to privacy, so this sort of chatbot is already a violation of a deceased person's autonomy – they have no say in which bits of their social data go into the final chatbot, for instance. And creating a chatbot modelled on a person who never consented in the first place feels unfair, because they aren't part of the decision-making process.

On the one hand, Raicu says, much of this brand of innovation is driven by people who do feel genuine empathy and perhaps want to help others through the loss of a loved one. But at the same time, these technologists must be astute in their designs, considering the negative implications.

It may seem dystopian, and perhaps a bit paranoid, but the only sure-fire way to protect your humanity from these kinds of programs would be to set up a section in your living will regarding your personal data, says Alexander Hauptmann, a research professor at Carnegie Mellon University's Language Technologies Institute.

'You could imagine that people might be able to put stuff in their will about how their archive of data should be used or disposed of,' he says. 'But then the other question is, who is actually going to sue [the person who built the chatbot]? Maybe some other family member who knows what the will said and objects to it.'

For what it's worth, we asked Microsoft about the patent. While they didn't tell us much, they did direct us to a January 2021 tweet from Tim O'Brien, general manager of AI Programs at Microsoft, in which he confirmed that there are no active plans at the company to use this chatbot patent.

'But if I ever get a job writing for Black Mirror, I'll know to go to the USPTO website for story ideas,' he tweeted. Touché.

STRANGER THAN (SCIENCE) FICTION

Black Mirror, a popular sci-fi anthology on Netflix, seemingly prophesied this technology back in 2013 with an episode titled 'Be Right Back'. In it, a woman signs up for a chat service that lets her communicate with an AI version of her late partner, who'd died in a car crash. We won't spoil it for you, but suffice to say, things get weird.

And then there's the 2013 film Her, wherein Joaquin Phoenix stars as a lonely writer who dates Samantha, an intelligent operating system voiced by Scarlett Johansson, with troubling results. While Samantha is not a chatbot per se, the film still illustrates the psychological trauma that can befall those who lean too heavily on technology.