
KAY’S CORNER


10 SEPTEMBER 2019 | COMPUTER SHOPPER | ISSUE 379


Fakes and pains


THE CONCEPT OF ‘fake news’ is one that crops up with increasing regularity; the term even made it as word of the year in 2017. Fake news is where someone makes something up, usually something scandalous, often just to get you to click on the headline or watch the ‘news’ item. More seriously, it’s increasingly used to damage someone or some organisation. Of course, you might well point out that fake news isn’t really that new – think back to headlines such as ‘Freddie Starr ate my hamster’ – but it seems to have become more widespread and harder to spot.

I’ve always been a fan of Edgar Allan Poe’s comment: “Believe nothing you hear, and only one half that you see.” It’s a good rule for anything that’s online – swapping the ‘hear’ for ‘read’, and probably reducing the ‘half’ to something a lot smaller. This motto should probably be stuck as a sticky note on every monitor and smartphone screen – maybe we could get a bylaw passed?
If you’re already cynical about what you see and read online, the bad news is that it’s all going to get a lot worse because of the rise of GANs, or Generative Adversarial Networks. A GAN uses machine learning, an artificial intelligence technique where, rather than writing specific instructions for a computer (if this happens, do that), the computer program is shown lots of data with specific patterns – maybe lots of pictures of animals, some of which are dogs, for example. It can learn what a dog looks like, then when it’s shown new data it can identify dogs even when it hasn’t previously been told ‘this is a dog’.
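That pattern-spotting idea can be sketched in a few lines of code. The column doesn’t name a particular algorithm, so what follows is purely my own illustrative stand-in: a tiny k-means-style clustering loop that is given unlabelled numbers drawn from two hidden groups (the groups, the numbers 2 and 8, and all names here are invented for the example) and works out the two group centres without ever being told which sample belongs to which group.

```python
import random
import statistics

random.seed(1)

# Unlabelled data: two hidden groups of numbers (think 'cats' near 2 and
# 'dogs' near 8); the program is never told which sample is which
data = ([random.gauss(2.0, 0.5) for _ in range(50)]
        + [random.gauss(8.0, 0.5) for _ in range(50)])
random.shuffle(data)

# Guess two group centres, then repeatedly refine them
centres = [min(data), max(data)]
for _ in range(10):
    groups = [[], []]
    for x in data:
        # Put each sample with its nearest centre
        nearest = 0 if abs(x - centres[0]) < abs(x - centres[1]) else 1
        groups[nearest].append(x)
    # Move each centre to the average of its group
    centres = [statistics.mean(g) if g else c
               for g, c in zip(groups, centres)]

# centres now sit near the two hidden patterns, found with no labels at all
```

The program ends up with one centre near each hidden group, which is the sense in which it has ‘learned’ the patterns in the data by itself.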
What makes a GAN different to other machine-learning tools is that, having been trained to recognise a category of data – pictures of dogs, say – it can then generate new examples, so could come up with a picture of a totally new breed of dog that doesn’t actually exist. Fake news!

This then leads on to the idea of a deep fake, where a GAN is used to combine and superimpose images and videos on top of and into other images and videos, so you might have a video of Freddie Starr ‘actually’ eating the proverbial hamster, even though that never happened.
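The adversarial part can also be sketched in miniature. This is not any real GAN implementation, just a toy illustration of the idea under assumptions of my own (one-dimensional ‘data’ centred on 4, single-parameter models): a ‘generator’ tries to produce numbers that look like they came from the real data, a tiny logistic-regression ‘discriminator’ tries to tell real from fake, and the two are nudged in turn.

```python
import math
import random

random.seed(0)

# "Real" data the GAN should learn to imitate: numbers clustered around 4
def real_batch(n):
    return [random.gauss(4.0, 0.5) for _ in range(n)]

g_shift = 0.0        # the generator's single learnable parameter
d_w, d_b = 0.0, 0.0  # the discriminator's learnable parameters

def generate(n):
    # Generator: turn random noise into fake samples
    return [g_shift + random.gauss(0.0, 1.0) for _ in range(n)]

def discriminate(x):
    # Discriminator: estimated probability that x came from the real data
    z = max(-30.0, min(30.0, d_w * x + d_b))
    return 1.0 / (1.0 + math.exp(-z))

lr, n = 0.05, 64
for _ in range(2000):
    real, fake = real_batch(n), generate(n)
    p_real = [discriminate(x) for x in real]
    p_fake = [discriminate(x) for x in fake]
    # Step 1: nudge the discriminator towards "real -> 1, fake -> 0"
    d_w += lr * (sum((1 - p) * x for p, x in zip(p_real, real)) / n
                 - sum(p * x for p, x in zip(p_fake, fake)) / n)
    d_b += lr * (sum(1 - p for p in p_real) / n - sum(p_fake) / n)
    # Step 2: nudge the generator towards samples the discriminator accepts
    fresh = generate(n)
    grad = sum((1 - discriminate(x)) * d_w for x in fresh) / n
    g_shift += lr * grad

# After training, g_shift should have drifted towards the real mean of 4
```

At equilibrium the generator’s output is statistically hard to tell from the real data, which is exactly why full-scale GAN output in images and video is so hard to spot.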

BRAIN TRUST
The man who invented the idea of a GAN, Ian Goodfellow, was until recently a research scientist at Google Brain, a research team at Google that experiments with combinations of machine learning and systems engineering. If you’re thinking this all sounds very theoretical, Google Brain projects have looked at message encryption, and some of the work has been incorporated into Google Translate.

Goodfellow has recently moved on to Apple, where he’s now director of machine learning in the Special Projects Group. This group is best known for a machine-learning framework used in Siri. These ideas aren’t science fiction; they’re being used right now.
You can play around with apps that use some of the machine-learning techniques, although they’re not as sophisticated as the large-scale GAN applications. For example, Facehub is a mobile app that can swap your face for another one of your choosing, so as you speak, laugh and make faces, your speech, expressions and laughter appear to come from whoever you’ve chosen.

There’s also Lyrebird, which can be used to synthesise anyone’s voice given a sample of the original person speaking. So you could be listening to, for example, Marilyn Monroe saying how it was her who ate the hamster, and it would sound completely realistic.

PHOTO FIT
What has recently made me even more suspicious of online videos is the news that researchers at Samsung have come up with a realistic ‘video’ of someone talking based just on a single photo. It’s bad enough trying to spot the joins where videos have been merged, never mind them just being computer-generated. This research is another example of a GAN in use. The program takes some photos of someone, then uses them to create a video of that person talking. So it could take a photo of you, and produce a video of you admitting you were with Freddie Starr and that you ate the hamster.

The developers say that the system could be used to show “highly realistic and personalised talking head models of new people and even portrait paintings”. Good grief, at this rate we’ll be seeing a Mona Lisa video where she admits it was her who ate the hamster.

So my advice is this: just because you can hear, read or see a video of something on the web, it doesn’t mean it’s real. Freddie Starr never did eat that hamster, and Elvis is yet to be seen on the moon.

KAY EWBANK
Software guru and Shopper legend
[email protected]

Worried about the rise of fake news? You ain’t seen nothing yet, says Kay Ewbank, as she delves into the murky world of deep fakes (at least, we think it’s her)


