HWM Singapore – June 2019


…need to store data in the cloud. This is different from simply having a local model on your phone, and it goes a step further because the training is happening on your device as well. Google's Gboard keyboard can learn a new word like "Targaryen" or "BTS" without it knowing what you're typing at all. Because so many people have typed the word, the shared prediction model has been updated to reflect that. Your device basically downloads the current model, lets it learn from data on your phone, and then summarizes the changes as a small, focused update. It is this update, and not your data, that is then encrypted and sent to the cloud, where it is averaged with other user updates in order to improve the shared model. This means that there are no identifying tags or individual updates in the cloud, and all the training data stays on your phone.
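
The round trip described above can be sketched in a few lines. What follows is a hypothetical, simplified illustration in Python with NumPy, not Google's actual Gboard or TensorFlow code: each simulated client computes a small weight delta from its own data, and only the average of those deltas updates the shared model.

import numpy as np

def local_update(global_weights, local_data, lr=0.1):
    # Train briefly "on the device": one gradient step of linear regression
    # on the user's own data, returning only the small weight delta.
    X, y = local_data
    w = global_weights.copy()
    grad = X.T @ (X @ w - y) / len(y)   # gradient of the squared-error loss
    w -= lr * grad
    return w - global_weights           # the focused update, not the raw data

def federated_average(global_weights, client_datasets):
    # The "cloud" side: average the client deltas into the shared model.
    deltas = [local_update(global_weights, d) for d in client_datasets]
    return global_weights + np.mean(deltas, axis=0)

# Simulated round with three clients whose raw data never leaves "their device".
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(200):
    w = federated_average(w, clients)
print("shared model weights:", w)   # approaches true_w without pooling raw data

In this toy setup all clients contribute equally; real deployments additionally weight each update by how much data the device trained on, and encrypt the update in transit, as the article notes.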


pluggedin, and on a free wireless
connection.But with the right
implementation, federated learning
can be immensely useful, enabling
smartermodels, lower latency, and
lowerpower consumption, while at
the sametime ensuring privacy.

Thatsaid,therehavebeenbig
hurdlesto overcome.For example,
trainingcan’thappenall the time,
and on devicetrainingrequiresa mini
versionof TensorFlow,in additionto
carefulschedulingto ensuretraining
onlyhappenswhenthe deviceis idle,
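
As a rough idea of how such a scheduling gate might look, here is a small hypothetical Python sketch (the field and function names are invented for illustration and are not an Android or TensorFlow API): a training round is only allowed when every condition the article mentions holds.

from dataclasses import dataclass

@dataclass
class DeviceState:
    is_idle: bool            # screen off, no user interaction
    is_charging: bool        # plugged in, so training won't drain the battery
    on_unmetered_wifi: bool  # free wireless connection, no mobile-data cost

def may_train(state: DeviceState) -> bool:
    # Train only when the user won't notice and it won't cost them battery or data.
    return state.is_idle and state.is_charging and state.on_unmetered_wifi

print(may_train(DeviceState(is_idle=True, is_charging=True, on_unmetered_wifi=True)))   # True
print(may_train(DeviceState(is_idle=True, is_charging=False, on_unmetered_wifi=True)))  # False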

Training only happens when the device is idle, so performance isn't affected.

FEDERATED LEARNING

A distributed machine learning approach using decentralized data residing on end devices.

