Macworld - USA (2019-10)

NEWS

Apple now appears to have finished its review
and has issued a statement apologizing for the
way its Siri grading programme has been carried
out so far. The company plans to reinstate it later
this autumn after making some important changes.
The apology begins with a familiar statement:
“At Apple, we believe privacy is a fundamental
human right.” It then describes how it designed
Siri to protect your privacy – collecting as little
data as possible, using random identifiers instead
of personally identifiable information, and never
using data to build marketing profiles or selling
it to others.
The statement goes on to make sure you
understand that using your data helps make Siri
better, that ‘training’ on real data is necessary,
and that only 0.2 percent of Siri requests were
graded by humans. After all of this, Apple gets
round to the actual apology that should have
been in the first paragraph.
“As a result of our review, we realize we
haven’t been fully living up to our high ideals, and
for that we apologize. Apple will resume the Siri
grading programme this autumn, but only after
making the following changes:
“First, by default, we will no longer retain
audio recordings of Siri interactions. We will
continue to use computer‑generated transcripts
to help Siri improve.
“Second, users will be able to opt in to help Siri
improve by learning from the audio samples of their
requests. We hope that many people will choose to
help Siri get better, knowing that Apple respects
their data and has strong privacy controls in place.
