AN UNCHARACTERISTIC PRIVACY
LAPSE BY APPLE
Jimmy Kimmel might have jokingly told Alexa
on-screen to stop recording what he utters, but
Amazon was apparently slow to take the hint.
It was not until August 2 that the company
introduced an option, buried behind various
menus in the Alexa app and on the Alexa
website, that lets users prevent their Alexa
recordings from being subjected to human review.
The How-To Geek website explains the
process for activating this setting.
By this time, Amazon wasn’t the only tech
company that had felt forced to restrict
human workers’ access to users’
recordings. In July, a Guardian investigation
revealed that Apple contractors “grading”
the responses of the Siri voice assistant were
regularly hearing “confidential details” in
recordings. An anonymous whistleblower
working for Apple told the news site that such
details were often captured due to accidental
activations of the assistant.
In theory, Siri users shouldn’t be concerned as
long as they are careful about what they let
slip after uttering the phrase “Hey Siri”, which
wakes the assistant. In practice, however, Siri
can easily mistake innocuous sounds for this
“wake word”. For example, last year the UK’s
then defense secretary, Gavin Williamson,
was apparently interrupted by Siri during
a debate about Syria, and the whistleblower
revealed that Siri could often be triggered by
the sound of a zip.