As we wrote back in April, to train and improve virtual assistants like Alexa, the companies behind those services have employees or contractors manually review clips of conversations as a form of quality control.
A new report from The Guardian’s Alex Hern on Friday sheds more light on how Siri’s quality-control process actually works. Hern spoke with an anonymous contractor who performs quality control on Siri and who said they were concerned about how often Siri picks up “extremely sensitive personal information.”
According to The Guardian’s source, contractors working on Siri “regularly” hear recordings of people having sex, discussing business deals, having private medical conversations with their doctors, and even conducting drug deals.
Whether or not Siri’s activation was intentional – and often it’s not, as the anonymous contractor said Siri can mistake the sound of a “zip” for its trigger phrase – these Apple contractors are responsible for grading Siri’s responses. They note whether the activation was accidental, among other factors, such as whether Siri’s answer was appropriate or helpful, and whether the request was something Siri should be expected to handle.
We reached out to Apple about the issue, but we’ve yet to hear back. The company told The Guardian that “less than 1%” of daily Siri activations are reviewed, that no Siri requests are associated with Apple IDs, and that reviewers work “under the obligation to adhere to Apple’s strict confidentiality requirements.” Yet the whistleblower told The Guardian that Siri recordings “are accompanied by user data showing location, contact details, and app data,” which Apple might use to verify whether a request was acted on.