Apple Apologizes Over Siri Privacy Concerns, Will Resume Grading Program in Fall With Several Changes
Apple today announced that it will resume its Siri quality evaluation process in the fall with several privacy-focused changes.
Going forward, Apple will by default no longer retain audio recordings of Siri interactions, relying instead on computer-generated transcripts. Audio samples will be gathered only from users who opt in to the grading program, and participants will be able to opt out at any time. For customers who do opt in, only Apple employees will be allowed to listen to the audio samples.
Apple says it will work to delete any recording determined to have resulted from Siri being triggered inadvertently.
These changes come after The Guardian reported that Apple contractors "regularly" heard confidential information while grading anonymized Siri audio samples. Following the report, Apple suspended the grading program and began conducting a review of its process, and it has now apologized over the matter.
As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize. As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users — but only after making the following changes:
• First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
• Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
• Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.
Apple is committed to putting the customer at the center of everything we do, which includes protecting their privacy. We created Siri to help them get things done, faster and easier, without compromising their right to privacy. We are grateful to our users for their passion for Siri, and for pushing us to constantly improve.
Prior to suspending the grading program, Apple says it reviewed less than 0.2 percent of Siri interactions and their computer-generated transcripts to measure how well Siri was responding and to improve its reliability, evaluating whether the user intended to invoke Siri and whether Siri responded accurately.
In its press release, Apple emphasizes its commitment to protecting user privacy and outlines how Siri is designed to uphold it. The company does not use Siri data to build a marketing profile of any user, for example, and it tracks the data with a random identifier while it is being processed.
Apple has shared a new support document with more details on Siri privacy and grading.