Following Apple's decision last week to suspend a Siri program that allowed employees to listen to audio recordings for quality control purposes, Amazon and Google have both moved to clarify their policies on human review of voice assistant audio.
Late last month, Apple confirmed that a small number of anonymized Siri requests are analyzed for the purpose of improving Siri, after a Guardian report revealed that contractors regularly hear private conversations recorded by Apple's voice assistant.
To allay privacy concerns, Apple said it was temporarily halting the program while it reviews the current process. It also said it plans to release a software update that will let Siri users opt out.
On Friday, Google said it had also suspended human reviews of Google Assistant audio. The company actually suspended the practice across the EU on July 10, when a German privacy regulator began investigating it following a Belgian media report, but this is the first time Google has confirmed the suspension publicly.
According to Bloomberg, Amazon will let Alexa users opt out of human review of their voice recordings. The new policy took effect Friday, and adds an option in the settings menu of the Alexa mobile app for removing recordings from analysis by Amazon employees.
All of the tech companies employ staff to review a small subset of voice recordings, while claiming to anonymize the source. For example, Google distorts recordings before they are listened to, so as to disguise the user's voice, while Apple strips them of identifiable information and assigns each one a random device identifier.
However, Bloomberg revealed that some of Amazon's audio reviewers had access to the home addresses of Amazon customers before the company moved to restrict that access. Many members of the public were unaware the practice even existed until Bloomberg reported on it earlier this year.