
Apple Suspends Program That Lets Employees Listen to Siri Recordings for Quality Control, Opt Out Option Coming

Apple is suspending a Siri program that allows employees to listen to Siri recordings for quality control purposes, reports TechCrunch.

Apple will review the current process, in which workers listen to anonymized Siri recordings to determine whether Siri is hearing questions correctly or is being activated accidentally.


Apple also plans to release a future software update that will let Siri users opt out of having their Siri queries included in this evaluation process, known as grading.

"We are committed to delivering a great Siri experience while protecting user privacy," Apple said in a statement to TechCrunch. "While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading."

The decision to suspend the program and offer an opt-out option follows a report from The Guardian that shared details gleaned from one of the contractors working to evaluate Siri queries.

The contractor expressed concern over Apple's lack of disclosure about the human oversight and said that workers on the program have overheard confidential medical information, drug deals, recordings of couples having sex, and other private details from accidental Siri activations.

When The Guardian report came out, Apple confirmed that a small number of anonymized Siri requests are analyzed for the purpose of improving Siri and dictation. While Apple anonymizes Siri data that's evaluated for quality control and Siri improvement, its current privacy policy and security documents do not explicitly mention human oversight.

Top Rated Comments


11 weeks ago
So long as it is actually anonymized, I don't really care
Rating: 40 Votes
11 weeks ago
Stuff like this should be opt-in, not opt-out.
Rating: 28 Votes
11 weeks ago
Thank god for whistleblowers and journalists. Imagine all the **** corporations would be able to get away with without the public eye shining on their activities.
Rating: 26 Votes
11 weeks ago

So long as it is Apple, I don't really care

Changed that to reflect the sentiment of the ADL. Any other company reported on here doing this would have righteous, vitriolic hate thrown its way by the paragraph load.
Rating: 25 Votes
11 weeks ago
“When The Guardian report came out, Apple confirmed that a small number of anonymized Siri requests are analyzed for the purpose of improving Siri and dictation. While Apple anonymizes Siri data that's evaluated for quality control and Siri improvement, its current privacy policy and security documents do not explicitly mention human oversight.”

When I read their policies (after their new policy pages came out), I took it to mean that they reviewed recordings. I don’t understand the shock behind this. How else are they meant to be improved if other people aren’t listening to them? I’m assuming the contact info is only used when you say “Hey Siri, call my dad”. If it’s collecting that information constantly no matter what, then that’s a bit of a concern.

It’s good they’re letting people opt out of it. That should’ve been there from day one.
Rating: 24 Votes
11 weeks ago

So long as it is actually anonymized, I don't really care

You might say something that identifies you. I don't know how they can anonymize this.
Rating: 18 Votes
11 weeks ago

You're forgetting that Siri has to be listening to EVERYTHING in order to respond to Siri requests.


You must be confused about how that works. A low power local circuit listens for the activation word, and only then activates the rest of the system. This has been talked about again and again.


Exactly, the processing of the trigger phrase happens on-device. The always-on processor in the Mx motion coprocessor continuously analyzes the microphone output using a deep neural network acoustic model. If it thinks it hears “Hey Siri”, it wakes up the Ax processor and re-checks for the wake phrase using a more powerful and accurate algorithm.

If the wake phrase is confirmed by both checks, the first part of the audio data is further analyzed as the data is sent to the cloud for further processing. If a false activation is detected (e.g. “hey seriously”), the server sends a cancellation command and the device goes back to sleep.

There’s a lot more detail available at:

https://machinelearning.apple.com/2017/10/01/hey-siri.html
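
For anyone curious how that two-stage gate fits together, here’s a minimal Swift sketch of the flow described above. Everything in it (WakePhraseDetector, the scoring functions, the thresholds) is a hypothetical stand-in for illustration, not Apple’s actual models or APIs:

```swift
import Foundation

// Hypothetical sketch of the two-stage "Hey Siri" gating described above.
// Names, thresholds, and scoring are illustrative stand-ins, not Apple code.
struct WakePhraseDetector {
    // Illustrative thresholds: the low-power pass is permissive,
    // the second pass is strict.
    let firstPassThreshold: Float = 0.4
    let secondPassThreshold: Float = 0.85

    // Stand-in for the compact acoustic model on the always-on processor;
    // a real detector would score acoustic features with a small DNN.
    func lowPowerScore(_ samples: [Float]) -> Float {
        guard !samples.isEmpty else { return 0 }
        return min(1, samples.reduce(0, +) / Float(samples.count))
    }

    // Stand-in for the larger, more accurate model that re-checks the same
    // buffered audio once the main processor has been woken up.
    func mainProcessorScore(_ samples: [Float]) -> Float {
        lowPowerScore(samples) // placeholder scoring for the sketch
    }

    // The cheap check gates the expensive one; only agreement from both
    // counts as a wake. (Server-side cancellation of false positives like
    // "hey seriously" would happen after this, per the comment above.)
    func shouldWake(on samples: [Float]) -> Bool {
        guard lowPowerScore(samples) > firstPassThreshold else { return false }
        return mainProcessorScore(samples) > secondPassThreshold
    }
}

let detector = WakePhraseDetector()
print(detector.shouldWake(on: [0.9, 0.9, 0.95])) // true: both passes trip
print(detector.shouldWake(on: [0.1, 0.2, 0.1]))  // false: first pass rejects
```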
Rating: 14 Votes
11 weeks ago

You're forgetting that Siri has to be listening to EVERYTHING in order to respond to Siri requests.


You must be confused about how that works. A low power local circuit listens for the activation word, and only then activates the rest of the system. This has been talked about again and again.
Rating: 11 Votes
11 weeks ago
Well, when it comes to the couples having sex part... maybe there is just a surge in the popularity of the name Siri, and it gets yelled out loud a lot...
Rating: 10 Votes
11 weeks ago
Just because you took their policy one way doesn't mean the average joe did too. Apple is a generally transparent and upfront company, ESPECIALLY when it comes to privacy. They should have been clearer that they listen to recordings, even if they did "nothing wrong."

This is not unlike throttle-gate: do something sort of shady, get caught, add an opt-out option, then get sued.
Rating: 9 Votes
